Awards & Research

ACM SIGGRAPH '24 APPY HOUR • MOBISYS '25 PROCEEDINGS

Balboa Park Alive! Exploring Biodiversity through Augmented Reality

Published: This paper showcases Balboa Park Alive!, a series of immersive, interactive phone-based AR installations that explore the biodiversity of the San Diego/Tijuana region. The installations integrate Niantic Lightship ARDK, Mapbox, and mobile hand tracking to create embodied encounters with 3D digital representations of local plants, insects, and animals. Unlike other conservation-focused AR apps, Balboa Park Alive! emphasizes first-hand perspective-taking and evidence-based inquiry, guiding families to engage actively with their environment and each other.
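
For a sense of the geospatial math behind placing species models at real-world locations, here is a minimal Python sketch of geo-anchoring: converting an anchor's latitude/longitude into a local east/north offset from the visitor's device. The coordinates and function names are illustrative assumptions, not calls from the Lightship ARDK or Mapbox APIs.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def geo_to_local_offset(device_lat, device_lon, anchor_lat, anchor_lon):
    """Convert a species anchor's lat/lon into an east/north offset (meters)
    relative to the device, using an equirectangular approximation that is
    accurate over the short distances typical of a park installation."""
    d_lat = math.radians(anchor_lat - device_lat)
    d_lon = math.radians(anchor_lon - device_lon)
    mean_lat = math.radians((device_lat + anchor_lat) / 2)
    east = EARTH_RADIUS_M * d_lon * math.cos(mean_lat)
    north = EARTH_RADIUS_M * d_lat
    return east, north

# Hypothetical anchor: spawn a hummingbird model a few dozen meters
# northeast of the visitor's current GPS fix.
east, north = geo_to_local_offset(32.7341, -117.1446, 32.7344, -117.1442)
print(f"spawn model at local offset: {east:.1f} m east, {north:.1f} m north")
```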

Paper • Project

IEEE VIS '25 PROCEEDINGS

Fragmented Oceans: A Telepresence-Driven Visualization of Ocean Garbage Patches

Fragmented Oceans presents a telepresence-driven visualization combining robotic interaction and immersive environments to raise awareness of ocean garbage patches. Installed in two connected spaces at ASU's MIX Center, it pairs a UR20 robotic arm that arranges acrylic cubes representing ocean waste data with interactive visuals developed in TouchDesigner. Participants engage both physically and virtually, exploring dynamic wave simulations through telepresence.
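
As a rough illustration of the data-to-physical mapping such an installation requires, the Python sketch below converts normalized debris-density samples into grid positions and stack heights for cube placement. The values and function are hypothetical, not the installation's actual UR20 control pipeline.

```python
def density_to_grid_targets(densities, grid_cols=4, cell_mm=120.0, max_height_mm=300.0):
    """Map debris-density samples onto a grid of cube placement targets,
    normalizing each sample to a stack height between 0 and max_height_mm."""
    lo, hi = min(densities), max(densities)
    span = (hi - lo) or 1.0  # avoid division by zero for uniform data
    targets = []
    for i, d in enumerate(densities):
        row, col = divmod(i, grid_cols)
        x_mm = col * cell_mm
        y_mm = row * cell_mm
        z_mm = (d - lo) / span * max_height_mm
        targets.append((x_mm, y_mm, z_mm))
    return targets

# Hypothetical samples (debris concentration, already normalized to 0..1).
samples = [0.12, 0.55, 0.93, 0.30, 0.71, 0.08, 0.44, 0.66]
for x, y, z in density_to_grid_targets(samples):
    print(f"move cube to x={x:.0f}mm y={y:.0f}mm, stack height {z:.0f}mm")
```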

NSF GRANT • ONGOING RESEARCH

ALERT: AI-Enhanced Learning in Environmental Robotics Training

A cloud-based AR platform funded by a $400K NSF grant to train architecture, engineering, and construction (AEC) students in environmental robotics. The research applies AI to personalize learning experiences and support sustainable design practices, developing new frameworks for immersive education in environmental engineering.


AMERICAN SCIENTIFIC PUBLISHING GROUP '21

A Personalized Recommender System

Published: Designed and implemented collaborative-filtering algorithms on the TMDB dataset to deliver context-aware, personalized media recommendations. By integrating both user-rating profiles and situational factors (e.g., time of day, device type), the system consistently outperformed baseline methods in precision and user satisfaction during offline evaluations. The work demonstrates the value of combining personal preferences with contextual cues and has garnered over 60,000 reads.
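
One common way to realize this kind of context awareness is contextual pre-filtering: restrict the rating matrix to the query context, then apply standard collaborative filtering. The Python sketch below illustrates that general idea with user-based cosine similarity; it is a toy under assumed data shapes, not the published implementation.

```python
import numpy as np

def predict_rating(ratings, contexts, user, item, context):
    """Context-aware CF via contextual pre-filtering: keep only ratings made
    in the query context, then predict from similar users' ratings.
    `ratings` is a (users x items) float array with np.nan for missing
    entries; `contexts` holds the context tag of each rating."""
    # 1. Pre-filter: mask out ratings made in other contexts.
    mask = (contexts == context) & ~np.isnan(ratings)
    R = np.where(mask, ratings, np.nan)

    # 2. Cosine similarity between the target user and each neighbor,
    #    computed over co-rated items only.
    sims, neighbor_ratings = [], []
    for other in range(R.shape[0]):
        if other == user:
            continue
        both = ~np.isnan(R[user]) & ~np.isnan(R[other])
        if both.sum() < 2 or np.isnan(R[other, item]):
            continue
        u, v = R[user, both], R[other, both]
        denom = np.linalg.norm(u) * np.linalg.norm(v)
        if denom > 0:
            sims.append(u @ v / denom)
            neighbor_ratings.append(R[other, item])

    # 3. Similarity-weighted average of neighbors' ratings for the item.
    if not sims:
        return float(np.nanmean(R[:, item]))  # fall back to the item's context mean
    sims = np.array(sims)
    return float(sims @ np.array(neighbor_ratings) / sims.sum())

ratings = np.array([[5.0, np.nan, 4.0],
                    [4.0, 2.0, np.nan],
                    [3.0, 1.0, 5.0]])
contexts = np.array([["evening", "evening", "evening"],
                     ["evening", "daytime", "evening"],
                     ["evening", "evening", "evening"]])
# User 1's rating of item 1 was made in a different context, so the
# pre-filter drops it; the prediction leans on user 2's evening profile.
print(predict_rating(ratings, contexts, user=0, item=1, context="evening"))
```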

WORK IN PROGRESS • MIXED REALITY RESEARCH

Spatially-Grounded Human-AI Collaboration Through Embodied Virtual Agents in Mixed Reality Environments

Recent advances in mixed reality, artificial intelligence, and embodied agent systems have enabled virtual characters that inhabit users' physical workspaces. These agents combine multimodal perception, spatial mapping, and natural dialogue in a unified framework, allowing them to understand physical context and user behavior. Through gaze tracking, speech recognition, and embodied non-verbal cues, they can participate in natural, real-time conversations that mirror human-to-human collaboration.
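
A toy Python sketch of the kind of multimodal fusion described above: the agent combines gaze and speech events to decide whether an utterance is addressed to it. The event types, the gaze window, and the addressing rule are all illustrative assumptions, not the project's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class GazeEvent:
    target: str        # what the user is looking at, e.g. "agent", "whiteboard"
    timestamp: float

@dataclass
class SpeechEvent:
    text: str
    timestamp: float

class EmbodiedAgent:
    """Toy fusion loop: treat an utterance as addressed to the agent only
    if the user's gaze landed on it shortly before or during speech."""
    GAZE_WINDOW_S = 1.5

    def __init__(self):
        self.last_gaze = None

    def on_gaze(self, event: GazeEvent):
        self.last_gaze = event

    def on_speech(self, event: SpeechEvent):
        addressed = (
            self.last_gaze is not None
            and self.last_gaze.target == "agent"
            and event.timestamp - self.last_gaze.timestamp <= self.GAZE_WINDOW_S
        )
        if addressed:
            return f"(turns to user) Responding to: '{event.text}'"
        return None  # utterance was not directed at the agent; stay quiet

agent = EmbodiedAgent()
agent.on_gaze(GazeEvent(target="agent", timestamp=10.0))
print(agent.on_speech(SpeechEvent(text="Can you map this room?", timestamp=10.8)))
```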

Research Interests

Immersive Technologies

Exploring AR/VR applications for education, environmental awareness, and social impact through hands-on prototyping and user research.

Human-Computer Interaction

Designing intuitive interfaces and interactions that bridge the gap between complex technologies and meaningful user experiences.

Agentic AI Systems

Exploring how autonomous AI agents can perceive, reason, and take initiative, adapting to user behaviors and environmental contexts to enable proactive, intelligent interactions across diverse applications.

Quantum UX

Exploring ways to design interfaces that aren't fixed but adapt as you interact. By borrowing ideas from quantum computing, such as holding multiple possibilities at once, these interfaces can adjust on the fly to what you do, making the experience feel more flexible and responsive.
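
As a loose illustration of the idea (an assumed toy model, not an established technique), the Python sketch below keeps several candidate layouts "in superposition" as weighted possibilities, reinforces the weights as the user interacts, and samples one layout at render time.

```python
import random

class SuperposedMenu:
    """Toy 'superposition' of candidate layouts: every layout stays live
    with a weight, each interaction nudges the weights, and rendering
    'collapses' the set by sampling in proportion to weight."""
    def __init__(self, layouts):
        self.weights = {name: 1.0 for name in layouts}

    def observe(self, chosen, boost=1.5, decay=0.9):
        # Reinforce the layout consistent with what the user just did,
        # gently decaying the alternatives instead of discarding them.
        for name in self.weights:
            self.weights[name] *= boost if name == chosen else decay

    def collapse(self):
        names, weights = zip(*self.weights.items())
        return random.choices(names, weights=weights, k=1)[0]

menu = SuperposedMenu(["compact", "detailed", "visual"])
for _ in range(5):
    menu.observe("visual")     # user keeps engaging with visual elements
print(menu.collapse())         # now most likely renders the 'visual' layout
```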

This portfolio is featured on UX Remote Talent

Want to get in touch?
I'd love to connect with you

Mail
LinkedIn
Medium
GitHub