Research
My research journey has deep roots in Cognitive Science, with a focus on social emotional technology, the nuances of non-verbal human communication and expression, and the emerging fields of augmented and virtual reality. In the sections that follow, I outline my research across graduate studies, industry applications, and undergraduate exploration. Each section is accompanied by an overview of the research methods employed, showcasing the breadth of my investigative approach.
Graduate Research
University of California, Santa Cruz Social Emotional Technology Lab
October 2020 - June 2022
Overview
With the Social Emotional Technology Lab at UCSC, I studied how multi-user VR environments can affect collaborative success in meeting scenarios where users are represented by stylized avatars.
Thesis topic
My Master's thesis, titled 'Avatar Styles and Choices,' explored avatar customization in virtual environments and its impact on collaborative creativity and performance. The research examined the motivations and preferences behind users' avatar design choices, using the virtual environment as a lens for understanding these dynamics. Key areas of inquiry included:
- The influence of different avatar styles on user interaction within Social VR contexts.
- The significance of real-world identity in the selection of avatar designs, examining preferences for realistic versus fantastical avatars.
- The role of avatar design in facilitating interpersonal connections and collaboration in VR settings.
- The effect of varying avatar styles on group creativity and the dynamics of collaboration.
Through this investigation, the thesis aimed to shed light on how avatar design and environmental context shape user experiences, collaboration, and creative outcomes in virtual reality.
→ Check out my thesis, all 68 pages of scholarly delight, under the "Education" tab
Notable Methods
- Experiments
- Research-through-design
- Landscape analysis
- Autobiographical design
- Ethnography, autoethnography
- Games user research
- Literature reviews
- Wizard of Oz
- Think-aloud protocol
- Affinity diagramming
- Observation
Industry Research
Meta - AR/VR Face Tracking Team
June 2018 - Present
Overview
On Meta's AR/VR Face Tracking Team, I began as a Research Assistant, advanced to Senior Research Assistant, and in January 2020 was promoted to Researcher. My research centers on human expression, with particular emphasis on facial expressions and linguistic cues. Within the team, I serve as a key point of contact for questions about facial expression, visemes, research design for face tracking data collection, and data annotation.
Areas
- AR and VR Face Tracking: As a central point of contact for expression research, I guide engineers, artists, and other cross-functional teams on accurately capturing and modeling authentic expressions. I also stay current on emerging research in this domain so that the guidance I offer rests on solid scientific grounding. This is an ongoing responsibility that supports numerous face tracking projects.
- Validating importance of face tracking for avatar tech: I have designed several research strategies intended to gather data demonstrating the vital role of face tracking in avatar technology, particularly within multi-user VR environments.
- Data collection design for training ML models: To improve our machine learning models, I have developed data collection materials aimed at addressing gaps in our product's dataset.
Examples
- Emotion recognition accuracy improvement: Spearheaded a detailed study focused on elevating the accuracy of emotion recognition within our virtual avatars. This initiative entailed a rigorous quantitative analysis of existing algorithm performance, complemented by an in-depth survey study to understand user perceptions and experiences. The research led to notable advancements in our system's proficiency in identifying and differentiating complex emotions. As a result, we significantly enhanced user interactions and engagement, offering a more intuitive and emotionally resonant experience with our avatars.
- Competitive analysis with Apple's Memoji: Conducted a competitive analysis of Apple's Memoji to benchmark our avatar technology against a market leader. This analysis highlighted our strengths and pinpointed areas for improvement, particularly in artistic representation and face tracking. The insights obtained guided enhancements to our avatars' visual appeal and functionality.
- Face tracking's impact on problem-solving: Designed and executed a user study to evaluate the role of face tracking in problem-solving environments, with the goal of assessing its effect on user collaboration and decision-making. The findings helped refine our face tracking technology, leading to improved user interactions in complex settings.
Notable Methods
- Facial Action Coding System (FACS) Frameworks
- User studies
- Observational research
- Research through design
- Usability testing
- Competitive testing
- Eyetracking
- Interviews
- A/B testing
- Surveys
- Prototyping
Undergraduate Research
University of California, Santa Cruz
March 2016 - March 2018
Overview
During my undergraduate studies, I had the privilege of contributing as a Research Assistant (RA) across three distinct research labs, each focusing on a different area of psychology and cognitive science: developmental psychology, computational and experimental cognition, and high-level perception.
Developmental Psychology
- Research area: Exploring moral reasoning among college students.
- Role: Conducted interviews on contentious scenarios, digitally coded and analyzed data, and played a key role in the development of the final publication.
Computational & Experimental Cognition
- Research area: Delved into psycholinguistics, specifically examining articulatory onsets in natural language production.
- Role: Analyzed video and acoustic data to understand speech patterns and contributed to data processing and interpretation.
High-Level Perception
- Research area: Intensive research on human perception within virtual reality, focusing on face perception and gaze biases.
- Role: Led participants through experimental protocols, used MATLAB for data coding, engaged in bi-weekly team discussions, and contributed as an author on final publications.
Notable Methods
- Eyetracking
- Interviews
- Literature reviews
- Think-aloud protocol