Dissertation: "On Virtual, Augmented, and Mixed Reality for Socially Assistive Robotics".

This website contains information about my research, publications, and projects during my Ph.D. I am now a full-time software engineer and am no longer updating this website. The publications page is up to date as of 2/9/23 and includes my dissertation. If you would like to contact me, please email me using the "Contact" link above (I am more than happy to talk about my research/Ph.D.).

I successfully defended my dissertation on 2/9/23 and am an incoming remote Ph.D. intern at iRobot, focusing on ROS2 development from February to July 2023. Once my internship is complete, I am looking for a full-time software development position, either based in Chicago or remote with openness to travel. I am interested in software architecture and user experience.

My background is in software development (C++, C#, Python) focused on robotics (ROS/ROS2/ROS#), games (Unity), and augmented reality (MRTK, ARKit, ARCore). My core projects are open-source and include MoveToCode, a custom AR visual programming language with an autonomous robot tutor that responds to student curiosity, and RE-BT-Espresso, a pipeline for learning behavior trees from demonstration. (GitHub)

  • Languages: C#, C++, Python, JavaScript, R, Bash
  • Tools: Unity, Robot Operating System (ROS), RosSharp (ROS#), Mixed Reality Toolkit (MRTK), JupyterLab (pandas, seaborn, sklearn)

For research, I co-authored three funded grants with my Ph.D. advisor Maja Matarić (~$1.55 million total funding), published 20 papers (2 Journal, 9 Conference, 9 Workshop), and mentored 26 Master's, undergraduate, and high school students. These students first-authored 7 papers, won 7 undergraduate research awards, were nominated for 2 best paper conference awards, contributed code to our open-source projects, and were co-authors on the majority of my papers.

Research Work


Kinesthetic pair-programming with an autonomous AR robot tutor

Created a custom AR visual programming language with a robot tutor that responds to student curiosity. Designed for students ages 8-12 and deployed in Los Angeles classrooms. Paper accepted to RO-MAN 2023.

Reimagining RViz

Multidimensional Augmented Reality Robot Signal Design

Created AR signal designs examining the interaction effects between individual Virtual Design Elements. Conference paper accepted to RO-MAN 2022.

What and How Are We Reporting in HRI?

A Review and Recommendations for Reporting Recruitment, Compensation, and Gender

Undergraduate-led work surveying the state of study metadata reporting in HRI and RO-MAN conference papers. First presented as a workshop paper at the HRI Workshop on Fairness and Transparency in HRI: Algorithms, Methods, and Metrics, with the conference paper accepted to RO-MAN 2022.


Chromebook accessible, pose-based coding.

Undergraduate-led work on having students pose to create code blocks that control a virtual robot. Demo. Paper in ACHI 2022.

Augmented Reality Appendages for Robots

Combining Social and Functional Designs

Undergraduate-led work on design considerations for increasing the social and functional perception of AR robot appendages. Paper in VAM-HRI 2022.

Learning Behavior Trees from expert robot demonstrations

extending the BT-Espresso algorithm.

Work on exploiting different representations of the underlying BT structure. Paper in ICRA 2022.
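For context on the structure being learned: a behavior tree composes condition and action leaves under control-flow nodes such as Sequence (run children until one fails) and Selector (try children until one succeeds). Below is a minimal pure-Python sketch of those tick semantics with hypothetical node and task names; it illustrates the data structure only and is not the RE-BT-Espresso code.

```python
# Minimal behavior tree sketch: Selector tries children until one
# succeeds; Sequence runs children until one fails.
# Node and task names here are hypothetical illustrations.

SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Action:
    """Leaf node: runs a callable against a shared blackboard dict."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn
    def tick(self, blackboard):
        return SUCCESS if self.fn(blackboard) else FAILURE

class Sequence:
    """Succeeds only if every child succeeds, in order."""
    def __init__(self, *children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            if child.tick(blackboard) == FAILURE:
                return FAILURE
        return SUCCESS

class Selector:
    """Returns the first child's SUCCESS; fails if all children fail."""
    def __init__(self, *children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            if child.tick(blackboard) == SUCCESS:
                return SUCCESS
        return FAILURE

# Toy policy: approach an object if it is visible, otherwise search.
tree = Selector(
    Sequence(
        Action("object_visible", lambda bb: bb["visible"]),
        Action("approach", lambda bb: bb.setdefault("log", []).append("approach") or True),
    ),
    Action("search", lambda bb: bb.setdefault("log", []).append("search") or True),
)
```

Ticking `tree` with `{"visible": False}` falls through to the search branch, while `{"visible": True}` runs the approach sequence.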

Robot Tutors for Students with Autism Spectrum Disorder (ASD)

long-term deployments and modelling.

Long-term work on the NSF Expeditions in Computing project on Socially Assistive Robotics. Paper in THRI.

Kinesthetic Curiosity: Towards Personalized Embodied Learning with a Robot Tutor Teaching Programming in Mixed Reality

for measuring student movement and curiosity.

Explored measures of student kinesthetic curiosity in a pilot study. Paper in ISER '20.

Usability Metrics in Mixed Reality

for measuring real-time system understanding

Explored mixed reality data from a pilot study of MoveToCode for correlations with System Usability Scale scores. Paper in ICSR '20.

Telepresence UI and Accessibility

for understanding audio levels

Deployed telepresence robots with an audio-level UI. Extended abstract in Companion-HRI '20. Modelling paper in ICMI 2021.

Mixed Reality Robot Extensions and Gestures

to increase robot social expressivity.

Created mixed reality arms for more social expression of a robot. Paper in proceedings of RO-MAN 2019. A video of the gestures can be found here.

Personalized Telepresence Robots

to increase embodiment

Had users personalize telepresence robots to observe differences in interaction based on user-pair relationships. Paper in HRI '20.

Dynamic Planning and Navigation

for navigating to affordance templates

Worked at TRACLabs implementing a smoothed A-star planner with a dynamic-window low-level controller able to replan around dynamic obstacles. This included reviving the TRACBot reconfigurable modular mobile manipulator.
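For readers unfamiliar with the planning layer, the core of A* is a best-first search ordered by cost-so-far plus an admissible heuristic. A toy sketch on a 4-connected occupancy grid with a Manhattan heuristic, purely for illustration (this is not the TRACLabs implementation, which also handled smoothing and the dynamic-window controller):

```python
import heapq

def astar(grid, start, goal):
    """Toy A* on a 4-connected occupancy grid (1 = obstacle).
    Cells are (row, col) tuples; moves cost 1 each."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), start)]            # (f = g + h, cell)
    g_cost = {start: 0}
    came_from = {start: None}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                       # reconstruct path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[cur] + 1
                if ng < g_cost.get(nb, float("inf")):
                    g_cost[nb] = ng
                    came_from[nb] = cur
                    heapq.heappush(open_set, (ng + h(nb), nb))
        # Stale heap entries are harmless: the g_cost check filters them.
    return None  # no path exists
```

A real planner would then smooth the grid path and hand local tracking to the dynamic-window controller.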

Geometrically Informed Iterative Closest Point (ICP)

with point filtering

Explored different point-filtering methods for a geometrically informed ICP. This also included creating a web-based ICP visualization.
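For context, basic point-to-point ICP alternates two steps: match each source point to its nearest target point, then solve for the rigid transform that best aligns the matched pairs. A toy 2D version in pure Python using the closed-form 2D alignment (this is the plain algorithm, not the geometrically informed variant or its point filters):

```python
import math

def icp_2d(source, target, iters=20):
    """Toy 2D point-to-point ICP: nearest-neighbour matching followed
    by closed-form 2D rigid alignment, repeated for a few iterations.
    Points are (x, y) tuples; returns the transformed source points."""
    src = list(source)
    for _ in range(iters):
        # 1. Match each source point to its nearest target point.
        pairs = [(p, min(target,
                         key=lambda q: (q[0] - p[0])**2 + (q[1] - p[1])**2))
                 for p in src]
        # 2. Closed-form 2D rigid alignment of the matched pairs:
        #    centre both sets, then theta = atan2(sum cross, sum dot).
        n = len(pairs)
        pcx = sum(p[0] for p, _ in pairs) / n
        pcy = sum(p[1] for p, _ in pairs) / n
        qcx = sum(q[0] for _, q in pairs) / n
        qcy = sum(q[1] for _, q in pairs) / n
        A = sum((p[0]-pcx)*(q[0]-qcx) + (p[1]-pcy)*(q[1]-qcy) for p, q in pairs)
        B = sum((p[0]-pcx)*(q[1]-qcy) - (p[1]-pcy)*(q[0]-qcx) for p, q in pairs)
        th = math.atan2(B, A)
        c, s = math.cos(th), math.sin(th)
        # 3. Apply the estimated rotation and translation to the source.
        src = [(c*(x-pcx) - s*(y-pcy) + qcx, s*(x-pcx) + c*(y-pcy) + qcy)
               for x, y in src]
    return src
```

With a small initial misalignment the nearest-neighbour matches are correct and the alignment converges in one or two iterations; geometric information (normals, curvature) and point filtering are what make the real variant robust to worse starts.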

Simultaneous Localization and Mapping (SLAM)

focusing on post processing visualization

Reimplemented a SLAM post processing method proposed by Edwin Olson in Fast Iterative Alignment of Pose Graphs with Poor Initial Estimates. Tested with the Fetch mobile manipulator.
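The underlying idea of pose-graph post-processing is to treat poses as variables and measured relative transforms (odometry plus loop closures) as constraints, then solve the resulting least-squares problem. A deliberately simplified 1D illustration using Gauss-Seidel relaxation; this is only intuition for why loop closures correct odometry drift, not Olson's algorithm:

```python
def relax_pose_graph(n, edges, iters=200):
    """Toy 1D pose-graph relaxation.  Poses x[0..n-1] are scalars;
    each edge (i, j, z) measures the offset x[j] - x[i] = z.
    Gauss-Seidel sweeps minimise the sum of squared edge residuals,
    with pose 0 fixed as the anchor."""
    x = [0.0] * n
    for _ in range(iters):
        for k in range(1, n):
            # Each incident edge implies a value for x[k]; the
            # least-squares update is their average.
            num, den = 0.0, 0
            for i, j, z in edges:
                if j == k:
                    num += x[i] + z
                    den += 1
                elif i == k:
                    num += x[j] - z
                    den += 1
            if den:
                x[k] = num / den
    return x
```

With three odometry edges of 1.0 and a loop closure saying the total offset is 2.7, the 0.3 of accumulated error gets spread across the chain instead of landing on the last pose.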

Prior Mentoring and Outreach

Previously I have overseen İpek Göktan, Karen Ly, Massimiliano Nigro, Rachel Channell, Evelyn Miguel Vargas, Charles Gary, Dara Macareno, Chloe Kuo, Julia Cordero, Adam Wathieu, Jenny Lee, Nisha Chatwani, Karen Berba, Daniel Ramirez, Radhika Agrawal, Kartik Mahajan, Roddur Dasgupta, Roxanna Pakkar, Zhonghao Shi, Ryan Stevenson, and Adnan Karim. I have also mentored high school students Annika Modi, Jacob Zhi, İpek Göktan, Mena Hassan, Ashley Perez, and Bryan Pyo as part of the USC SHINE program.

Outside of lab, I used to help out at Clifford Street Elementary School with their new 5th grade VEX robotics team. I was also part of a TEALS volunteer team at LACES High School teaching AP Computer Science for the 2019-2020 school year. I created the US Women in Academic Robotics Research website, which is now overseen by Amy O'Connell of the USC Interaction Lab. If you would like to help out, feel free to directly submit requests to add people via the website.


  • University of Southern California - Los Angeles: Computer Science Ph.D., 2018-2023
  • University of Southern California - Los Angeles: Computer Science Master's, 2018-2021
  • University of Michigan - Ann Arbor: Computer Science B.S.E., 2014-2018

If you are interested in getting involved with the Interaction Lab, please read its current research before reaching out. I am no longer in the Interaction Lab but am happy to answer questions about how to get involved!