At the Extended Reality and Games Lab (XRG Lab), we conduct research on novel interaction techniques and on enhancing extended (virtual/augmented/mixed) reality systems for improved usability and user experience. Our work mainly consists of designing, developing and evaluating (through empirical user studies) these interaction techniques and enhanced extended reality systems.
In general, we engage in work that falls under one of the following larger-scope research questions:
- How can extended reality be enhanced for improving individuals’ lives in terms of better education, health and well-being, training, accessibility, empathy, awareness and entertainment?
- What are the effects of novel interaction techniques on usability and user experience in extended reality?
- How can the boundary between the real and virtual worlds be blurred such that the technology that connects these two worlds becomes seamless, and more intuitive and immersive user experiences are afforded?
- How can video games be leveraged for beneficial purposes, such as healthier lifestyles, increased knowledge and improved skills?
Our work is mainly driven by curiosity and a desire to push the boundaries in the development of innovative, interactive and playful technologies. We strive to contribute to the existing knowledge in this field through peer-reviewed publications and presentations.
Examples of ongoing research projects at the XRG Lab are listed below:
Give Me a Hand? (Funded by the National Science Foundation)
Enabling mutual tangible embodied interaction with virtual characters through shared objects that extend from the virtual world into the real world.
Tangiball (Funded by the University of Arizona SBSRI)
Enhancing virtual reality through interaction with a virtual ball’s dynamic physical real-world extension.
Investigating effects of visual cues and virtual object behavior in mixed reality on usability and user experience.
Investigating the effects of interaction technique in mixed reality on usability and user experience (i.e., gesture-based interaction, controller-based interaction and tangible object-based interaction, where the tangible object has geometry matching the projected hologram and is tracked in real time).
Enabling non-verbal interaction between a human and a virtual avatar through a novel virtual reality system, which includes a game of tic-tac-toe that is played on a shared physical glass board interface in real-time.
Game On (Funded by the University of Arizona SBSRI)
Using dynamic tangible objects as a video game display through real-time projections.
Bounce: A Mixed Reality Serious Game for Teaching Newtonian Physics Concepts
Mixed reality with increased context awareness where hologram behavior is adjusted based on the surface materials in the physical environment.
Displaying real-time cartoon rendering of user’s eyes on a head-mounted display to decrease the degree of isolation in VR and to increase social communication and interaction between HMD and non-HMD users.
Effects of horizontally reversed interaction on spatial judgement, usability and user experience in virtual reality.
The XRG Lab is located in Room 418 of the School of Information at the University of Arizona and houses cutting-edge equipment such as advanced stationary and mobile virtual reality systems, mixed reality systems, high-performance computer workstations and custom tangible interaction prototypes.
For more information or inquiries, send an e-mail to firstname.lastname@example.org