Unreal Realm

Investigating gestural interaction techniques for object manipulation in mixed reality.

Date

2022

Duration

4 weeks

Project Type

Vision

Keywords

Modified Oculus Quest 2, Gestural interactions, Vision design

Role

Research, Ideation, Technical Pipeline, Programming, Prototyping

Team

Aria Xiying Bao, Nix Liu Xin

Overview

Unreal Realm is a set of interoperable gestures that work across devices. It serves as a vision for how we might interact across the different spaces enabled by spatial computing.

Background

Screen-based devices have become the dominant medium through which we interact with digital content. Despite the variety of device types and screen sizes, our experience rests on a handful of touch gestures (swiping, pinching, tapping) that form the backbone of mobile interaction today. Yet even with the growing popularity of mixed-reality experiences, screen space, AR space, and physical space have remained largely isolated from one another.

Problem Space

💡

What are the foundational 3D interactions of a mixed-reality future?

How can the boundaries between these spaces be blurred?


We define the 3 spaces as follows:

Screen space: the 2D pixel coordinate system of a computer display
AR space: the 3D virtual coordinate system overlaid onto a real-world camera feed
Physical space: the real world with its tangible materials and matter
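
Bridging these spaces ultimately comes down to mapping between their coordinate systems. As a rough illustration (not part of the project's actual pipeline), a 2D point in screen space can be lifted into AR space by unprojecting it through assumed pinhole camera intrinsics into a world-space ray:

```python
import numpy as np

def screen_to_ar_ray(pixel_xy, intrinsics, camera_to_world):
    """Return (origin, direction) of a world-space ray through a screen-space pixel.

    pixel_xy:        (u, v) pixel coordinate in screen space
    intrinsics:      3x3 pinhole camera matrix K (assumed known)
    camera_to_world: 4x4 pose of the AR camera in the world/AR coordinate system
    """
    u, v = pixel_xy
    # Ray direction in camera coordinates; depth is unknown from a 2D point alone
    dir_cam = np.linalg.inv(intrinsics) @ np.array([u, v, 1.0])
    dir_cam = dir_cam / np.linalg.norm(dir_cam)
    # Rotate into world space and use the camera position as the ray origin
    dir_world = camera_to_world[:3, :3] @ dir_cam
    origin_world = camera_to_world[:3, 3]
    return origin_world, dir_world
```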

Narrowing down the scope: how might we make object customization intuitive in a mixed-reality future?

Use Case

Early Explorations

Picking colors from physical space to screen space (Left)

Launching balls from AR space into screen space (Right)

Technical Pipeline

Interactions

Extract real textures and apply them to AR objects

Applying materials is a common step in 3D rendering and the product design process, yet obtaining a material that mimics a real object is neither intuitive nor efficient. In our implementation, we propose that users can directly search for and extract materials from real-world objects and turn them into digital content for further AR-based and screen-based creation. The user makes an ‘OK’ sign gesture, which mimics a magnifying glass, to indicate a texture on a real object.
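
As a rough illustration of how such a gesture might be recognized, the sketch below checks for an ‘OK’ sign from 3D hand-joint positions as typically exposed by a hand-tracking runtime; the joint names and thresholds are assumptions, not the project's actual code.

```python
import numpy as np

def is_ok_sign(joints, touch_threshold=0.02):
    """joints: dict of joint name -> np.array([x, y, z]) in metres (names are assumptions)."""
    # The thumb tip touching the index tip forms the 'ring' of the OK sign
    ring_closed = np.linalg.norm(joints["thumb_tip"] - joints["index_tip"]) < touch_threshold
    # The remaining fingers are roughly extended: their tips sit farther from
    # the wrist than their middle (PIP) joints do
    others_extended = all(
        np.linalg.norm(joints[f + "_tip"] - joints["wrist"])
        > np.linalg.norm(joints[f + "_pip"] - joints["wrist"])
        for f in ("middle", "ring", "pinky")
    )
    return ring_closed and others_extended
```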


Once the material is confirmed, a texture ball is generated beside the user's hand. The user drags the texture ball onto an object to apply the texture.
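
A minimal sketch of this drag-to-apply step, using assumed data structures rather than our actual implementation: the texture ball follows the pinch while it is held, and releasing it near an object transfers the extracted texture.

```python
import numpy as np

def update_texture_ball(ball, hand, scene_objects, apply_radius=0.05):
    """ball, hand and scene objects are plain dicts here, purely for illustration."""
    if hand["is_pinching"]:
        ball["position"] = hand["pinch_position"]        # the ball follows the pinch point
        return
    if not scene_objects:
        return
    # On release, find the nearest object and apply the texture if it is within reach
    nearest = min(scene_objects,
                  key=lambda o: np.linalg.norm(o["position"] - ball["position"]))
    if np.linalg.norm(nearest["position"] - ball["position"]) < apply_radius:
        nearest["texture"] = ball["texture"]             # apply the extracted texture
        ball["alive"] = False                            # remove the texture ball
```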


Flip to show menu

The menu is displayed to the right of the left palm when the palm faces toward the user. To press a button in the hand menu, the user taps the virtual button with the index finger of the other (right) hand.
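
One plausible way to gate the menu's visibility (names and thresholds are assumptions) is to test whether the left palm's normal points toward the user's head:

```python
import numpy as np

def palm_faces_user(palm_position, palm_normal, head_position, threshold=0.7):
    """All inputs are world-space 3D vectors; palm_normal is a unit vector out of the palm."""
    to_head = head_position - palm_position
    to_head = to_head / np.linalg.norm(to_head)
    # Cosine of the angle between the palm normal and the palm-to-head direction
    return float(np.dot(palm_normal, to_head)) > threshold

# e.g. hand_menu.visible = palm_faces_user(left_palm_pos, left_palm_normal, head_pos)
```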


Grab to manipulate

By grabbing a virtual object naturally, the user can move it to another position and orient it as they wish.
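
A minimal sketch of grab-to-manipulate under assumed pose APIs: the object keeps the relative offset it had from the hand when the grab began, so it translates and rotates rigidly with the hand.

```python
import numpy as np

def update_grab(obj, hand, state):
    """obj and hand expose 4x4 world poses (assumed API); `state` persists between frames."""
    if hand.is_grabbing:
        if state.get("offset") is None:
            # Capture the object's pose relative to the hand at the moment of grabbing
            state["offset"] = np.linalg.inv(hand.pose) @ obj.pose
        # Re-apply the stored offset so the object follows the hand rigidly
        obj.pose = hand.pose @ state["offset"]
    else:
        state["offset"] = None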


Scan to create a digital twin

We envision a gesture-based 3D scanning process that produces a digital twin in a seamless, intuitive way. The user sweeps a hand in one direction to perform the scan.
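
As a rough sketch of how the sweep could drive the scan (assumed names, not our actual pipeline), the hand's travel along a chosen axis can be mapped to a 0-to-1 completion value:

```python
import numpy as np

def sweep_progress(hand_position, start_position, sweep_axis, sweep_length=0.4):
    """Return scan completion in [0, 1] from hand travel along sweep_axis (a unit vector)."""
    travel = np.dot(hand_position - start_position, sweep_axis)
    return float(np.clip(travel / sweep_length, 0.0, 1.0))
```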


Pull to explode

Moving two pinching hands apart diagonally is a common metaphor for opening something up. The user explodes the chair to take a closer look at the product's structure; moving the two pinching hands back together returns the exploded chair to its original state.
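
A minimal sketch of the underlying mapping, with assumed names: while both hands pinch, the change in distance between them is converted into an explosion factor, and each part is offset from the assembly's centre accordingly. Bringing the hands back together drives the factor back toward zero, restoring the original state.

```python
import numpy as np

def explode_factor(left_pinch_pos, right_pinch_pos, start_distance, scale=2.0):
    """Return a non-negative explosion factor; zero keeps the assembly intact."""
    current = np.linalg.norm(right_pinch_pos - left_pinch_pos)
    return max(0.0, (current - start_distance) * scale)

def apply_explosion(parts, centre, factor):
    """Offset each part along its direction from the assembly centre."""
    for part in parts:
        direction = part["rest_position"] - centre
        part["position"] = part["rest_position"] + direction * factor
```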


Swipe to flip through a catalog

The user swipes to flip through a virtual object catalog, inspired by the finger-swiping motion prevalent in 2D screen interactions.
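
One plausible detection scheme (thresholds are illustrative assumptions) registers a swipe when the palm's lateral velocity crosses a threshold:

```python
import numpy as np

def detect_swipe(prev_pos, curr_pos, dt, right_axis, speed_threshold=0.8):
    """Return +1 for a rightward swipe, -1 for a leftward swipe, 0 otherwise."""
    lateral_speed = np.dot((curr_pos - prev_pos) / dt, right_axis)   # m/s along the lateral axis
    if lateral_speed > speed_threshold:
        return 1
    if lateral_speed < -speed_threshold:
        return -1
    return 0

# e.g. catalog_index = (catalog_index + detect_swipe(prev, curr, dt, right_axis)) % len(catalog)
```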


Point to delete

The user removes the virtual object via a pointing gesture.
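
A minimal sketch of point-to-delete under assumed names: a ray is cast from the index finger's proximal joint through its tip, and the closest object the ray passes near becomes the deletion target.

```python
import numpy as np

def pointed_object(index_proximal, index_tip, objects, hit_radius=0.08):
    """objects: list of dicts with a 'position' array; returns the pointed-at object or None."""
    direction = index_tip - index_proximal
    direction = direction / np.linalg.norm(direction)
    best, best_t = None, np.inf
    for obj in objects:
        to_obj = obj["position"] - index_tip
        t = np.dot(to_obj, direction)                   # distance along the pointing ray
        if t <= 0:
            continue                                    # object is behind the fingertip
        miss = np.linalg.norm(to_obj - t * direction)   # perpendicular distance to the ray
        if miss < hit_radius and t < best_t:
            best, best_t = obj, t
    return best

# target = pointed_object(proximal, tip, scene_objects)
# if target is not None:
#     scene_objects.remove(target)
```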


Next Steps

For near-future work, we plan to conduct a within-subjects user study to investigate the interaction techniques elaborated above for cross-space content manipulation tasks. We are particularly interested in learning more about the practicality and consistency of the proposed techniques for these tasks. To that end, we will design cross-space tasks with varied target sizes and destination areas to investigate whether the combined interaction techniques enable seamless transitions between spaces.


We mainly aim to gather both qualitative and quantitative user feedback to gain further insight into hand-gesture interaction for cross-space workspace tasks. The goal of this evaluation is therefore not to beat a particular baseline, but to see how people cope with the newly developed approaches when completing a given cross-space content manipulation task.


As a long-term plan, we aim to establish a database platform that collects XR hand-gesture catalogs and provides a reference for universal XR user experience design.

Designed with ❤️ across realities

Davide Zhang © 2016 - 2023. All Rights Reserved.
