Mid-air Interaction

Gestures and interaction design in autostereoscopic 3D
In the summer of 2018, I worked as a research intern with the Alibaba NHCI Lab in the Bay Area. We explored novel interactions to define futuristic shopping experiences and serve Alibaba customers across its various businesses. My project validated mid-air interactions with autostereoscopic 3D objects. We went through design, prototyping, and user study phases, working closely with the business team so that conceptual designs could quickly be turned into something that gathers feedback from the marketplace.


SeeFront 3D monitor
The SeeFront autostereoscopic 3D monitor tracks the user's eyes and projects a different view to each eye, so the user sees 3D objects pop out of the screen due to parallax.
Leap Motion
With Leap Motion, the user's hand gestures are captured to interact with objects in mid-air.
Ultrahaptics, an ultrasound device that projects tactile sensations onto the user's hand, provides touch feedback when users interact with mid-air virtual objects.

User Scenario Design

With the business goal in mind, we started by thinking through the characteristic features of mid-air interaction:


We used two cameras in Unity 3D to generate parallax views for the left and right eyes, displayed on the autostereoscopic screen.
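The core idea behind the two-camera setup can be sketched as follows. This is an illustrative Python sketch, not our actual Unity C# code: the function name, vectors, and the 63 mm interaxial distance are assumptions for the example.

```python
# Sketch: position two virtual cameras a small distance apart along the
# viewer's inter-eye axis, so each renders a slightly offset (parallax) view.

def stereo_camera_positions(center, right_dir, interaxial=0.063):
    """Offset two virtual cameras along the head's right vector.

    center: (x, y, z) midpoint between the eyes
    right_dir: unit vector pointing from the left eye toward the right eye
    interaxial: eye separation in meters (~63 mm is a common average)
    """
    half = interaxial / 2.0
    left = tuple(c - half * r for c, r in zip(center, right_dir))
    right = tuple(c + half * r for c, r in zip(center, right_dir))
    return left, right

# Example: viewer centered on the screen, half a meter away.
left, right = stereo_camera_positions((0.0, 0.0, -0.5), (1.0, 0.0, 0.0))
```

In Unity this corresponds to two Camera objects whose positions follow the tracked eye coordinates every frame, each rendering to one half of the interleaved output the SeeFront display expects.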
We programmed on top of Leap Motion to define gestures beyond the standard library. Rotation in particular required trying several implementations to work around interrupted gestures and detection inaccuracy.
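One way to make a rotation gesture robust to jitter and brief tracking dropouts is to accumulate the per-frame change in hand roll, smooth it, and ignore changes below a dead zone. The sketch below illustrates that idea in Python; it is not the Leap Motion SDK API, and the class name, smoothing factor, and thresholds are assumptions for the example.

```python
import math

class RotationGesture:
    """Accumulate hand roll across frames, smoothing per-frame deltas with
    an exponential moving average (EMA) and ignoring jitter below a dead zone."""

    def __init__(self, alpha=0.3, deadzone_deg=2.0):
        self.alpha = alpha                         # EMA smoothing factor in (0, 1]
        self.deadzone = math.radians(deadzone_deg) # ignore deltas smaller than this
        self.smoothed_delta = 0.0
        self.total = 0.0
        self.prev_roll = None

    def update(self, roll):
        """Feed the current palm roll angle (radians); returns the
        accumulated rotation to apply to the virtual object."""
        if self.prev_roll is None:
            self.prev_roll = roll
            return self.total
        # Wrap the per-frame delta into (-pi, pi] so a jump across the
        # +/-pi boundary doesn't read as a huge rotation.
        delta = math.atan2(math.sin(roll - self.prev_roll),
                           math.cos(roll - self.prev_roll))
        self.prev_roll = roll
        self.smoothed_delta += self.alpha * (delta - self.smoothed_delta)
        if abs(self.smoothed_delta) > self.deadzone:
            self.total += self.smoothed_delta
        return self.total
```

Tuning `alpha` trades responsiveness against stability: a low value hides tracker noise but makes the object lag the hand, which is exactly the kind of trade-off we iterated on.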
To map virtual objects to the real world, we read the eye-position data and placed the haptic cue exactly at the virtual object's apparent position, according to where the user was.
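Geometrically, an object that pops out of the screen appears along the ray from the viewer's eyes through its on-screen position, so the ultrasound focal point must sit on that ray at the object's apparent depth. A minimal sketch of that mapping, assuming a shared real-world frame with the screen plane at z = 0 and the viewer at z < 0 (the function name and coordinates are hypothetical):

```python
def haptic_focus_point(eye_mid, screen_point, popout):
    """Place the haptic focal point where the virtual object appears.

    eye_mid: (x, y, z) midpoint between the tracked eyes
    screen_point: (x, y, z) on-screen position of the object (z = 0)
    popout: how far the object floats out of the screen, in meters
    """
    # Parametrize the ray P(t) = screen_point + t * (eye_mid - screen_point),
    # then pick t so the point sits `popout` meters in front of the screen.
    direction = tuple(e - s for e, s in zip(eye_mid, screen_point))
    t = -popout / direction[2]  # direction[2] is negative (toward the viewer)
    return tuple(s + t * d for s, d in zip(screen_point, direction))
```

Because the eye position changes the apparent location of the object, the focal point has to be recomputed from the tracked eye data every frame, which is why the haptic cue follows the user's head movement.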

User Study

We designed two user studies to understand how users perceive a virtual object's position.
Research Questions:
1. Is the perception consistent within one user?
2. Is the perception consistent across users?
3. Does the perception change when touch is involved?


The 15-week internship was quite a unique project experience for me. I had been longing to design experiences around cutting-edge technology, and thanks to this internship, I got the chance to work with a range of new technologies and define future product interactions. The process was a combination of HCI research and product design. Certainly, we faced a lot of ambiguity in both business and technical implementation, and weighing both sides pushed me to actively seek new ways of approaching a problem. One thing we did well was always keeping our goal in mind while staying flexible to change. As a team, we didn't pit design against development; instead we worked to understand the project's boundaries and deliver the best user experience within those constraints. In the end, I gained the ability to thrive and deliver results in changing situations🎉.