Mid-air Interaction

Gestures and interaction design in Autostereoscopic 3D
In the summer of 2018, I worked with the Alibaba NHCI Lab in the Bay Area as a research intern. We explored innovative interactions to define a futuristic shopping experience and serve Alibaba customers across various businesses. The project I worked on validated mid-air interactions with autostereoscopic 3D objects. We went through design, prototyping, and user study phases, and worked closely with the business team, aiming to quickly turn the conceptual design into something we could get marketplace feedback on.

Devices

SeeFront 3D monitor
The SeeFront autostereoscopic 3D monitor tracks the user's eyes and projects different views to the left and right eyes, so the user sees 3D objects pop out of the screen due to parallax.
Leap Motion
With Leap Motion, the user's hand gestures are captured to interact with objects in mid-air.
Ultrahaptics
Ultrahaptics, an ultrasonic device that acts on the user's hand to give meaningful haptic sensations, provides touch feedback when users interact with mid-air virtual objects.

User Scenario Design

With the business goal in mind, we started by thinking through what features mid-air interaction could offer.

Gesture and Visual Design

(Due to the non-disclosure agreement, I can only share the design process and brainstorming ideas in this post.)
To design interactions with 3D virtual objects, we considered real, physical shopping experiences as well as how people shop through technological interfaces.
Beyond real life, we drew gestures from:
Trackpad
Tablet
Science fiction


To create an immersive experience, we tried to stay close to real-world scenarios for displaying products, rather than using traditional UI elements:
Conveyor belt sushi
Jewelry display case
Visiting a room
Selecting includes both remote selection and direct picking. Apart from examples such as the spotlight in Daydream, we came up with other metaphors unique to our system:
Pulling a string to fetch
Summoning a remote item
Directly touching a nearby item
We used participatory design for the exit gesture: we asked participants to perform any gesture to go back to the home page. The results indicate that the following actions are commonly performed:
Push away
Home Button
Phone gesture - swipe up
We ran several rounds of tests to understand how users directly manipulate virtual objects in mid-air. We made some findings when we brought users' hands into the Unity prototype:
1. Since a product's physical attributes may not be uniformly distributed, users expect gravity to play a role.
2. When the user's physical hand approaches, it may confuse eye focus, which affects the stereoscopic effect.
One trade-off we made was to put the pop-out object in a bubble-like container.
We tried two rotation techniques, and in the end we combined them to make the movement match the user's perception:
Follow the hand's movement, as if the ball is attached to the hand.
Swipe the ball, and it rotates in the same direction as the swipe; the faster the user swipes, the more and faster the ball rotates.
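The two techniques can be thought of as one controller: direct "attached" rotation while grabbing, plus swipe-driven momentum that decays over time. Below is an illustrative Python sketch of that combination, not the actual Unity implementation; all names and constants (GRAB_GAIN, SWIPE_GAIN, FRICTION) are assumptions.

```python
GRAB_GAIN = 1.0   # assumed: degrees of rotation per degree of hand movement
SWIPE_GAIN = 0.5  # assumed: degrees per cm/s of swipe speed
FRICTION = 0.92   # assumed: per-frame decay, so a fast swipe keeps spinning briefly

class BallRotation:
    """Combines the two rotation techniques we tried (sketch only)."""

    def __init__(self):
        self.angle = 0.0             # rotation about the vertical axis, degrees
        self.angular_velocity = 0.0  # degrees per frame, from the last swipe

    def on_grab_move(self, hand_delta_deg):
        """Technique 1: the ball follows the hand as if attached."""
        self.angle += GRAB_GAIN * hand_delta_deg
        self.angular_velocity = 0.0  # direct control cancels leftover spin

    def on_swipe(self, swipe_speed_cm_s, direction):
        """Technique 2: rotation speed scales with swipe speed (direction is +1/-1)."""
        self.angular_velocity = direction * SWIPE_GAIN * swipe_speed_cm_s

    def update(self):
        """Once per frame: apply momentum, then let it decay."""
        self.angle += self.angular_velocity
        self.angular_velocity *= FRICTION
```

The key design point this models is that grabbing overrides swiping: as soon as the user takes direct control, any residual spin stops, which matched users' expectations in our tests.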

Prototype

We used two cameras in Unity 3D to generate parallax views for the left and right eyes, displayed on the autostereoscopic screen.
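The two-camera setup amounts to offsetting a pair of virtual cameras from the tracked head position, half the interpupillary distance each way along the head's right axis. A minimal Python sketch of that offset (the real project used Unity/C#; the function name and the IPD value are my assumptions):

```python
IPD = 0.064  # assumed average interpupillary distance, metres

def eye_camera_positions(head_pos, right_axis):
    """Offset two virtual cameras from the tracked head position along the
    head's (unit-length) right axis, half the interpupillary distance each way."""
    half = IPD / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_axis))
    right = tuple(h + half * r for h, r in zip(head_pos, right_axis))
    return left, right
```

Because the SeeFront tracks the eyes continuously, these offsets are recomputed every frame so the parallax stays correct as the viewer moves.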
We programmed the Leap Motion to define gestures beyond the standard library. For rotation especially, we tried different implementations to work around interrupted gestures and detection inaccuracy.
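One simple way to tolerate interrupted gestures is debouncing: a gesture stays active until the detector has missed it for several consecutive frames. This is a sketch of that general idea, not our actual Leap Motion code; the class name and the tolerance value are assumptions.

```python
class DebouncedGesture:
    """Keeps a gesture 'active' across brief tracking dropouts (sketch only)."""

    def __init__(self, hold_frames=5):  # assumed tolerance before the gesture ends
        self.hold_frames = hold_frames
        self.active = False
        self.missing = 0

    def update(self, detected_this_frame):
        """Feed one frame of raw detector output; returns the debounced state."""
        if detected_this_frame:
            self.active = True
            self.missing = 0
        elif self.active:
            self.missing += 1
            if self.missing > self.hold_frames:
                self.active = False
        return self.active
```

The tolerance trades responsiveness for stability: a larger hold window hides more tracking glitches but delays the end of a gesture.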
To map virtual objects to the real world, we read eye-position data and placed the haptic cue right at the virtual object's perceived position, according to where the user is.
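The haptic focal point has to sit where the user perceives the object, not where it is drawn on screen. For a point rendered with crossed disparity, the perceived pop-out depth follows from similar triangles between the two eyes and the screen plane. A rough sketch of that geometry (the function name and the average IPD are my assumptions, not the project's code):

```python
# A point drawn with crossed disparity s on a screen viewed from distance d
# is perceived at depth z in front of the screen, by similar triangles:
#   s / z = IPD / (d - z)   =>   z = d * s / (IPD + s)

def perceived_depth(eye_distance_m, disparity_m, ipd_m=0.064):
    """Perceived pop-out depth (metres in front of the screen) of a
    crossed-disparity point. ipd_m is an assumed average interpupillary distance."""
    return eye_distance_m * disparity_m / (ipd_m + disparity_m)

# The ultrasonic focal point is then placed at that depth, along the line
# from the eye midpoint toward the on-screen image pair.
```

For example, with the viewer 0.6 m from the screen and 2 cm of crossed disparity, the object is perceived roughly 14 cm in front of the screen, which is where the touch cue must be delivered.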

User Study

We designed two user studies to understand how users perceive a virtual object's position.
Research Questions:
1. Is the perception consistent within one user?
2. Is the perception consistent across users?
3. Does the perception change when touch is involved?

Reflection

The 15-week internship was quite a unique project experience for me. I had long wanted to design experiences with cutting-edge technology, and thanks to the internship I got the chance to work with a range of new technologies and define future product interactions. The process was a combination of HCI research and product design. Certainly, we faced a lot of ambiguity in business and technical implementation, and weighing both sides pushed me to think actively about new ways of approaching a problem. One thing we did well was always keeping our goal in mind while staying flexible to changes. As a team, we didn't pit design against development; instead, we understood the boundaries of the project and tried to deliver the best user experience within its constraints. In the end, I gained the ability to thrive and deliver results in changing situations 🎉.