Tencent VR Dock

In 2023, I worked as an interaction designer in Tencent’s XR Department. Tencent was preparing to launch its own VR device, which required a system-level launcher. Central to this design was the Dock bar, situated at the base of the virtual space. More than a simple toolbar, the Dock held the VR system’s application icons, letting users switch quickly between applications and functions. Alongside it sat ‘All Apps’, a hub where users could find and manage all of their applications. My responsibility was clear: design the interaction for the Dock, the application icons, and ‘All Apps’.

 

But every project has its challenges. Our main hurdle? The input method. Most VR devices of the time relied on handheld controllers – Sony’s PlayStation Move for PlayStation VR, Oculus Touch, and the HTC Vive wands – which offered reliably intuitive interactions. Tencent, however, wanted to innovate with hand tracking. And hand tracking wasn’t without issues.

 

In traditional GUIs, users have a clear roadmap: actions are visible, interfaces are legible, and feedback is immediate. Gestures, promising as they might seem, were far less predictable in VR. Without tactile cues such as weight or resistance, even simple gestures could feel disconnected. Some tasks, like changing an object’s appearance, demanded unfamiliar hand movements. And even intuitive gestures sometimes went unrecognised by the software.

 

Our takeaways?
1. VR interactions should resemble real-world actions.
2. Given the potential nuances of the interface, users needed clear guidance and immediate feedback for their gestures.

 

Setting Our Direction
To set our VR device apart from others in the market, we decided to reimagine the Dock and application icons in 3D rather than traditional 2D. They were designed as interactive 3D models within the VR environment. Because we knew the challenges of hand tracking, providing clear visual and auditory feedback was crucial. This wasn’t just about aesthetics; it was about creating an intuitive and immersive user experience.

Intuitive Gestures

The principle of intuitive gestures, like pressing and pinching, is rooted in the natural interactions we have with our hands. The design of UI affordances directs how users should interact with 3D objects. For instance, a button is designed to encourage users to press it using their index fingertip, while grabbing a smaller object may naturally invite a pinch between the thumb and index finger. To streamline user interaction, only these two basic hand gestures – press and pinch – are utilised for two core actions: launching applications and adjusting icon positions.
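The two-gesture, two-action mapping above can be sketched as a tiny dispatcher. This is a hypothetical illustration, not the production system; the names `Gesture`, `handle_dock_gesture`, and the returned action strings are all assumptions made for the sketch.

```python
from enum import Enum, auto

class Gesture(Enum):
    PRESS = auto()   # index fingertip pushes through an icon's front face
    PINCH = auto()   # thumb and index finger close around an icon

def handle_dock_gesture(gesture: Gesture, icon: str) -> str:
    """Map the two core gestures to the two core actions."""
    if gesture is Gesture.PRESS:
        return f"launch:{icon}"   # pressing launches the application
    if gesture is Gesture.PINCH:
        return f"move:{icon}"     # pinching enters reposition mode
    return "ignore"               # any other input is ignored
```

Restricting the vocabulary to two gestures keeps the recognition problem small, which matters when, as noted above, even intuitive gestures can go unrecognised.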

Dock Icons: Press and Pinch Interaction Test

Visual Affordance-Based Manipulation

The bounding box signals to users that the object is interactive and conveys its scale. The expansion and contraction of the frame follow a universally recognised pattern, reminiscent of pressing a switch. The frame also communicates the object’s full extent, even when parts of the 3D model are occluded or sit outside the currently active area. Without this cue, a three-dimensional icon attached to another object or surface could create the illusion of unexpected surrounding space. It is essential that visual affordances align seamlessly with user behaviour.

Launching an Application from the 3D Icon on the Dock

Surface diffusion is used to portray the pressed state. When a fingertip first makes contact with the front side of the bounding box, the Touch Point appears as a single dot. As the press deepens, the dot expands until, at full depth, it covers the entire front side. A second indicator is the Base Surface: its transparency tracks the depth of the press, shifting from fully transparent to fully opaque. Together, these visuals help users gauge how far to press and compensate for the absence of tactile feedback in gesture-based controls.
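The depth-to-feedback mapping can be expressed as a small pure function. This is a minimal sketch under assumptions: press depth is normalised to 0–1, and both the Touch Point radius and the Base Surface opacity respond linearly to it (the actual response curve in the shipped system is not specified in this write-up).

```python
def press_feedback(depth: float, max_radius: float = 1.0) -> tuple:
    """Map normalised press depth (0..1) to visual feedback values.

    depth 0.0 -> fingertip just touching: tiny dot, transparent base
    depth 1.0 -> full press: dot covers the front face, opaque base
    """
    d = max(0.0, min(1.0, depth))        # clamp to the valid range
    touch_point_radius = d * max_radius  # dot grows with press depth
    base_opacity = d                     # transparency tracks depth
    return touch_point_radius, base_opacity
```

Clamping matters in practice: a tracked fingertip can overshoot the back of the bounding box, and the feedback should saturate rather than keep growing.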

Repositioning an Icon on the Dock

When users pinch an icon, the previously faint boundary frame becomes more pronounced, its lines thickening. This amplifies the object’s sense of volume and bolsters the user’s confidence in the interaction, reinforcing the feeling of having secured a hold on the object.

2D Icon and All Apps

In addition to the default 3D icons displayed on the Dock, we had to consider the form and interaction of application icons within the ‘All Apps’ window. Unlike the Dock, ‘All Apps’ opens as a flat 2D interface. The significant challenge, then, was making the drag of a 2D icon from this window onto the 3D Dock feel smooth, so that the transformation from 2D to 3D reads as cohesive and seamless.

Structure of 2D Icon

In its default state, ‘All Apps’ looks largely conventional: 2D graphics with their names beneath them. The transformation begins in the hover state, when the boundary frame reveals itself as the user’s hand approaches an icon. Structurally, a 2D icon closely mirrors its 3D counterpart, except that the flattened 3D model takes the place of the Base Surface. 2D icons also lack the floating name bar, since names are displayed permanently within the window. This design retains the conventional understanding of iconography while extending the interaction model of the 3D icons, minimising the learning curve for users.

Interacting with 2D Icons in All Apps

When the user’s hand approaches a 2D icon, a boundary frame appears to indicate its spatial volume. A touch point emerges as the fingertip contacts the frame, expanding with pressure to convey depth. Once it fully envelops the front side, the associated application is launched. For non-immersive apps, the new window replaces the All Apps interface.

Dock Icon Placement Interaction

To place a 2D icon onto the Dock, the user pinches its boundary frame. The frame responds by changing colour and thickening, entering movement mode. The original icon greys out, and a 3D shortcut version appears inside the frame, following the user’s hand. As the hand nears the Dock, the Dock expands to open an empty slot. Upon release, the icon snaps into the slot and reverts to its default state.
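The snap-on-release behaviour can be sketched as a nearest-slot check. This is a hypothetical one-dimensional simplification (real Dock placement is spatial); the function name, the dock-local coordinates, and the `snap_range` threshold are all assumptions made for illustration.

```python
def snap_to_dock(release_x: float, slot_centers: list, snap_range: float = 0.5):
    """Return the index of the Dock slot the icon snaps into, or None.

    release_x    -- horizontal hand position on release (dock-local units)
    slot_centers -- x-coordinates of the open slots on the Dock
    snap_range   -- how close the hand must be for the snap to trigger
    """
    if not slot_centers:
        return None
    nearest = min(range(len(slot_centers)),
                  key=lambda i: abs(slot_centers[i] - release_x))
    if abs(slot_centers[nearest] - release_x) <= snap_range:
        return nearest   # icon snaps into this slot
    return None          # too far from the Dock: placement is cancelled
```

A forgiving `snap_range` is one way to absorb the positional jitter of hand tracking: the user only has to release near a slot, not exactly on it.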

Conclusion

This project explores how intuitive gesture-based interaction and spatial UI design can enhance user experience in a VR system. Tasked with designing the Dock launcher for Tencent’s in-house VR platform, I focused on solving challenges unique to hand-tracking—such as the lack of tactile feedback, spatial uncertainty, and gesture recognition stability—while balancing usability and immersion. Through iterative prototyping and refinement, I developed a gesture-driven system that simplifies navigation and reinforces user confidence, using minimal input yet offering clear visual and spatial feedback. The seamless transition between 2D and 3D icons, along with responsive UI affordances, helped ensure a low learning curve and a more natural interaction flow.

 

The VR Dock is more than just a toolbar—it is the entry point to the virtual world. It organises information, guides emotion, and shapes the first impression of a system’s usability. This project deepened my understanding that usability in virtual environments depends not only on technology, but on how well we translate human perception and instinct into digital space.