Tencent VR Keyboard
Sogou and Tencent’s XR team jointly developed an efficient input method tailored for virtual reality. As the interaction designer on this project, I focused on crafting an intuitive, multi-modal input experience that supports natural communication in immersive environments.
My key contributions included:
Keyboard positioning and spatial layout: I designed the keyboard’s placement and size based on ergonomic principles and spatial relationships between the user’s head and hands, ensuring a comfortable and accessible input posture in VR.
Multi-modal input interaction: I defined support for various input methods, including hand tracking, controller typing, and voice input, allowing users to choose their preferred input style depending on context.
Feedback mechanisms: I implemented visual, auditory, and behavioural feedback (e.g. highlight animations and subtle sound cues) to reinforce user confidence and reduce input uncertainty.
This project was a comprehensive exploration into redefining text input in spatial computing contexts—balancing technical feasibility with usability, and enhancing the VR typing experience through human-centred design.
Feasibility Study | Competing Brands’ Keyboard Analysis

Virtual Keyboard + Raycasting
Virtual Keyboard + Index-Finger Typing
Virtual Keyboard + Physical Keyboard
On Oculus and Pico, the primary input device is a controller, so UIs positioned too close to the user are easily obscured by it. Consequently, UIs are placed more than a metre away on mid-to-far panels and operated via raycasting. This interaction method has a low error rate, and pressing a controller button gives tactile feedback on each click. However, because the controller is somewhat heavy, both comfort and input efficiency are only average.
HoloLens relies on gesture input, so crucial interactive elements must sit within the user’s arm’s reach (about 0.5 m). Only the index fingers are used for input, and there is no tactile feedback, so both input efficiency and comfort during typing are average.
HTC Vive fully maps a physical keyboard into virtual space, making the typing experience in VR indistinguishable from reality. As a result, both comfort and efficiency are at their peak.
Given that Tencent’s VR devices primarily use gesture input and rings with OFN and vibration feedback, directly clicking and pinching interactive elements with the fingers aligns more closely with users’ real-world instincts. To accommodate this input method, all interactive objects should be within a user’s comfortable reach. As such, HoloLens’ close-range virtual keyboard with gesture input offers a valuable reference. However, while HoloLens operates in MR, our device is VR-based, so we must also consider the visual differences between VR’s UI and MR’s semi-transparent projections.
Feasibility Study | Academic Conclusions
Based on the paper Performance Envelopes of Virtual Keyboard Text Input Strategies in Virtual Reality, we summarised the following key conclusions and their implications for the layout design:

Use the standard QWERTY layout
Type with the index fingers only
Avoid overly wide keyboard layouts
Users’ high familiarity with the standard QWERTY layout has led to its continued use in the XR environment.
Most users struggle to type effectively in mid-air with all ten fingers: compared with typing using only the two index fingers, speed is lower and the error rate is higher.
The variance of touch errors is typically higher on the x-axis than on the y-axis, and it increases towards the edges.
Hence, Tencent’s VR input method adopts the QWERTY layout users already know. During the design process, we avoid making the main typing area overly wide, and the keyboard is positioned within easy reach so that users can comfortably type with both index fingers. Because there is no passive tactile feedback, auditory and visual cues are as crucial as appropriate placement and layout.
Positioning
The virtual keyboard is positioned 0.45 metres away from the user and is 0.55 metres lower than the headset position, tilted at a 40-degree angle towards the user.
The radius of 0.45 m is the target distance for direct hand interaction; within this zone, users can type efficiently by moving their forearms horizontally. As each user’s height varies, to ensure a consistent input experience, the keyboard’s height is designed to be 0.55 metres below the headset position. The 40-degree tilt of the keyboard not only maintains a comfortable typing angle but also ensures a pleasant viewing angle.
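To make the placement rule concrete, here is a minimal sketch of how the keyboard anchor could be derived from the headset pose, assuming the tracking system exposes the head position and horizontal heading; the function name and vector maths are illustrative, not the project’s Unity implementation.

```python
import numpy as np

def keyboard_pose(head_pos, head_yaw_deg):
    """Illustrative placement: 0.45 m in front of the user along the
    horizontal heading, 0.55 m below the headset, tilted 40 degrees
    towards the user (values from the spec above)."""
    yaw = np.radians(head_yaw_deg)
    forward = np.array([np.sin(yaw), 0.0, np.cos(yaw)])   # horizontal heading only
    position = head_pos + 0.45 * forward + np.array([0.0, -0.55, 0.0])
    tilt_deg = 40.0                                        # panel tilted back towards the eyes
    return position, tilt_deg

# Example: headset at 1.6 m facing +z -> keyboard at (0, 1.05, 0.45), tilted 40 degrees
pos, tilt = keyboard_pose(np.array([0.0, 1.6, 0.0]), head_yaw_deg=0.0)
```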
Orthographic view
Axonometric projection

Interaction
Though typing solely with the index fingers has been shown to be more efficient in the absence of a physical keyboard, there is a learning curve for new VR users. When the keyboard is invoked for the first time, a small bright spot appears on each index fingertip, prompting users to type with those fingers. Once the user has successfully typed with both index fingers, the bright spots vanish.
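One way this first-use hint could be wired up, assuming a hand renderer exposes per-fingertip highlight calls; all of the names below are hypothetical rather than the shipped implementation.

```python
class IndexFingerHint:
    """Onboarding hint following the behaviour above: a bright spot is shown on
    each index fingertip when the keyboard first appears, and each spot vanishes
    once that finger has successfully typed a key."""
    FINGERS = ("left_index", "right_index")

    def __init__(self, renderer):
        self.renderer = renderer              # hypothetical hand/fingertip renderer
        self.remaining = set(self.FINGERS)    # fingers that have not yet typed

    def on_keyboard_opened(self):
        for finger in self.remaining:
            self.renderer.show_fingertip_glow(finger)   # hint: type with this finger

    def on_key_typed(self, finger):
        if finger in self.remaining:
            self.remaining.discard(finger)
            self.renderer.hide_fingertip_glow(finger)   # hint no longer needed
```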
Visual Cue & Feedback
In its default state, the keyboard shows letters on a flat plane. As hands approach a key, a boundary frame appears, helping users perceive the button’s volume. When a button is pressed, the boundary frame adjusts, providing a visual depth cue.
When the button is pressed halfway, the ring gives a slight vibration accompanied by the sound of a button click, signalling a successful letter entry.
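A rough sketch of the per-key feedback logic described above, assuming fingertip distance and key travel are available each frame; the thresholds and event hooks are assumptions for illustration.

```python
HOVER_DIST = 0.04     # metres: fingertip close enough to reveal the boundary frame (assumed)
COMMIT_DEPTH = 0.5    # fraction of key travel at which the letter is committed ("pressed halfway")

def update_key(key, fingertip_dist, press_depth, feedback):
    """Per-frame state update for one key: hover shows the boundary frame,
    half-press fires haptic and audio feedback and commits the letter."""
    if key.state == "idle" and fingertip_dist < HOVER_DIST:
        key.state = "hover"
        feedback.show_boundary_frame(key)        # visual depth cue on approach
    elif key.state == "hover" and press_depth >= COMMIT_DEPTH:
        key.state = "pressed"
        feedback.vibrate_ring()                  # slight vibration on the ring
        feedback.play_click_sound()              # audible click
        feedback.commit_letter(key.char)         # letter entered at half-press
    elif key.state != "idle" and fingertip_dist >= HOVER_DIST and press_depth == 0:
        key.state = "idle"
        feedback.hide_boundary_frame(key)        # reset once the finger moves away
```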
VR Keyboard – Demo Test:
Swipe Input
In VR environments, depth perception can be a challenge due to the lack of tactile feedback and the disparity between visual cues and actual physical sensations. Users might struggle to gauge the exact distance or depth of virtual objects, leading to imprecise interactions. Swipe input mitigates this issue by offering a continuous and fluid method of text entry that doesn’t rely heavily on depth precision. By enabling users to glide over keys instead of pressing them individually, it enhances input speed and accuracy whilst reducing the potential for errors and hand fatigue, leading to a more intuitive and comfortable user experience. Consequently, in addition to typing using the index fingers, we also provide a swipe input method.
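A minimal sketch of how a swipe path could be captured before being handed to a word decoder, assuming the keyboard exposes plane-distance and key-lookup queries; the names and threshold are hypothetical.

```python
def capture_swipe(fingertip_samples, keyboard, contact_threshold=0.01):
    """Return the ordered list of distinct keys the fingertip glides over while
    it stays within `contact_threshold` metres of the keyboard plane."""
    path = []
    for sample in fingertip_samples:                       # one sample per tracking frame
        if keyboard.distance_to_plane(sample) > contact_threshold:
            continue                                       # finger lifted: skip this frame
        key = keyboard.key_under(sample)
        if key is not None and (not path or path[-1] != key):
            path.append(key)                               # record each newly entered key once
    return path                                            # passed on to the word decoder
```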
Swipe Input – Demo Test:
Functional Zoning and Detailed Design
Font and Key Size: For close interaction at 0.45 m, the minimum comfortably readable text subtends a visual angle of 0.65°-0.8°, corresponding to a character height of 5.1-6.3 mm, or approximately 15-18 pt in Unity. Button dimensions should exceed 1.6 × 1.6 cm to leave adequate space for both icons and text (a small geometric check is sketched after the layout note below).
Layout: This design employs a zoned keyboard. The primary keyboard in the centre is optimised for usability, greatly simplifying its functions and choices to reduce width and minimise inadvertent touches. The left sidebar is a persistent switch for input modes and language. The extended keyboard on the right offers less common but essential input options, ensuring customisation and flexibility.
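As a quick check of the font-size figures above, the text height that subtends a given visual angle at the 0.45 m interaction distance can be computed with basic geometry; this is a sanity check, not project code.

```python
import math

def text_height_mm(angle_deg, distance_m=0.45):
    """Height of text that subtends `angle_deg` of visual angle at `distance_m`."""
    return 2 * distance_m * 1000 * math.tan(math.radians(angle_deg) / 2)

print(round(text_height_mm(0.65), 1))  # ~5.1 mm
print(round(text_height_mm(0.80), 1))  # ~6.3 mm
```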
Conclusion
This project explores the reimagining of text input within virtual environments, where the absence of physical feedback and the constraints of spatial perception pose significant design challenges. Through gesture-driven interaction, ergonomic layout, and multimodal feedback—including visual, auditory, and vibrotactile cues—the Tencent VR Keyboard aims to deliver an intuitive, efficient, and comfortable input experience in immersive settings.
Drawing on academic research and real-world device comparisons, the design balances familiarity with innovation: from preserving the QWERTY layout to integrating swipe input and optimised spatial positioning. It represents a thoughtful response to the question of how we might type, communicate, and express ourselves in virtual space—not by replicating the physical world, but by extending its logic into new dimensions.