A virtual reality project about gaze-controlled robotic arms for people with tetraplegia, using eye tracking and assisted manipulation.
Spinal cord injuries affect approximately 15.4 million people worldwide, and tetraplegia can lead to partial or complete loss of function in all four limbs. For many people, everyday tasks such as opening a pill bottle or pouring a glass of water become difficult.
Wheel2VR is a master's project at Universität Bremen. We built a virtual reality simulation of a wheelchair-mounted robotic arm system (WMRA) that can be controlled through gaze, either with eye tracking or head tracking.
The system includes two six-degree-of-freedom robotic arms based on the Kinova JACO design. One arm follows direct user input, while the second can suggest supportive actions in assisted mode. The full interaction takes place in a Unity-based VR environment.
Watch how users control two robotic arms through gaze interaction while completing tasks in a virtual kitchen.
The interaction pipeline turns gaze input into robotic movement, with a preview before each action.
The HTC Vive Pro Eye with an integrated Tobii sensor tracks eye movements or head direction. An XR ray is cast along the gaze direction to detect interactive objects in the virtual scene. When the user looks at an object, the system registers it.
After a 150 ms onset delay that filters out accidental fixations, a dwell timer begins. If the user keeps looking at the same target for 1.5 seconds, the selection is confirmed. Visual feedback expands around the cursor as the timer progresses.
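The onset-delay-plus-dwell logic described above can be sketched as a small per-frame state machine. This is an illustrative Python sketch, not the project's Unity implementation; the class and constant names are our own, with the 150 ms and 1.5 s values taken from the text.

```python
from dataclasses import dataclass

# Timing values from the interaction description (illustrative constants).
ONSET_DELAY = 0.15   # seconds before the dwell timer starts counting
DWELL_TIME = 1.5     # seconds of sustained gaze needed to confirm

@dataclass
class DwellSelector:
    """Tracks gaze on a single target and reports selection progress."""
    target: object = None
    gaze_time: float = 0.0

    def update(self, looked_at, dt):
        """Call once per frame with the currently gazed object and frame time.

        Returns selection progress in [0, 1]; 1.0 means the selection
        is confirmed. Progress drives the expanding cursor feedback.
        """
        if looked_at is not self.target:
            # Gaze moved to a different object: restart onset and dwell timing.
            self.target = looked_at
            self.gaze_time = 0.0
            return 0.0
        if looked_at is None:
            return 0.0
        self.gaze_time += dt
        # Time spent inside the onset delay does not count toward the dwell.
        dwell = max(0.0, self.gaze_time - ONSET_DELAY)
        return min(1.0, dwell / DWELL_TIME)
```

Resetting the timer whenever the gazed object changes is what makes brief, accidental glances at other targets harmless: progress only accumulates on one stable target at a time.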
Before any arm moves, a semi-transparent ghost copy shows the planned position. Manual and assisted previews use different colors so the user can distinguish them clearly. Nothing happens until the action is confirmed with gaze.
After the user looks at the confirm button, the robotic arm moves to the previewed pose using inverse kinematics. The IK solver translates the spatial target into joint configurations for the arm.
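The JACO-style arms in the project are 6-DOF and use a full IK solver, but the core idea of turning a spatial target into joint angles can be shown on a toy 2-link planar arm with a closed-form solution. This is a simplified analogue, not the system's solver; all names and link lengths are illustrative.

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Analytic IK for a planar 2-link arm.

    Returns (shoulder, elbow) angles in radians that place the end
    effector at (x, y), or None if the target is out of reach.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target lies outside the arm's workspace
    elbow = math.acos(c2)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1=1.0, l2=1.0):
    """Forward kinematics, useful to verify an IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

The same contract applies in the full system: the user only ever specifies the target pose, and the solver is responsible for producing a valid joint configuration or rejecting unreachable targets.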
In assisted mode, a behaviour tree controls the second arm. It observes the current task state, proposes supportive actions such as unscrewing a bottle cap, and waits for confirmation before acting.
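The observe-propose-confirm loop of the assisted arm can be sketched with a minimal behaviour tree: a condition node checks the task state, and an action node stays in a RUNNING state (the preview phase) until the user confirms. This is a hypothetical sketch of the pattern, not the project's tree; node and key names are our own.

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 0
    FAILURE = 1
    RUNNING = 2

class Sequence:
    """Ticks children in order; stops at the first non-SUCCESS result."""
    def __init__(self, *children):
        self.children = children
    def tick(self, state):
        for child in self.children:
            result = child.tick(state)
            if result is not Status.SUCCESS:
                return result
        return Status.SUCCESS

class Condition:
    """Wraps a predicate over the observed task state."""
    def __init__(self, predicate):
        self.predicate = predicate
    def tick(self, state):
        return Status.SUCCESS if self.predicate(state) else Status.FAILURE

class Action:
    """Previews an effect and only executes it after user confirmation."""
    def __init__(self, effect):
        self.effect = effect
    def tick(self, state):
        if not state.get("confirmed"):
            return Status.RUNNING  # preview shown, waiting for gaze confirmation
        self.effect(state)
        return Status.SUCCESS

# Hypothetical assisted-mode subtree: unscrew the cap once the bottle is held.
def unscrew(state):
    state["cap_removed"] = True

assist_tree = Sequence(
    Condition(lambda s: s.get("bottle_grasped", False)),
    Action(unscrew),
)
```

Gating every action node on an explicit confirmation flag is what keeps the assisted arm from acting autonomously: the tree can propose, but only the user's gaze confirmation lets it execute.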
Integrated Tobii eye tracking via the HTC Vive Pro Eye enables hands-free interaction. Objects are selected through sustained gaze and dwell-based activation.
For users who prefer or require it, head-direction tracking provides an alternative way to control robotic arm movements.
Two 6-DOF robotic arms based on the Kinova JACO design work together. One is controlled directly by the user, and the other can support the task in assisted mode.
Every movement is previewed as a transparent ghost before execution. Different colors distinguish manual actions from assisted actions.
A behaviour-tree system proposes supportive actions for the second arm, such as opening a bottle cap while the first arm holds the bottle. The user confirms each action with gaze.
Users control a single spatial target, and the IK solver computes the joint configurations. This avoids the need to manipulate individual joints directly.
We conducted a controlled lab study with 18 participants to evaluate usability, workload, and preferences across different interaction conditions.
Our within-subjects study compared manual control, where both arms were user-operated, with assisted control, where one arm supported the task. Each participant completed a pill-bottle task that included grabbing the bottle, removing the cap, and dispensing medication.
Participants first completed a familiarization phase with both eye tracking and head tracking, then chose their preferred input mode for the main task. The order of manual and assisted conditions was randomized to reduce learning effects.
We collected NASA-TLX workload scores, System Usability Scale (SUS) ratings, task completion times, and qualitative interview feedback after each condition for direct comparison.
Eye Tracking + Manual, Eye Tracking + AI, Head Tracking + Manual, Head Tracking + AI
Integrated Tobii eye tracking sensor for precise gaze detection while seated and without controllers
Standardized questionnaires after each condition for paired workload and usability comparison
Realistic assistive scenario: grab bottle, remove cap, and pour pills with precise bimanual coordination
Master Project · Universität Bremen · 2026
Wheel2VR implements a virtual reality simulation of a wheelchair-mounted robotic arm system. Users control two six-degree-of-freedom robotic arms based on the Kinova JACO design through gaze input, either with eye tracking or head tracking. The system supports two control modes: a fully manual mode where the user operates both arms sequentially, and an assisted mode where one arm proposes supportive actions through a behaviour tree. A preview-and-confirm interaction pattern helps maintain user control. We conducted a within-subjects user study (N=18) to evaluate usability, workload, and preferences across the different interaction conditions using NASA-TLX, SUS, and qualitative interviews.
The slide deck for the project presentation as a standalone PDF.
Preview the poster directly on the page or open the PDF in a separate tab.
Universität Bremen
AG CGVR