Master Project · Universität Bremen 2026

Wheel2VR

A virtual reality project on gaze-controlled robotic arms for people with tetraplegia, combining eye tracking, head tracking, and AI-assisted manipulation.

Reimagining Assistive Robotics
Through Virtual Reality

Spinal cord injuries affect approximately 15.4 million people worldwide, and tetraplegia can lead to partial or complete loss of function in all four limbs. For many people, everyday tasks such as opening a pill bottle or pouring a glass of water become difficult.

Wheel2VR is a master project at Universität Bremen. We built a virtual reality simulation of a wheelchair-mounted robotic arm system (WMRA) that can be controlled through gaze, either with eye tracking or head tracking.

The system includes two six-degree-of-freedom robotic arms based on the Kinova JACO design. One arm follows direct user input, while the second can suggest supportive actions in assisted mode. The full interaction takes place in a Unity-based VR environment.

18
Participants
4
Conditions Tested
2
Robot Arms
6
Degrees of Freedom
The Vision

What if your eyes could move a robot arm?

See Wheel2VR in Action

Watch how users control two robotic arms through gaze interaction while completing tasks in a virtual kitchen.

From Gaze to Grasp

The interaction pipeline turns gaze input into robotic movement, with a preview before each action.

01

Gaze Detection

The HTC Vive Pro Eye with an integrated Tobii sensor tracks eye movements or head direction. An XR ray is cast along the gaze direction to detect interactive objects in the virtual scene. When the user looks at an interactive object, the system registers it as a selection candidate.
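
To make this concrete, here is a minimal Unity C# sketch of the gaze raycast, assuming the eye-tracking SDK exposes a world-space gaze ray each frame. The IGazeProvider interface, the "Interactive" layer name, and all member names are illustrative assumptions, not the project's actual code.

```csharp
using UnityEngine;

// Hypothetical adapter around the eye-tracking SDK (or head pose as fallback).
public interface IGazeProvider
{
    // Returns false when no valid gaze sample is available (e.g., during a blink).
    bool TryGetGazeRay(out Ray gazeRay);
}

public class GazeRaycaster : MonoBehaviour
{
    public float maxDistance = 10f;     // how far into the scene the gaze ray reaches
    private IGazeProvider gazeProvider; // assigned at startup by the input setup

    // The interactive object currently under the user's gaze, if any.
    public GameObject CurrentTarget { get; private set; }

    void Update()
    {
        CurrentTarget = null;
        if (gazeProvider == null || !gazeProvider.TryGetGazeRay(out Ray ray))
            return;

        // Only objects on the "Interactive" layer count as gaze targets.
        int mask = LayerMask.GetMask("Interactive");
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance, mask))
            CurrentTarget = hit.collider.gameObject;
    }
}
```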

02

Dwell Selection

After a 150 ms onset delay that filters out accidental fixations, a dwell timer begins. If the user keeps looking at the same target for 1.5 seconds, the selection is confirmed. Visual feedback expands around the cursor as the timer progresses.
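
A sketch of that timing logic, under the same illustrative assumptions as the raycast above (the 150 ms and 1.5 s values come from the description; everything else is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.Events;

public class DwellSelector : MonoBehaviour
{
    public float onsetDelay = 0.15f;   // 150 ms before the dwell timer starts
    public float dwellTime  = 1.5f;    // sustained gaze needed to confirm
    public GazeRaycaster gaze;         // from the sketch above
    public UnityEvent<GameObject> onSelected;

    private GameObject lastTarget;
    private float timer;

    // 0..1 progress, used to drive the expanding cursor feedback.
    public float Progress => Mathf.Clamp01((timer - onsetDelay) / dwellTime);

    void Update()
    {
        GameObject target = gaze != null ? gaze.CurrentTarget : null;

        if (target != lastTarget)   // gaze moved to a new target (or away)
        {
            lastTarget = target;
            timer = 0f;
            return;
        }
        if (target == null) return;

        timer += Time.deltaTime;
        if (timer >= onsetDelay + dwellTime)
        {
            onSelected?.Invoke(target);  // selection confirmed
            timer = 0f;                  // reset so it does not re-trigger immediately
        }
    }
}
```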

03

Ghost Preview

Before any arm moves, a semi-transparent ghost copy shows the planned position. Manual and assisted previews use different colors so the user can distinguish them clearly. Nothing happens until the action is confirmed with gaze.
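
One possible shape for that preview, again as an illustrative sketch rather than the project's code: the gripper is cloned, stripped of physics, and tinted. The material handling assumes a shader with a standard color property, and the two colors are placeholders.

```csharp
using UnityEngine;

public class GhostPreview : MonoBehaviour
{
    // Illustrative colors; the project distinguishes manual vs. assisted previews.
    public Color manualColor   = new Color(0.0f, 0.6f, 1.0f, 0.35f);
    public Color assistedColor = new Color(1.0f, 0.7f, 0.0f, 0.35f);

    private GameObject ghost;

    public void Show(GameObject gripper, Pose plannedPose, bool assisted)
    {
        Hide();
        ghost = Instantiate(gripper, plannedPose.position, plannedPose.rotation);

        // Strip colliders so the ghost is purely visual and cannot interact.
        foreach (var c in ghost.GetComponentsInChildren<Collider>())
            Destroy(c);

        // Tint every renderer; assumes the shader exposes a standard color property.
        foreach (var r in ghost.GetComponentsInChildren<Renderer>())
            r.material.color = assisted ? assistedColor : manualColor;
    }

    public void Hide()
    {
        if (ghost != null) Destroy(ghost);
    }
}
```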

04

Confirm & Execute

After the user looks at the confirm button, the robotic arm moves to the previewed pose using inverse kinematics. The IK solver translates the spatial target into joint configurations for the arm.
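
The project does not specify which solver it uses, so as a stand-in, here is a classic cyclic coordinate descent (CCD) sketch that shows how a single spatial target becomes joint rotations for a 6-DOF chain:

```csharp
using UnityEngine;

public class CcdIk : MonoBehaviour
{
    public Transform[] joints;       // base..wrist; six entries for a JACO-style arm
    public Transform endEffector;    // gripper tip
    public int iterations = 10;
    public float tolerance = 0.005f; // metres

    public void Solve(Vector3 target)
    {
        for (int i = 0; i < iterations; i++)
        {
            // Walk from the wrist back to the base, rotating each joint so the
            // end effector swings toward the target.
            for (int j = joints.Length - 1; j >= 0; j--)
            {
                Vector3 toEnd    = endEffector.position - joints[j].position;
                Vector3 toTarget = target - joints[j].position;
                joints[j].rotation =
                    Quaternion.FromToRotation(toEnd, toTarget) * joints[j].rotation;
            }
            if ((endEffector.position - target).sqrMagnitude < tolerance * tolerance)
                break;
        }
    }
}
```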

05

AI Assistance

In assisted mode, a behaviour tree controls the second arm. It observes the current task state, proposes supportive actions such as unscrewing a bottle cap, and waits for confirmation before acting.
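
Behaviour trees compose small condition and action nodes; a minimal sketch of the pattern follows (node names and the bottle-cap example are illustrative, not the project's actual tree). Keeping the confirmation as its own node lets the tree propose an action and idle in Running until the user's gaze confirms it, which preserves the preview-and-confirm pattern.

```csharp
using System;

public enum Status { Success, Failure, Running }

public abstract class Node
{
    public abstract Status Tick();
}

// Runs children in order; fails or keeps running as soon as a child does.
public class Sequence : Node
{
    private readonly Node[] children;
    public Sequence(params Node[] children) { this.children = children; }

    public override Status Tick()
    {
        foreach (var child in children)
        {
            var s = child.Tick();
            if (s != Status.Success) return s;
        }
        return Status.Success;
    }
}

// Leaf node wrapping a condition or action as a delegate.
public class Leaf : Node
{
    private readonly Func<Status> fn;
    public Leaf(Func<Status> fn) { this.fn = fn; }
    public override Status Tick() => fn();
}

// Illustrative tree: propose unscrewing the cap only while the user's arm
// holds the bottle, then wait (Running) for gaze confirmation before acting.
// var tree = new Sequence(
//     new Leaf(() => UserArmHoldsBottle() ? Status.Success : Status.Failure),
//     new Leaf(ShowAssistedGhostPreview),
//     new Leaf(WaitForGazeConfirm),
//     new Leaf(ExecuteUnscrewCap));
```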

Key Capabilities

Two arms controlled through one gaze-based interface.

Accessible Interaction for Dual-Arm Control

Eye Tracking Control

Integrated Tobii eye tracking via the HTC Vive Pro Eye enables hands-free interaction. Objects are selected through sustained gaze and dwell-based activation.

Head Tracking Alternative

For users who prefer or require it, head-direction tracking provides an alternative way to control robotic arm movements.

Dual Robotic Arms

Two 6-DOF robotic arms based on the Kinova JACO design work together. One is controlled directly by the user, and the other can support the task in assisted mode.

Ghost Preview System

Every movement is previewed as a transparent ghost before execution. Different colors distinguish manual actions from assisted actions.

AI-Assisted Mode

A behaviour-tree system proposes supportive actions for the second arm, such as opening a bottle cap while the first arm holds the bottle. The user confirms each action with gaze.

Inverse Kinematics

Users control a single spatial target, and the IK solver computes the joint configurations. This avoids the need to manipulate individual joints directly.

Results from the User Study

We conducted a controlled lab study with 18 participants to evaluate usability, workload, and preferences across different interaction conditions.

Our within-subjects study compared manual control, where both arms were user-operated, with assisted control, where one arm supported the task. Each participant completed a pill-bottle task that included grabbing the bottle, removing the cap, and dispensing medication.

Participants first completed a familiarization phase with both eye tracking and head tracking, then chose their preferred input mode for the main task. The order of manual and assisted conditions was randomized to reduce learning effects.

We collected NASA-TLX workload scores, System Usability Scale (SUS) ratings, task completion times, and qualitative interview feedback after each condition for direct comparison.

🎯 4 Interaction Conditions

Eye Tracking + Manual, Eye Tracking + AI, Head Tracking + Manual, Head Tracking + AI

🥽 HTC Vive Pro Eye

Integrated Tobii eye tracking sensor for precise gaze detection while seated and without controllers

📊 NASA-TLX & SUS

Standardized questionnaires after each condition for paired workload and usability comparison

💊 Pill Bottle Task

Realistic assistive scenario: grab bottle, remove cap, and pour pills with precise bimanual coordination

Research

From project idea to published results

The Paper

Wheel2VR: Gaze-Controlled Dual Robotic Arms for Assistive Wheelchair Interaction in Virtual Reality

Master Project · Universität Bremen · 2026

Ahmed Seyit Küçük · Dmitry Tschernobai · Hannah Köper · Xinyun Yu · Jayon K Vinod · Ahmed Ibrahim · Arne Winter

Wheel2VR implements a virtual reality simulation of a wheelchair-mounted robotic arm system. Users control two six-degree-of-freedom robotic arms based on the Kinova JACO design through gaze input, either with eye tracking or head tracking. The system supports two control modes: a fully manual mode where the user operates both arms sequentially, and an assisted mode where one arm proposes supportive actions through a behaviour tree. A preview-and-confirm interaction pattern helps maintain user control. We conducted a within-subjects user study (N=18) to evaluate usability, workload, and preferences across the different interaction conditions using NASA-TLX, SUS, and qualitative interviews.

Presentation & Poster

Project Presentation

The slide deck for the project presentation as a standalone PDF.

Research Poster

Preview the poster directly on the page or open the PDF in a separate tab.

Meet the Team

Ahmed Seyit Küçük

Universität Bremen

Dmitry Tschernobai

Universität Bremen

Hannah Köper

Universität Bremen

Xinyun Yu

Universität Bremen

Jayon K Vinod

Universität Bremen

Ahmed Ibrahim

Universität Bremen

Arne Winter

Universität Bremen

Project Supervisors

Prof. Dr. Gabriel Zachmann

AG CGVR

Dr. René Weller

AG CGVR