Master-Project: Natural User Interaction for Cars

Prof. Dr. Gabriel Zachmann

Master Project: NUI4Cars – Innovative and contact-free user interfaces for cars

If your browser does not support HTML5 video, or if you would like to see an HD version, you can watch the video on YouTube.

Demo day, 11 April 2014

In the foreseeable and not too distant future, contact-free hand tracking and gesture recognition will reach the same level of maturity as current full-body motion capture. Hand, face, and gaze tracking have already been solved to some degree, which enables a brand-new type of human-machine interface, the so-called "Natural User Interface" (NUI). A recent field of research focuses on controlling cars with the help of NUIs. One of the main goals is the reduction of accidents, but various other advantages may emerge.

This project therefore investigates contact-free, multimodal interaction metaphors: gestures, speech, skin resistance, line of sight, etc. Consider the example of adjusting a mirror: the voice command "Adjust mirror" combined with a corresponding hand or finger movement may be used to configure the right side mirror. A driving simulator provides the necessary means for testing by linking these new metaphors to parts of a virtual car.
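The mirror example above can be sketched as a small fusion state machine: the voice command arms the gesture channel, and only then is hand displacement mapped to mirror angles. This is a minimal illustrative sketch, not the project's actual implementation; the class names, the trigger phrases, and the gain value (degrees of mirror rotation per millimetre of hand movement) are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class MirrorState:
    yaw: float = 0.0    # horizontal mirror angle in degrees
    pitch: float = 0.0  # vertical mirror angle in degrees

class MultimodalMirrorControl:
    """Illustrative fusion of voice and gesture input:
    a voice command selects the mirror as the interaction target,
    then subsequent hand displacement is mapped to mirror angles."""

    GAIN = 0.5  # assumed gain: degrees of rotation per mm of hand movement

    def __init__(self):
        self.mirror = MirrorState()
        self.active = False

    def on_voice(self, command: str):
        # The trigger phrase arms the gesture channel; "done" disarms it.
        if command.lower() == "adjust mirror":
            self.active = True
        elif command.lower() == "done":
            self.active = False

    def on_hand_delta(self, dx_mm: float, dy_mm: float):
        # Hand motion is ignored unless the voice command armed the control,
        # so ordinary hand movements cannot move the mirror by accident.
        if not self.active:
            return
        self.mirror.yaw += self.GAIN * dx_mm
        self.mirror.pitch += self.GAIN * dy_mm

ctrl = MultimodalMirrorControl()
ctrl.on_hand_delta(10, 0)     # ignored: no voice trigger yet
ctrl.on_voice("Adjust mirror")
ctrl.on_hand_delta(10, -4)    # hand moves 10 mm right, 4 mm down
print(ctrl.mirror.yaw, ctrl.mirror.pitch)  # 5.0 -2.0
```

Gating the gesture channel on an explicit voice trigger is one simple way to disambiguate intent in a multimodal setup: neither modality alone changes the car's state.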