IVOR - The Interactive Virtual Operating Room


About this project

The bachelor's project IVOR - Interactive Virtual Operating Room is a cooperation between the computer graphics group and the human-computer interaction group at the University of Bremen. Our main goal is to develop a system that simplifies processes and interactions in the operating room. To reach this goal, we chose a human-centered approach. Each developer attends a surgery to form their own impression of the situation in the operating room. To get as much feedback from the surgeons as possible, we work in short, fast, consecutive iterations.


The project is supervised by Prof. Dr. Rainer Malaka, Prof. Dr. Gabriel Zachmann, Dr. Marc Herrlich and Jörn Teuber.

Problems in the operating room

After some of our developers visited the operating room at the Klinikum Bremen Mitte, we noticed that it is often difficult for surgeons and other surgical staff to view images previously generated by devices such as ultrasound or CT scanners while operating.

We talked to the surgeons and some experts at the University of Bremen about this. They explained that the so-called "Up-Down" problem is one of the oldest and most common issues: surgeons are not allowed to touch anything non-sterile, so they cannot control the displays themselves. Instead, a non-sterile nurse has to swipe through the images while the surgeon says "Up" and "Down" until they find the right image.

Our idea: No more "Up-Down"

The surgeons confirmed that it would be extremely helpful if they could control the displays themselves. We now want to explore how this could be made possible. We created two interaction metaphors: controlling the displays by foot motion and controlling them by hand gestures. By implementing both, we want to find out which metaphor works better. For the evaluation, we created a virtual operating room.

Inside the virtual operating room, you find a hanging screen displaying a stack of CT images. You can browse through them with the foot and hand motion control systems we developed. The idea behind the interactive virtual operating room is that we can ask experts, real surgeons as well as human-computer interaction professionals, to test our control systems and give us comprehensive feedback without the effort of organizing access to a real operating room.
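
To illustrate the browsing interaction, the following minimal Python sketch shows how scroll input from either control system could be mapped to a slice index in the CT stack. The names (SliceBrowser, apply_scroll) are hypothetical and do not reflect our actual implementation.

    # Minimal sketch of browsing a CT image stack via scroll deltas.
    # SliceBrowser and apply_scroll are hypothetical names, not our real code.

    class SliceBrowser:
        def __init__(self, num_slices):
            self.num_slices = num_slices
            self.index = 0  # currently displayed CT slice

        def apply_scroll(self, delta):
            # Move through the stack by `delta` slices, clamped to the bounds.
            self.index = max(0, min(self.num_slices - 1, self.index + delta))
            return self.index

    browser = SliceBrowser(num_slices=200)
    browser.apply_scroll(+5)   # e.g. a forward foot tilt or an upward hand swipe
    browser.apply_scroll(-1)   # a small corrective step back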

Modules

Since we have three systems to build in this project, we decided to split the work into three modules. The Foot Motion Module takes care of the foot motion control, while the Hand Gesture Module implements the hand gesture control. Finally, the Virtual Reality Module creates the virtual operating room and implements functionality such as movement, head tracking, and haptic feedback.
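
This split works because both input modules can expose the same small interface to the Virtual Reality Module. The following Python sketch illustrates the idea with illustrative names only; it is not our actual module API.

    # Rough sketch of the three-module split. Both input modules emit the
    # same kind of scroll events, so the Virtual Reality Module does not
    # need to know which device produced them.

    from abc import ABC, abstractmethod

    class InputModule(ABC):
        """Common interface for the Foot Motion and Hand Gesture Modules."""

        @abstractmethod
        def poll_scroll_delta(self):
            """Return how many slices to scroll since the last poll."""

    class FootMotionModule(InputModule):
        def poll_scroll_delta(self):
            return 0  # would interpret foot-tracking sensor data here

    class HandGestureModule(InputModule):
        def poll_scroll_delta(self):
            return 0  # would interpret hand-tracking sensor data here

    class VirtualRealityModule:
        """Renders the virtual operating room and drives the CT screen."""

        def __init__(self, input_module):
            self.input_module = input_module

        def update(self, slice_browser):
            # Called once per rendered frame; forwards input to the screen
            # (e.g. the SliceBrowser sketched above).
            slice_browser.apply_scroll(self.input_module.poll_scroll_delta())

    vr = VirtualRealityModule(FootMotionModule())  # or HandGestureModule()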

Evaluation

To test our hypothesis, we designed a user study in which both control methods are tested by surgeons and compared to each other and to the common "Up-Down" method. We created a system that hides three faint, organ-like artifacts in the CT images on the screen. While scrolling through the images, each artifact gradually fades in until it reaches its maximum brightness and then fades out again until it is invisible. The artifacts appear at random positions in the image stack. The surgeons who participated in our study had to find the slice where each artifact was brightest. For further information, take a look at our paper or watch the info video below.
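
As an illustration, the following Python sketch models the fading artifacts with a simple triangular brightness profile. The exact curve and all names here are our assumption for this sketch, not necessarily what the study software used; see the paper for details.

    # Sketch of the study's fading artifacts, assuming a triangular
    # brightness profile over the slice index.

    import random

    def artifact_brightness(slice_index, peak_slice, fade_range):
        # Brightness in [0, 1]: maximal at peak_slice, zero at a distance
        # of fade_range slices or more.
        distance = abs(slice_index - peak_slice)
        return max(0.0, 1.0 - distance / fade_range)

    num_slices, fade_range = 200, 20
    # Three artifacts at random positions in the image stack.
    peaks = [random.randrange(fade_range, num_slices - fade_range)
             for _ in range(3)]

    # The participant's task is to find the slice where an artifact is
    # brightest, i.e. the slice index equal to its peak_slice.
    for s in (peaks[0] - 10, peaks[0], peaks[0] + 5):
        print(s, artifact_brightness(s, peaks[0], fade_range))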

The Team