News

Jun 17, 2017: Haptic and hand tracking demos at the Open Campus 2017.

Feb-Apr 2017: David Vilela (Mechanical Engineering Laboratory, University of Coruna, Spain) visited our workgroup from February to April. His main work was to compare different intersection calculation methods for collisions, as well as different force models.

Feb 2017: G. Zachmann and J. Teuber visited Mahidol University in Bangkok, Thailand, as part of a delegation from the University of Bremen. The goal of the visit was to foster the cooperation between the two universities and lay the groundwork for future collaborations.

Jun 2016: Radio Bremen visited our lab to film the work of the Creative Unit "Intra-Operative Information" for a news magazine on the local TV station. The film is available at Radio Bremen and on our website.

May 16, 2016: Patrick Lange was honored with the SIGSIM Best PhD Award at the ACM SIGSIM PADS Conference 2016.

Jun 19-21, 2015: G. Zachmann gives invited talk at the DAAD-Stipendiatentreffen in Bremen, Germany.

Jun 2015: Haptic and hand tracking demos at the Open Campus 2015.

Dec 08-10, 2014: ICAT-EGVE 2014 and EuroVR 2014 conferences at the University of Bremen organized by G. Zachmann.

Sep 25-26, 2014: GI VR/AR 2014 conference at the University of Bremen organized by G. Zachmann.

Sep 24-25, 2014: VRIPHYS 2014 conference at the University of Bremen organized by G. Zachmann.

Feb 4, 2014: G. Zachmann gives invited talk on Interaction Metaphors for Collaborative 3D Environments at Learntec.

Jan 2014: G. Zachmann was invited to be a member of the review panel in the Human Brain Project for the competitive call for additional project partners.

Nov 2013: Invited talk at the "Cheffrühstück 2013".

Oct 2013: Dissertation of Rene Weller published in the Springer Series on Touch and Haptic Systems.

Jun 2013: G. Zachmann participated in the Dagstuhl Seminar "Virtual Realities" (13241).

Jun 2013: Haptic and hand tracking demos at the Open Campus 2013.

Jun 2013: Invited talk by Rene Weller at the Symposium für Virtualität und Interaktion 2013 in Heidelberg.

Apr 2013: Rene Weller was honored with the EuroHaptics Ph.D. Award at the IEEE World Haptics Conference 2013.

Jan 2013: Talk by Rene Weller at the graduation ceremony of the University of Bremen.

Oct 2012: Invited talk by G. Zachmann at the DLR VROOS Workshop "Servicing im Weltraum -- Interaktive VR-Technologien zum On-Orbit Servicing" in Oberpfaffenhofen near Munich, Germany.

Oct 2012: Daniel Mohr earned his doctorate in the field of vision-based pose estimation.

Sep 2012: Keynote talk by G. Zachmann at ICEC 2012, the 11th International Conference on Entertainment Computing.

Sep 2012: "Best Paper Award" at GI VR/AR Workshop in Düsseldorf.

Sep 2012: Rene Weller earned his doctorate in the field of collision detection.

Aug 2012: GI-VRAR-Calendar 2013 is available!

Autonomous Surgical Lamps

As part of the Creative Unit "Intra-Operative Information", we are developing algorithms for the autonomous positioning of surgical lamps in open surgery. These algorithms work solely on the input of a single depth camera, which is positioned above the patient during surgery. They identify the operation site (the situs) and all potential occluders, and then move the lamps to avoid occlusions and collisions while keeping lamp movement over time to a minimum.

The basic idea is to take the point cloud given by the depth camera and render it from the perspective of the situs towards the working space of the lamps above the operating table. From this rendering we directly obtain which parts of the lamps' workspace are occluded and which are not. To minimize movement over time, we also use information about past occlusions and movements to position the lamps in areas that are likely to remain unoccluded in the future. The algorithms are arranged in a pipeline that takes the depth image of the depth camera as input, analyzes it to find the situs, and finally outputs the current optimal positions for a given set of lamps. A minimal sketch of the occlusion-map idea follows below.
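The following C++ sketch illustrates the occlusion-map idea in strongly simplified form: the hemisphere of lamp directions above the situs is discretized into an azimuth/elevation grid, every point of the cloud marks its direction cell as occluded, and the lamp is sent to the free cell closest to its current direction. The names, the grid resolution, and the nearest-free-cell strategy are illustrative assumptions, not the actual pipeline code.

#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <vector>

struct Vec3 { float x, y, z; };

constexpr int   kAzBins = 64;          // azimuth cells of the direction grid (assumed resolution)
constexpr int   kElBins = 16;          // elevation cells
constexpr float kPi     = 3.14159265f;

// One boolean per direction cell: true = this direction towards the
// lamp workspace is blocked by some point of the cloud.
using OcclusionMap = std::array<std::array<bool, kAzBins>, kElBins>;

// "Render" the point cloud from the situs: every point above the situs
// occludes the azimuth/elevation cell it lies in.
OcclusionMap buildOcclusionMap(const std::vector<Vec3>& cloud, Vec3 situs)
{
    OcclusionMap occ{};                                // all cells start free
    for (const Vec3& p : cloud) {
        float dx = p.x - situs.x, dy = p.y - situs.y, dz = p.z - situs.z;
        if (dz <= 0.0f) continue;                      // only the hemisphere above
        float az = std::atan2(dy, dx);                 // [-pi, pi]
        float el = std::atan2(dz, std::hypot(dx, dy)); // [0, pi/2)
        int ai = std::min(kAzBins - 1, int((az + kPi) / (2.0f * kPi) * kAzBins));
        int ei = std::min(kElBins - 1, int(el / (kPi / 2.0f) * kElBins));
        occ[ei][ai] = true;
    }
    return occ;
}

// Pick the free cell closest to the lamp's current cell, so the lamp
// moves as little as possible; azimuth wraps around.
bool nearestFreeCell(const OcclusionMap& occ, int curAz, int curEl,
                     int& outAz, int& outEl)
{
    float best = 1e9f;
    bool found = false;
    for (int e = 0; e < kElBins; ++e)
        for (int a = 0; a < kAzBins; ++a) {
            if (occ[e][a]) continue;
            int da = std::abs(a - curAz);
            da = std::min(da, kAzBins - da);
            float d = float(da * da + (e - curEl) * (e - curEl));
            if (d < best) { best = d; outAz = a; outEl = e; found = true; }
        }
    return found;
}

int main()
{
    // Tiny synthetic cloud standing in for one depth-camera frame.
    std::vector<Vec3> cloud = { {0.1f, 0.0f, 0.5f}, {-0.2f, 0.3f, 0.8f} };
    OcclusionMap occ = buildOcclusionMap(cloud, {0.0f, 0.0f, 0.0f});
    int az = 0, el = 0;
    if (nearestFreeCell(occ, /*curAz=*/32, /*curEl=*/8, az, el))
        std::printf("move lamp to cell az=%d, el=%d\n", az, el);
    return 0;
}

A real implementation would additionally keep a history of occlusion maps to prefer cells that have stayed free over time, as described above.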

[Figure: Our pipeline with illustrations of the outputs of some stages (left); a screenshot of our testing and visualization environment with input data from a real surgery (right).]

Publications

Videos on YouTube

Download the video: ASuLa_inDepth.mp4

Video on local TV station

Download the video: ButenUnBinnen_CU-IOI.mp4
Source: Radio Bremen, Buten un Binnen

RGB-D Data

The following downloadable data is a recording of a complete, open, abdominal surgery. It was recorded with a Microsoft Kinect v2, which was mounted directly above the patient, using a custom recording program. The recordings are stored in HDF5 files which can be read using the appropriate HDF5 library and the following C++ class: header, source. They are additionally compressed with gzip to save server space. Each file contains at most 27000 frames, which corresponds to roughly 16-17 minutes of recording.
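As a concrete starting point, the following sketch reads a single depth frame with the official HDF5 C++ API (H5Cpp.h). The dataset path "/frames/0/depth" and the 2D, 16-bit layout are assumptions made for this illustration; the authoritative layout is defined by the C++ class linked above. If a file is gzip-compressed on disk (.h5.gz), decompress it first, e.g. with gunzip.

#include <cstdint>
#include <iostream>
#include <vector>
#include "H5Cpp.h"

int main()
{
    // Open a (decompressed) recording; file name and dataset path are
    // illustrative assumptions, see the reader class for the real layout.
    H5::H5File file("recording.h5", H5F_ACC_RDONLY);
    H5::DataSet ds = file.openDataSet("/frames/0/depth");

    // Query the frame size; Kinect v2 depth images are 512x424 pixels
    // with 16 bits per pixel (depth in millimeters).
    H5::DataSpace space = ds.getSpace();
    hsize_t dims[2] = {0, 0};
    space.getSimpleExtentDims(dims, nullptr);

    std::vector<uint16_t> depth(dims[0] * dims[1]);
    ds.read(depth.data(), H5::PredType::NATIVE_UINT16);

    std::cout << "read " << dims[0] << "x" << dims[1]
              << " depth frame, first value: " << depth[0] << " mm\n";
    return 0;
}

Compile against the HDF5 C++ bindings, e.g. g++ read_frame.cpp -lhdf5_cpp -lhdf5.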

This data may only be used for scientific purposes. If you publish research that uses this material, please cite the above paper. Also, please contact us; we are always interested to hear what others are doing with this data.

As this is real-life data, there are long stretches of the recording where a surgical lamp obstructs most of the Kinect's field of view. As a rule of thumb: the smaller the compressed file size, the larger and longer the obstruction of the field of view.

 

This work was partially supported by the grant Creative Unit - Intra-Operative Information.