News

Feb, 2018: The House of Science in Bremen hosts an exhibition about local scientists and science projects with collaborators around the world. One of the featured exhibits is a demo of our Autonomous Surgical Lamps, developed by Jörn Teuber of the Computer Graphics and Virtual Reality group. The exhibition will be open until the 21st of April (photos).

Feb, 2018: The University of Bremen participates in the opening of a research laboratory in Bangkok.

Nov, 2017: VRST 2017 Best Poster Award. Michael Bonfert, Melina Cahnbley, Inga Lehne, Ralf Morawe, Gabriel Zachmann, and Johannes Schöning won the award for their poster "Augmented Invaders: A Mixed Reality Multiplayer Outdoor Game."

Nov, 2017: Organizers of the French VR conference and trade show Laval Virtual immersed themselves in a variety of virtual environments, where they learned about current projects of the Computer Graphics & Virtual Reality lab at the University of Bremen (full report, in German).

Sep, 2017: Founding of Everyday Activity Science and Engineering (EASE). EASE is an interdisciplinary research center at the University of Bremen that investigates the science and engineering of everyday activities. For more information, click here.

Jun 17, 2017: Haptic and hand tracking demos at the Open Campus 2017.

Feb-Apr 2017: David Vilela (Mechanical Engineering Laboratory, University of Coruna, Spain) visited our lab. He worked on benchmarks comparing different intersection-computation methods for collision handling, as well as different force models.

Feb 2017: G. Zachmann and J. Teuber visited Mahidol University in Bangkok, Thailand, as part of a delegation from the University of Bremen. The goal of the visit was to foster cooperation between the two universities and lay the groundwork for future collaborations.

Jun 2016: Radio Bremen visited our lab to film the work of the Creative Unit "Intra-Operative Information" for a news magazine on the local TV station. Click here for the film at Radio Bremen, and here for the same film on our website.

May 16, 2016: Patrick Lange was honored with the SIGSIM Best PhD Award at the ACM SIGSIM PADS Conference 2016.

Jun 19-21, 2015: G. Zachmann gives invited talk at the DAAD-Stipendiatentreffen in Bremen, Germany.

Jun 2015: Haptic and hand tracking demos at the Open Campus 2015.

Dec 08-10, 2014: ICAT-EGVE 2014 and EuroVR 2014 conferences at the University of Bremen organized by G. Zachmann.

Sep 25-26, 2014: GI VR/AR 2014 conference at the University of Bremen organized by G. Zachmann.

Sep 24-25, 2014: VRIPHYS 2014 conference at the University of Bremen organized by G. Zachmann.

Feb 4, 2014: G. Zachmann gives invited talk on Interaction Metaphors for Collaborative 3D Environments at Learntec.

Jan 2014: G. Zachmann was invited to be a member of the Review Panel in the Human Brain Project for the competitive call for additional project partners.

Nov 2013: Invited talk at the "Cheffrühstück 2013".

Oct 2013: PhD thesis of Rene Weller published in the Springer Series on Touch and Haptic Systems.

Jun 2013: G. Zachmann participated in the Dagstuhl Seminar Virtual Realities (13241).

Jun 2013: Haptic and hand tracking demos at the Open Campus 2013.

Jun 2013: Invited talk at Symposium für Virtualität und Interaktion 2013 in Heidelberg by Rene Weller.

Apr 2013: Rene Weller was honored with the EuroHaptics Ph.D. Award at the IEEE World Haptics Conference 2013.

Jan 2013: Talk at the graduation ceremony of the University of Bremen by Rene Weller.

Oct 2012: Invited talk by G. Zachmann at the DLR VROOS Workshop "Servicing im Weltraum -- Interaktive VR-Technologien zum On-Orbit Servicing" in Oberpfaffenhofen near Munich, Germany.

Oct 2012: Daniel Mohr earned his doctorate in the field of vision-based pose estimation.

Sept 2012: G. Zachmann: Keynote Talk at ICEC 2012, 11th International Conference on Entertainment Computing.

Sep 2012: "Best Paper Award" at GI VR/AR Workshop in Düsseldorf.

Sep 2012: Rene Weller earned his doctorate in the field of collision detection.

Aug 2012: GI-VRAR-Calendar 2013 is available!

KaNaRiA

KaNaRiA (from its German acronym: Kognitionsbasierte, autonome Navigation am Beispiel des Ressourcenabbaus im All, i.e., cognition-based autonomous navigation, exemplified by resource mining in space) is a collaborative project of the University of Bremen (Institute for Computer Graphics and Virtual Reality, Institute for Cognitive Neuroinformatics, Institute for Optimization and Optimal Control) and the Universität der Bundeswehr München (Institute of Space Systems and Institute of Space Navigation), funded by the German Aerospace Center (DLR, Deutsches Zentrum für Luft- und Raumfahrt). The terrestrial follow-up project of KaNaRiA, AO-Car, investigates novel autonomous car maneuvers.

The extraction of asteroid resources is of high interest for a large number of upcoming deep-space missions aiming at a combined industrial, commercial, and scientific utilization of space. One main technological enabler for such mission concepts in deep space is on-board autonomy. These mission concepts generally include long cruise phases, multi-body fly-bys, planetary approach and rendezvous, orbiting in a-priori unknown dynamic environments, controlled descent, surface navigation, and precise soft landing, docking, or impacting.

Cooperation Partners

The project comprises two major goals:

In summary, our contributions to KaNaRiA are:

Related Publications

Invited Talks

Videos

Our KaNaRiA image movie, illustrating the main concepts of the KaNaRiA project.
Our visualization of the particle filter used for spacecraft localization.
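The particle filter shown in the video can be sketched, in highly simplified form, as a generic bootstrap filter. The sketch below is one-dimensional with made-up Gaussian motion and measurement models; the state dimension, dynamics, and sensor models actually used in KaNaRiA are not reproduced here.

```python
import math
import random

# Minimal bootstrap particle filter in one dimension. The state is a
# position along the trajectory; motion and measurement models are
# hypothetical Gaussian placeholders, not the models used in KaNaRiA.

def particle_filter_step(particles, weights, control, measurement,
                         process_noise=0.1, meas_noise=0.5):
    """One predict-update-resample cycle of a bootstrap particle filter."""
    n = len(particles)
    # Predict: propagate each particle through the motion model plus noise.
    particles = [p + control + random.gauss(0.0, process_noise)
                 for p in particles]
    # Update: reweight each particle by its Gaussian measurement likelihood.
    weights = [w * math.exp(-0.5 * ((measurement - p) / meas_noise) ** 2)
               for p, w in zip(particles, weights)]
    total = sum(weights) or 1.0  # guard against total weight collapse
    weights = [w / total for w in weights]
    # Resample: draw n particles with probability proportional to weight.
    particles = random.choices(particles, weights=weights, k=n)
    return particles, [1.0 / n] * n
```

The weighted mean of the particles then serves as the position estimate; a practical implementation would resample only when the effective sample size drops, and would carry the full pose and velocity in the state.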

Our visualization of the asteroid main belt and the corresponding cruise-phase operations: optimal trajectories are flown by the spacecraft.
Our visualization of the asteroid main belt and the corresponding cruise-phase operations, showcasing the rendering of 200,000 asteroids.

Proximity-phase operations: visualizing the SLAM approach from the Institute for Cognitive Neuroinformatics.
Our demo overview of the PTCM spacecraft concept, done by the Universität der Bundeswehr München.

Our artistic visualization of the ReDoLa landing sequence.
Our procedurally generated asteroids for arbitrary simulation purposes.

Using our procedurally generated asteroids, camera images for asteroid approaches can be synthesized (close approach).
Using our procedurally generated asteroids, camera images for asteroid approaches can be synthesized (far approach).

Using our procedurally generated asteroids, camera images for asteroid approaches can be synthesized (proximity).
Early demo of the cruise phase, illustrating optimal trajectory computation and spacecraft localization.

Images

Our artistically enhanced PTCM model (left), based on the original design. Our artistically enhanced Itokawa model (right), based on the original Itokawa data.

Itokawa sphere packing for gravity computation and material distribution of asteroids (left). Illustration of our sphere-packing concept (right): arbitrary mass distributions can be approximated with uniformly sized spheres, each carrying a different mass.
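The sphere-packing representation lends itself to a very simple gravity evaluation: outside a sphere, by Newton's shell theorem, the sphere acts exactly like a point mass at its center, so the field of the whole packing is a plain sum of point-mass terms. A minimal sketch, with a hypothetical data layout (a list of (center, mass) pairs) that is not the project's actual code:

```python
import math

# Gravity evaluation over a sphere packing: each packed sphere contributes
# as a point mass at its center (exact for query points outside the sphere,
# by Newton's shell theorem).

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity(query, spheres):
    """Gravitational acceleration at `query`, outside all packed spheres."""
    ax = ay = az = 0.0
    qx, qy, qz = query
    for (cx, cy, cz), mass in spheres:
        dx, dy, dz = cx - qx, cy - qy, cz - qz
        r2 = dx * dx + dy * dy + dz * dz
        r = math.sqrt(r2)
        a = G * mass / r2        # magnitude of this sphere's contribution
        ax += a * dx / r         # unit vector toward the sphere's center
        ay += a * dy / r
        az += a * dz / r
    return ax, ay, az
```

For a single sphere this reduces to the familiar inverse-square law; for a full packing the sum can be accelerated with standard far-field approximations when the number of spheres is large.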
Procedurally generated look-alike asteroids produced by our approach. On the left are the synthesized replicas; on the right, the original models.
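For illustration, an irregular asteroid-like surface in the spirit of these images can be sketched by radially displacing points on a sphere with a few random low-frequency harmonics. This is a toy stand-in; the generator actually used in the project is more sophisticated and not reproduced here.

```python
import math
import random

# Toy procedural asteroid: sample a latitude/longitude grid on the unit
# sphere and displace each point radially by a sum of random harmonics,
# yielding an irregular, asteroid-like blob.

def asteroid_surface(n_lat=32, n_lon=64, seed=42, roughness=0.25):
    rng = random.Random(seed)  # seeded generator: same seed, same asteroid
    # Random harmonic coefficients controlling the bumps.
    harmonics = [(rng.randint(1, 4), rng.randint(1, 4),
                  rng.uniform(0.0, 2.0 * math.pi), rng.uniform(-1.0, 1.0))
                 for _ in range(6)]
    points = []
    for i in range(n_lat):
        theta = math.pi * (i + 0.5) / n_lat          # polar angle
        for j in range(n_lon):
            phi = 2.0 * math.pi * j / n_lon          # azimuth
            r = 1.0
            for kt, kp, phase, amp in harmonics:
                r += roughness * amp * math.sin(kt * theta) \
                     * math.cos(kp * phi + phase)
            points.append((r * math.sin(theta) * math.cos(phi),
                           r * math.sin(theta) * math.sin(phi),
                           r * math.cos(theta)))
    return points
```

Triangulating the grid and adding finer noise octaves and crater imprints would bring such a model closer to the rendered asteroids shown above.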

Rendering of 200,000 asteroids in the cruise phase.

Illustration of the proximity operations; the flown trajectory is shown in green.
Visualizing lidar measurements (red) in the proximity-operation simulation.

Visualizing surface landmarks (light blue) and lidar measurements (red) in the proximity-operation simulation.
Visualizing the uncertainty of the spacecraft SLAM approach as a transparent sphere.

Related theses