Welcome to the Virtual Reality & Immersive Visualization Group
at RWTH Aachen University!

The Virtual Reality and Immersive Visualization Group started in 1998 as a service team in the RWTH IT Center. Since 2015, we have been a research group (Lehr- und Forschungsgebiet) at i12 within the Computer Science Department. Moreover, the group is a member of the Visual Computing Institute and continues to be an integral part of the RWTH IT Center.

In a unique combination of research, teaching, services and infrastructure, we provide Virtual Reality technologies and the underlying methodology as a powerful tool for scientific-technological applications.

In terms of basic research, we develop advanced methods and algorithms for multimodal 3D user interfaces and explorative analyses in virtual environments. Furthermore, we focus on application-driven, interdisciplinary research in collaboration with RWTH Aachen institutes, Forschungszentrum Jülich, research institutions worldwide, and partners from business and industry, covering fields like simulation science, production technology, neuroscience, and medicine.

To this end, we are members of, or associated with, several institutes and facilities.

Our offices are located in the RWTH IT Center, where we operate one of the largest Virtual Reality labs worldwide. The aixCAVE, a 30 m² visualization chamber, makes it possible to interactively explore virtual worlds and is open for use by any RWTH Aachen research group.


New RASimAs Newsletter

The release of this 5th newsletter comes after the RASimAs project (Regional Anaesthesia Simulator and Assistant) passed its second-year evaluation, again with certified success and a good rating. The contributions of all partners show intensive, interdisciplinary interaction within the entire RASimAs team, which is composed of 11 partners from 10 European countries. The assistant was introduced and demonstrated at the International Winter Symposium on Anesthesia and Perioperative Care Trauma in Leuven. Furthermore, the evaluation of the simulator and the assistant will soon begin at the clinical sites.

Download the newsletter directly from the project website or simply here (2 MB).

May 13, 2016

TGGS visited our VR Lab

On May 4th, 2016, a delegation of the Thai-German Graduate School of Engineering (TGGS) visited our VR Lab. Besides a demonstration of our aixCAVE, options for extending the academic collaboration between TGGS and RWTH were discussed. Have a look here for some impressions.

Thanks to the existing collaboration, one Thai student has already stayed with our research group for an internship and his master thesis. It was a successful time, as the resulting work was accepted at the IEEE Symposium on 3D User Interfaces (paper). In August 2016, another Thai student will join us, again for an internship and a master thesis.

May 9, 2016

New RASimAs Press Release

The developments in our RASimAs (Regional Anaesthesia Simulator and Assistant) EU project, which aims at creating a training and assistance system for a medical procedure, are nearly finished. Viktor Voski, a project partner from the Department of Anaesthesia, Uniklinik RWTH Aachen, visited SINTEF in Trondheim, Norway, and SenseGraphics in Stockholm, Sweden, to perform the final tests on the assistant and simulator prototypes before they are released to the clinical centers for evaluation.

Download the press release directly from the project website or simply here (0.2 MB).

May 2, 2016

Hannover Industry Fair 2016: Experiencing a Biorefinery in VR

The alliance of leading Institutes of Technology in Germany (TU9) is currently presenting projects from different research sectors, such as Industry 4.0, energy systems, and entrepreneurship, at the Hannover Industry Fair (25–29 April 2016). RWTH Aachen University is represented by members of the Cluster of Excellence “Tailor-Made Fuels from Biomass” (CoE TMFB). Due to the rising energy demand and the limited availability of fossil energy resources, this CoE focuses on developing new alternative fuels from biomass that do not compete with the food chain. The Virtual Reality & Immersive Visualization Group supports the TMFB exhibition with a VR application in which booth visitors can immerse themselves in a virtual biorefinery and see the different process steps proposed by the CoE, which are normally inaccessible.

April 25, 2016

Awards at 3DUI

The Best Technote Award at 3DUI 2016 was given to Sebastian Freitag for his paper entitled “Automatic Speed Adjustment for Travel through Immersive Virtual Environments based on Viewpoint Quality”.

Furthermore, an Honorable Mention for Best Technote was given to Andrea Bönsch for her paper entitled “Collision Avoidance in the Presence of a Virtual Agent in Small-Scale Virtual Environments”.

March 21, 2016

Contest chair of 3DUI contest 2016 & Presentations at 3DUI and IEEE VR

Dr.-Ing. Benjamin Weyers is this year's contest chair of the 3DUI Contest in Greenville, South Carolina, USA.

The Virtual Reality and Immersive Visualization Group presents a total of seven papers and two posters at 3DUI and IEEE VR 2016. One additional paper is presented by our cooperation partner, Deutsches Zentrum für Luft- und Raumfahrt, Simulation and Software Technology.

March 21, 2016

Recent Publications

Accurate and adaptive contact modeling for multi-rate multi-point haptic rendering of static and deformable environments

Computers & Graphics (Journal) (2016)

Common approaches for the haptic rendering of complex scenarios employ multi-rate simulation schemes. Here, the collision queries or the simulation of a complex deformable object are often performed asynchronously at a lower frequency, while some kind of intermediate contact representation is used to simulate interactions at the haptic rate. However, this can produce artifacts in the haptic rendering when the contact situation quickly changes and the intermediate representation is not able to reflect the changes due to the lower update rate. We address this problem utilizing a novel contact model. It facilitates the creation of contact representations that are accurate for a large range of motions and multiple simulation time-steps. We handle problematic geometrically convex contact regions using a local convex decomposition and special constraints for convex areas. We combine our accurate contact model with an implicit temporal integration scheme to create an intermediate mechanical contact representation, which reflects the dynamic behavior of the simulated objects. To maintain a haptic real time simulation, the size of the region modeled by the contact representation is automatically adapted to the complexity of the geometry in contact. Moreover, we propose a new iterative solving scheme for the involved constrained dynamics problems. We increase the robustness of our method using techniques from trust region-based optimization. Our approach can be combined with standard methods for the modeling of deformable objects or constraint-based approaches for the modeling of, for instance, friction or joints. We demonstrate its benefits with respect to the simulation accuracy and the quality of the rendered haptic forces in several scenarios with one or more haptic proxies.
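The multi-rate scheme described in the abstract can be illustrated with a toy sketch: a slow simulation step produces an intermediate contact representation (here reduced to a single contact plane), while a fast haptic step renders penalty forces against it. This is a minimal illustration of the general multi-rate idea only, not the paper's method; all names, rates, and the plane representation are illustrative assumptions.

```python
import numpy as np

def update_contact_plane(proxy_pos):
    """Slow step (e.g. ~100 Hz): collision query producing a contact plane
    (normal n, offset d) as the intermediate contact representation.
    Toy scene: a floor at z = 0 (illustrative assumption)."""
    n = np.array([0.0, 0.0, 1.0])
    d = 0.0
    return n, d

def haptic_force(proxy_pos, n, d, stiffness=500.0):
    """Fast step (e.g. ~1 kHz): penalty force against the intermediate plane."""
    penetration = d - np.dot(n, proxy_pos)
    if penetration <= 0.0:
        return np.zeros(3)               # no contact
    return stiffness * penetration * n   # push out along the contact normal

# One slow update serves many fast updates between collision queries:
proxy = np.array([0.0, 0.0, -0.01])      # proxy 1 cm below the floor
n, d = update_contact_plane(proxy)
for _ in range(10):                      # 10 haptic steps per simulation step
    f = haptic_force(proxy, n, d)
# 1 cm penetration at 500 N/m yields a 5 N force along +z.
```

The artifact the paper addresses is visible in this sketch: if the proxy moves past the edge of the real geometry between slow updates, the stale plane still pushes back until the next collision query refreshes it.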


Interactive 3D Force-Directed Edge Bundling

Computer Graphics Forum (Journal) (2016) (to be published)

Interactive analysis of 3D relational data is challenging. A common way of representing such data is with node-link diagrams, as they support analysts in forming a mental model of the data. However, naïve 3D depictions of complex graphs tend to be visually cluttered, even more so than in a 2D layout. This makes graph exploration and data analysis less efficient. This problem can be addressed by edge bundling. We introduce a 3D cluster-based edge bundling algorithm that is inspired by the force-directed edge bundling (FDEB) algorithm [Holten2009] and fulfills the requirements for embedding in an interactive framework for spatial data analysis. It is parallelized, and its runtime scales with the size of the graph. Furthermore, it maintains the edge model and thus supports rendering the graph in different structural styles. We demonstrate this with a graph originating from a simulation of the function of a macaque brain.
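The force-directed idea behind FDEB can be sketched compactly: each edge is subdivided into control points, and each interior point feels a spring force toward its neighbours on the same edge plus an attraction toward corresponding points of other edges. This is a heavily simplified sketch of the classic FDEB scheme, not the group's cluster-based algorithm; in particular, the attraction toward the mean of corresponding points stands in for FDEB's pairwise, compatibility-weighted attraction, and all parameter values are illustrative.

```python
import numpy as np

def bundle(edges, n_sub=8, iters=50, k=0.1, step=0.02):
    """edges: array (E, 2, 3) of 3D endpoints.
    Returns (E, n_sub + 2, 3) bundled polylines; endpoints stay fixed."""
    t = np.linspace(0.0, 1.0, n_sub + 2)
    # Initial subdivision: straight-line interpolation between endpoints.
    pts = edges[:, 0, None, :] * (1 - t)[None, :, None] + \
          edges[:, 1, None, :] * t[None, :, None]
    for _ in range(iters):
        # Spring force pulls each interior point toward its edge neighbours.
        spring = pts[:, :-2] + pts[:, 2:] - 2 * pts[:, 1:-1]
        # Simplified attraction: toward the mean of corresponding points
        # across all edges (stand-in for FDEB's pairwise attraction).
        mean = pts[:, 1:-1].mean(axis=0, keepdims=True)
        attract = mean - pts[:, 1:-1]
        pts[:, 1:-1] += step * (k * spring + attract)
    return pts

# Two parallel edges bundle toward each other in the middle:
edges = np.array([[[0, 0, 0], [1, 0, 0]],
                  [[0, 1, 0], [1, 1, 0]]], dtype=float)
polylines = bundle(edges)
```

After a few dozen iterations the interior control points of the two edges have moved toward each other while the endpoints remain anchored, which is the visual effect that reduces clutter in dense node-link diagrams.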


Visual Quality Adjustment for Volume Rendering in a Head-Tracked Virtual Environment

IEEE Transactions on Visualization and Computer Graphics (Journal) (2016)

To avoid simulator sickness and improve presence in immersive virtual environments (IVEs), high frame rates and low latency are required. In contrast, volume rendering applications typically strive for high visual quality, which induces high computational load and, thus, leads to low frame rates. To evaluate this trade-off in IVEs, we conducted a controlled user study with 53 participants. Search and count tasks were performed in a CAVE under varying volume rendering conditions, which were applied according to viewer position updates from head tracking. The results of our study indicate that participants preferred the rendering condition with continuous adjustment of the visual quality over an instantaneous adjustment, which guaranteed low latency, and over no adjustment, which provided constant high visual quality but rather low frame rates. Within the continuous condition, participants showed the best task performance and felt less disturbed by effects of the visualization during movements. Our findings provide a good basis for further evaluations of how to accelerate volume rendering in IVEs according to users' preferences.
