Welcome to the Virtual Reality & Immersive Visualization Group at RWTH Aachen University!

The Virtual Reality and Immersive Visualization Group started in 1998 as a service team at the RWTH IT Center. Since 2015, we have been a research group (Lehr- und Forschungsgebiet) at i12 within the Computer Science Department. Moreover, the group is a member of the Visual Computing Institute and continues to be an integral part of the RWTH IT Center.

In a unique combination of research, teaching, services, and infrastructure, we provide Virtual Reality technologies and the underlying methodology as a powerful tool for scientific-technological applications.

In terms of basic research, we develop advanced methods and algorithms for multimodal 3D user interfaces and explorative analyses in virtual environments. Furthermore, we focus on application-driven, interdisciplinary research in collaboration with RWTH Aachen institutes, Forschungszentrum Jülich, research institutions worldwide, and partners from business and industry, covering fields like simulation science, production technology, neuroscience, and medicine.

To this end, we are members of / associated with the following institutes and facilities:

Our offices are located in the RWTH IT Center, where we operate one of the largest Virtual Reality labs worldwide. The aixCAVE, a 30 sqm visualization chamber, makes it possible to interactively explore virtual worlds and is open for use by any RWTH Aachen research group.

News

Daniel Zielasko receives doctoral degree from RWTH Aachen University

Today, our colleague Daniel Zielasko successfully passed his Ph.D. defense and received a doctoral degree from RWTH Aachen University for his thesis on "DeskVR: Seamless Integration of Virtual Reality into Desk-based Data Analysis Workflows". Congratulations!

Feb. 21, 2020

If you are interested in a student worker position dealing with context menus in immersive virtual environments, click here.

Oct. 11, 2019

M.Sc. Networked Production Engineering

Networking is a tool. Thus, the highly interdisciplinary Master program "Networked Production Engineering" (NPE) enables students to obtain technology-related qualifications for our increasingly networked world of work. Three specializations are offered: Additive Manufacturing, Smart Factory, and E-Mobility. Virtual Reality, as a smart technology that takes production to the networked level, is one of many important aspects here.

Interested in finding out more? Watch NPE's image film here and see our aixCAVE at 1:09.

Oct. 11, 2019

Sebastian Pick receives doctoral degree from RWTH Aachen University

Today, our colleague Sebastian Pick successfully passed his Ph.D. defense and received a doctoral degree from RWTH Aachen University for his thesis on "Interactive Data Annotation for Virtual Reality Applications". Congratulations!

July 15, 2019

Successful Presentations at ISC'19

At this year's International Supercomputing Conference (ISC) exhibition in Frankfurt, scientists of JARA-HPC presented an application based on Unreal Engine for visualizing a direct numerical simulation of early flame kernel development. In the interactive exhibit, visitors of the booth were provided with stereo glasses and flysticks to interactively explore the combustion simulation projected onto the display wall. The application gathered interest from a wide range of people, from young visitors interested in the use of game engines for scientific visualization to domain scientists asking about the specifics of the direct numerical simulation on display. The responses were overall positive and included valuable feedback for further development. A recurring theme was the extension of the immersive visualization environment to arbitrary datasets.

June 20, 2019

Astronauts Cassidy and Hansen Visiting Virtual AMS-02

The astronauts Christopher J. Cassidy (US) and Jeremy R. Hansen (Canada) are preparing for one of the most demanding missions in space: Replacing the cooling system at the Alpha Magnetic Spectrometer AMS-02 on the International Space Station ISS in at least five intensive space operations.
As the new cooling system was developed by RWTH, in close collaboration with the Massachusetts Institute of Technology MIT, NASA, and other partners, the astronauts visited Aachen to become familiar with the cooling system's technology. During their stay, they also visited our aixCAVE to experience a full-scale, virtual AMS-02, a demo prepared by Prof. Roßmann's Institute for Man-Machine Interaction.
News regarding this spectacular visit can be found, e.g., here (University news: visit announcement), here (University news: visit retrospective), and here (video by Aachener Nachrichten, in German).

May 21, 2019

Recent Publications

High-Fidelity Point-Based Rendering of Large-Scale 3D Scan Datasets

IEEE Computer Graphics and Applications

Digitalization of 3D objects and scenes using modern depth sensors and high-resolution RGB cameras enables the preservation of human cultural artifacts at an unprecedented level of detail. Interactive visualization of these large datasets, however, is challenging without degradation in visual fidelity. A common solution is to fit the dataset into available video memory by downsampling and compression. The achievable reproduction accuracy is thereby limited for interactive scenarios, such as immersive exploration in Virtual Reality (VR). This degradation in visual realism ultimately hinders the effective communication of human cultural knowledge. This article presents a method to render 3D scan datasets with minimal loss of visual fidelity. A point-based rendering approach visualizes scan data as a dense splat cloud. For improved surface approximation of thin and sparsely sampled objects, we propose oriented 3D ellipsoids as rendering primitives. To render massive texture datasets, we present a virtual texturing system that dynamically loads required image data. It is paired with a single-pass page prediction method that minimizes visible texturing artifacts. Our system renders a challenging dataset in the order of 70 million points and a texture size of 1.2 terabytes consistently at 90 frames per second in stereoscopic VR.
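The single-pass page prediction mentioned in the abstract can be illustrated with a minimal sketch (not the authors' implementation; all names and the distance-based visibility test are simplifying assumptions): visibility is evaluated at the camera position extrapolated to the next frame, so that texture pages can be requested before they become visible.

```python
# Hedged sketch of predictive texture-page loading for a splat cloud.
# Each splat is (position, texture_page_id); visibility is approximated
# by a simple distance test instead of a real view frustum.

def visible_pages(splats, camera_pos, view_radius):
    """Return the set of texture page ids referenced by splats near the camera."""
    needed = set()
    for pos, page_id in splats:
        dist = sum((p - c) ** 2 for p, c in zip(pos, camera_pos)) ** 0.5
        if dist <= view_radius:
            needed.add(page_id)
    return needed

def pages_to_load(splats, camera_pos, camera_vel, view_radius, dt=1 / 90):
    """Single-pass prediction: evaluate visibility at the camera position
    extrapolated by one frame (dt matches a 90 fps target) and request
    the pages needed there."""
    predicted = tuple(c + v * dt for c, v in zip(camera_pos, camera_vel))
    return visible_pages(splats, predicted, view_radius)
```

In a real system the returned page set would be diffed against the pages already resident in video memory, and only the missing ones streamed in.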

Calibratio - A Small, Low-Cost, Fully Automated Motion-to-Photon Measurement Device

10th Workshop on Software Engineering and Architectures for Realtime Interactive Systems (SEARIS), 2020

Since the beginning of the design and implementation of virtual environments, these systems have been built to give users the best possible experience. One detrimental factor for the user experience was shown to be a high end-to-end latency, here measured as motion-to-photon latency, of the system. Thus, a lot of past research focused on the measurement and minimization of this latency in virtual environments. Most existing measurement techniques require either expensive measurement hardware like an oscilloscope, mechanical components like a pendulum, or manual evaluation of samples. This paper proposes a concept for an easy-to-build, low-cost device consisting of a microcontroller, a servo motor, and a photodiode to measure the motion-to-photon latency in virtual reality environments fully automatically. It is placed on or attached to the system, calibrates itself, and is controlled and monitored via a web interface. While the general concept is applicable to a variety of VR technologies, this paper focuses on the context of CAVE-like systems.
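The measurement principle can be sketched as follows (this is an illustrative mock, not the Calibratio firmware; the hardware callbacks and threshold are assumptions): trigger a motion with the servo, record the trigger time, then wait until the photodiode registers the corresponding brightness change on the display. The elapsed time is the motion-to-photon latency.

```python
# Hedged sketch of a motion-to-photon latency measurement loop.
# trigger_motion and read_photodiode stand in for real hardware access.
import time

def measure_latency(trigger_motion, read_photodiode, threshold, timeout=1.0):
    """Return the motion-to-photon latency in seconds, or None on timeout."""
    t0 = time.perf_counter()
    trigger_motion()                       # e.g. step the servo holding the tracked target
    while time.perf_counter() - t0 < timeout:
        if read_photodiode() > threshold:  # display has reacted to the motion
            return time.perf_counter() - t0
    return None
```

A real device would repeat this measurement many times and report the distribution, since end-to-end latency varies with frame timing.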

Towards a Graphical User Interface for Exploring and Fine-Tuning Crowd Simulations

IEEE Virtual Humans and Crowds for Immersive Environments (VHCIE), 2020

Simulating realistic navigation of virtual pedestrians through virtual environments is a recurring subject of investigation. The various mathematical approaches used to compute the pedestrians' paths result, among other things, in different computation times and varying path characteristics. Customizable parameters, e.g., maximal walking speed or minimal interpersonal distance, add another level of complexity. Thus, choosing the best-fitting approach for a given environment and use case is non-trivial, especially for novice users. To facilitate the informed choice of a specific algorithm with a certain parameter set, crowd simulation frameworks such as Menge provide an extendable collection of approaches with a unified interface. However, they often lack an elaborate visualization with high informative value, accompanied by visual analysis methods to explore the complete simulation data in more detail, which is required for an informed choice. Benchmarking suites such as SteerBench are a helpful approach as they objectively analyze crowd simulations, but they are tailored too closely to specific behavior details. To this end, we propose a preliminary design of an advanced graphical user interface providing a 2D and 3D visualization of the crowd simulation data as well as features for time navigation and overall data exploration.
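To make the role of such parameters concrete, here is a deliberately simple, illustrative pedestrian step (not Menge's or SteerBench's API; the repulsion rule and all names are assumptions) showing how maximal walking speed and minimal interpersonal distance shape the resulting path:

```python
# Minimal sketch: one pedestrian steps towards its goal, steering away
# from neighbors that come closer than min_dist, with speed clamped to
# the max_speed parameter.
import math

def step(pos, goal, neighbors, max_speed, min_dist, dt=0.1):
    """Return the pedestrian's position after one time step of length dt."""
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    norm = math.hypot(dx, dy) or 1.0
    vx, vy = dx / norm * max_speed, dy / norm * max_speed
    for nx, ny in neighbors:
        ox, oy = pos[0] - nx, pos[1] - ny
        d = math.hypot(ox, oy)
        if 0 < d < min_dist:               # push away from a too-close neighbor
            vx += ox / d * max_speed
            vy += oy / d * max_speed
    speed = math.hypot(vx, vy)
    if speed > max_speed:                  # enforce the walking-speed parameter
        vx, vy = vx / speed * max_speed, vy / speed * max_speed
    return (pos[0] + vx * dt, pos[1] + vy * dt)
```

Even in this toy model, changing max_speed or min_dist visibly alters the trajectories, which is exactly the kind of effect the proposed interface is meant to let users explore.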
