
Publications


 

Wayfinding in Immersive Virtual Environments as Social Activity Supported by Virtual Agents


Andrea Bönsch, Jonathan Ehret, Daniel Rupp, Torsten Wolfgang Kuhlen
Frontiers in Virtual Reality, Section Virtual Reality and Human Behaviour

Effective navigation and interaction within immersive virtual environments rely on thorough scene exploration. Wayfinding is therefore essential, assisting users in comprehending their surroundings, planning routes, and making informed decisions. As real-life observations show, wayfinding is not only a cognitive process but also a social activity profoundly influenced by the presence and behaviors of others. In virtual environments, these 'others' are virtual agents (VAs), defined as anthropomorphic computer-controlled characters, who enliven the environment and can serve as background characters or direct interaction partners. However, little research has explored how to use VAs efficiently as social wayfinding support. In this paper, we assess and contrast user experience, user comfort, and the acquisition of scene knowledge through a between-subjects study involving n = 60 participants across three distinct wayfinding conditions in a slightly populated urban environment: (i) unsupported wayfinding, (ii) strong social wayfinding using a virtual supporter who combines guiding and accompanying elements and directly impacts the participants' wayfinding decisions, and (iii) weak social wayfinding using flows of VAs that subtly influence the participants' wayfinding decisions through their locomotion behavior. Our work is the first to compare the impact of VAs' behavior in virtual reality on users' scene exploration, including spatial awareness, scene comprehension, and comfort. The results show the general utility of social wayfinding support, while underscoring the superiority of the strong type. Nevertheless, further exploration of weak social wayfinding as a promising technique is needed. Thus, our work contributes to the enhancement of VAs as advanced user interfaces, increasing user acceptance and usability.

» Show BibTeX

@article{Boensch2024,
title={Wayfinding in Immersive Virtual Environments as Social Activity Supported by Virtual Agents},
author={B{\"o}nsch, Andrea and Ehret, Jonathan and Rupp, Daniel and Kuhlen, Torsten W.},
journal={Frontiers in Virtual Reality},
volume={4},
year={2024},
pages={1334795},
publisher={Frontiers},
doi={10.3389/frvir.2023.1334795}
}





Virtual Reality as a Tool for Monitoring Additive Manufacturing Processes via Digital Shadows


Daniel Rupp, Torsten Wolfgang Kuhlen, Sven Rarbach, Dominik Wiechel, Jens Pottebaum, Tizia Weidemann, Duc Tranh Tran, Robin Day, Jonas Zielinski, Valentina Koenig, Jan Bremer, Thomas Kosche, Andreas Grimm, Thomas Bergs, Iris Graessler, Tim Weissker
GI VR/AR Workshop 2024, Gesellschaft für Informatik e.V.

We present a data acquisition and visualization pipeline that allows experts to monitor additive manufacturing processes, in particular laser metal deposition with wire (LMD-w) processes, in immersive virtual reality. Our virtual environment consists of a digital shadow of the LMD-w production site enriched with additional measurement data shown on both static and handheld virtual displays. Users can explore the production site via enhanced teleportation capabilities that enable them to change their scale as well as their elevation above the ground plane. In an exploratory user study with 22 participants, we demonstrate that our system is generally suitable for the supervision of LMD-w processes while generating low task load and cybersickness. It therefore serves as a first promising step towards the successful application of virtual reality technology in the comparatively young field of additive manufacturing.




Semi-Automated Guided Teleportation through Immersive Virtual Environments


Tim Weissker, Marius Meier-Krueger, Pauline Bimberg, Robert Lindeman, Torsten Wolfgang Kuhlen
Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology

Immersive knowledge spaces like museums or cultural sites are often explored by traversing pre-defined paths that are curated to unfold a specific educational narrative. To support this type of guided exploration in VR, we present a semi-automated, hands-free path traversal technique based on teleportation that features a slow-paced interaction workflow targeted at fostering knowledge acquisition and maintaining spatial awareness. In an empirical user study with 34 participants, we evaluated two variations of our technique, differing in the presence or absence of intermediate teleportation points between the main points of interest along the route. While visiting additional intermediate points was objectively less efficient, our results indicate significant benefits of this approach regarding the user’s spatial awareness and perception of interface dependability. However, the user’s perception of flow, presence, attractiveness, perspicuity, and stimulation did not differ significantly. The overall positive reception of our approach encourages further research into semi-automated locomotion based on teleportation and provides initial insights into the design space of successful techniques in this domain.
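To illustrate the second study variation, the sketch below shows how intermediate teleport targets could be generated between consecutive points of interest by simple linear interpolation. It is a minimal, hypothetical Python sketch for illustration only; the function name, the 2D coordinates, and the even spacing are assumptions and not the implementation used in the paper.

def expand_path(points_of_interest, intermediates=0):
    """Insert `intermediates` evenly spaced teleport targets between each
    pair of consecutive points of interest along the curated route."""
    path = []
    for (x0, y0), (x1, y1) in zip(points_of_interest, points_of_interest[1:]):
        path.append((x0, y0))
        for k in range(1, intermediates + 1):
            t = k / (intermediates + 1)
            path.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    path.append(points_of_interest[-1])
    return path

# Usage: three points of interest, one intermediate target between each pair.
pois = [(0.0, 0.0), (4.0, 0.0), (4.0, 6.0)]
print(expand_path(pois, intermediates=1))
# [(0.0, 0.0), (2.0, 0.0), (4.0, 0.0), (4.0, 3.0), (4.0, 6.0)]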




A Lecturer’s Voice Quality and its Effect on Memory, Listening Effort, and Perception in a VR Environment


Isabel Sarah Schiller, Carolin Breuer, Lukas Aspöck, Jonathan Ehret, Andrea Bönsch, Torsten Wolfgang Kuhlen, Janina Fels, Sabine Janina Schlittmeier
Scientific Reports

Many lecturers develop voice problems, such as hoarseness. Nevertheless, research on how voice quality influences listeners’ perception, comprehension, and retention of spoken language is limited to a small number of audio-only experiments. We aimed to address this gap by using audio-visual virtual reality (VR) to investigate the impact of a lecturer’s hoarseness on university students’ heard text recall, listening effort, and listening impression. Fifty participants were immersed in a virtual seminar room, where they engaged in a Dual-Task Paradigm. They listened to narratives presented by a virtual female professor, who spoke in either a typical or hoarse voice. Simultaneously, participants performed a secondary task. Results revealed significantly prolonged secondary-task response times with the hoarse voice compared to the typical voice, indicating increased listening effort. Subjectively, participants rated the hoarse voice as more annoying, effortful to listen to, and impeding for their cognitive performance. No effect of voice quality was found on heard text recall, suggesting that, while hoarseness may compromise certain aspects of spoken language processing, this might not necessarily result in reduced information retention. In summary, our findings underscore the importance of promoting vocal health among lecturers, which may contribute to enhanced listening conditions in learning spaces.

» Show BibTeX

@article{Schiller2024,
author = {Isabel S. Schiller and Carolin Breuer and Lukas Aspöck and
Jonathan Ehret and Andrea Bönsch and Torsten W. Kuhlen and Janina Fels and
Sabine J. Schlittmeier},
doi = {10.1038/s41598-024-63097-6},
issn = {2045-2322},
issue = {1},
journal = {Scientific Reports},
keywords = {Audio-visual language processing,Virtual reality,Voice
quality},
month = {5},
pages = {12407},
pmid = {38811832},
title = {A lecturer’s voice quality and its effect on memory, listening
effort, and perception in a VR environment},
volume = {14},
url = {https://www.nature.com/articles/s41598-024-63097-6},
year = {2024},
}





IntenSelect+: Enhancing Score-Based Selection in Virtual Reality


Marcel Krüger, Tim Gerrits, Timon Römer, Torsten Wolfgang Kuhlen, Tim Weissker
IEEE Transactions on Visualization and Computer Graphics

Object selection in virtual environments is one of the most common and recurring interaction tasks. Therefore, the used technique can critically influence a system’s overall efficiency and usability. IntenSelect is a scoring-based selection-by-volume technique that was shown to offer improved selection performance over conventional raycasting in virtual reality. The benefits of this initial method, however, are most pronounced for small spherical objects that converge to a point-like appearance; moreover, the technique is challenging to parameterize and has inherent limitations in terms of flexibility. We present an enhanced version of IntenSelect called IntenSelect+ designed to overcome multiple shortcomings of the original IntenSelect approach. In an empirical within-subjects user study with 42 participants, we compared IntenSelect+ to IntenSelect and conventional raycasting on various complex object configurations motivated by prior work. In addition to replicating the previously shown benefits of IntenSelect over raycasting, our results demonstrate significant advantages of IntenSelect+ over IntenSelect regarding selection performance, task load, and user experience. We, therefore, conclude that IntenSelect+ is a promising enhancement of the original approach that enables faster, more precise, and more comfortable object selection in immersive virtual environments.
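The core idea of score-based selection-by-volume is that, each frame, every candidate object receives an angular score within a scoring cone around the pointing ray, and these per-frame scores are accumulated over time so that the selection stabilizes on the object the user dwells on. The Python sketch below is a simplified, hypothetical illustration of this principle; the parameter values and function names are assumptions and are not taken from IntenSelect or IntenSelect+.

import math

def angular_score(ray_origin, ray_dir, obj_pos, cone_angle_deg=15.0):
    """Per-frame score in [0, 1]: 1.0 on the ray axis, 0.0 at the cone boundary.
    ray_dir is assumed to be normalized."""
    to_obj = [o - r for o, r in zip(obj_pos, ray_origin)]
    dist = math.sqrt(sum(c * c for c in to_obj))
    if dist == 0.0:
        return 1.0
    cos_a = sum(t * d for t, d in zip(to_obj, ray_dir)) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return max(0.0, 1.0 - angle / cone_angle_deg)

def accumulate(scores, frame_scores, growth=0.2, decay=0.9):
    """Temporal accumulation: an object's score grows while it stays close to
    the ray and decays once it leaves, stabilizing the selection candidate."""
    for obj_id in set(scores) | set(frame_scores):
        scores[obj_id] = (scores.get(obj_id, 0.0) * decay
                          + frame_scores.get(obj_id, 0.0) * growth)
    return scores

Each frame, angular_score would be evaluated for all candidate objects, accumulate would update the running scores, and the object with the highest accumulated score above a threshold would be highlighted as the current selection candidate.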

» Show BibTeX

@ARTICLE{10459000,
author={Krüger, Marcel and Gerrits, Tim and Römer, Timon and Kuhlen, Torsten and Weissker, Tim},
journal={IEEE Transactions on Visualization and Computer Graphics},
title={IntenSelect+: Enhancing Score-Based Selection in Virtual Reality},
year={2024},
volume={},
number={},
pages={1-10},
keywords={Visualization;Three-dimensional displays;Task analysis;Usability;Virtual environments;Shape;Engines;Virtual Reality;3D User Interfaces;3D Interaction;Selection;Score-Based Selection;Temporal Selection;IntenSelect}
}





Authentication in Immersive Virtual Environments through Gesture-Based Interaction with a Virtual Agent


Daniel Rupp, Philipp Grießer, Andrea Bönsch, Torsten Wolfgang Kuhlen
2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)

Authentication poses a significant challenge in VR applications, as conventional methods, such as text input for usernames and passwords, prove cumbersome and unnatural in immersive virtual environments. Alternatives such as password managers or two-factor authentication may necessitate users to disengage from the virtual experience by removing their headsets. Consequently, we present an innovative system that utilizes virtual agents (VAs) as interaction partners, enabling users to authenticate naturally through a set of ten gestures, such as high fives, fist bumps, or waving. By combining these gestures, users can create personalized authentications akin to PINs, potentially enhancing security without compromising the immersive experience. To gain first insights into the suitability of this authentication process, we conducted a formal expert review with five participants and compared our system to a virtual keypad authentication approach. While our results show that the effectiveness of a VA-mediated gesture-based authentication system is still limited, they motivate further research in this area.
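Conceptually, the combined gestures form a PIN-like credential. The hypothetical Python sketch below shows how such a gesture sequence could be enrolled and verified as a salted hash; the gesture names, hashing scheme, and iteration count are illustrative assumptions and not the system described in the paper.

import hashlib
import hmac
import os

# Illustrative gesture vocabulary; the paper uses a set of ten gestures.
GESTURES = ["high_five", "fist_bump", "wave", "thumbs_up", "handshake",
            "salute", "clap", "point", "bow", "peace_sign"]

def enroll(gesture_sequence):
    """Store only a salted hash of the gesture sequence, analogous to a hashed PIN."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", "|".join(gesture_sequence).encode(),
                                 salt, 100_000)
    return salt, digest

def authenticate(entered_sequence, salt, stored_digest):
    """Recompute the hash for the entered sequence and compare in constant time."""
    digest = hashlib.pbkdf2_hmac("sha256", "|".join(entered_sequence).encode(),
                                 salt, 100_000)
    return hmac.compare_digest(digest, stored_digest)

# Usage
salt, stored = enroll(["high_five", "wave", "fist_bump", "wave"])
print(authenticate(["high_five", "wave", "fist_bump", "wave"], salt, stored))  # True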




VRScenarioBuilder: Free-Hand Immersive Authoring Tool for Scenario-based Testing of Automated Vehicles


Sevinc Eroglu, Arthur Voigt, Benjamin Weyers, Torsten Wolfgang Kuhlen
2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)

Virtual Reality has become an important medium in the automotive industry, providing engineers with a simulated platform to actively engage with and evaluate realistic driving scenarios for testing and validating automated vehicles. However, engineers are often restricted to using 2D desktop-based tools for designing driving scenarios, which can result in inefficiencies in the development and testing cycles. To this end, we present VRScenarioBuilder, an immersive authoring tool that enables engineers to create and modify dynamic driving scenarios directly in VR using free-hand interactions. Our tool features a natural user interface that enables users to create scenarios by using drag-and-drop building blocks. To evaluate the interface components and interactions, we conducted a user study with VR experts. Our findings highlight the effectiveness and potential improvements of our tool. We have further identified future research directions, such as exploring the spatial arrangement of the interface components and managing lengthy blocks.




Demo: Webcam-based Hand- and Object-Tracking for a Desktop Workspace in Virtual Reality


Sebastian Pape, Jonathan Heinrich Beierle, Torsten Wolfgang Kuhlen, Tim Weissker
ACM Symposium on Spatial User Interaction

As virtual reality overlays the user’s view, challenges arise when interaction with their physical surroundings is still needed. In a seated workspace environment, interaction with the physical surroundings can be essential to enable productive working. Interaction with, e.g., a physical mouse and keyboard can be difficult when no visual reference indicates where they are placed. This demo shows a combination of computer-vision-based marker detection and machine-learning-based hand detection to bring users’ hands and arbitrary objects into VR.

» Show BibTeX

@inproceedings{10.1145/3677386.3688879,
author = {Pape, Sebastian and Beierle, Jonathan Heinrich and Kuhlen, Torsten Wolfgang and Weissker, Tim},
title = {Webcam-based Hand- and Object-Tracking for a Desktop Workspace in Virtual Reality},
year = {2024},
isbn = {9798400710889},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3677386.3688879},
doi = {10.1145/3677386.3688879},
abstract = {As virtual reality overlays the user’s view, challenges arise when interaction with their physical surroundings is still needed. In a seated workspace environment interaction with the physical surroundings can be essential to enable productive working. Interaction with e.g. physical mouse and keyboard can be difficult when no visual reference is given to where they are placed. This demo shows a combination of computer vision-based marker detection with machine-learning-based hand detection to bring users’ hands and arbitrary objects into VR.},
booktitle = {Proceedings of the 2024 ACM Symposium on Spatial User Interaction},
articleno = {64},
numpages = {2},
keywords = {Hand-Tracking, Object-Tracking, Physical Props, Virtual Reality, Webcam},
location = {Trier, Germany},
series = {SUI '24}
}





Come Look at This: Supporting Fluent Transitions between Tightly and Loosely Coupled Collaboration in Social Virtual Reality


Pauline Bimberg, Daniel Zielasko, Benjamin Weyers, Bernd Fröhlich, Tim Weissker
IEEE Transactions on Visualization and Computer Graphics

Collaborative work in social virtual reality often requires an interplay of loosely coupled collaboration from different virtual locations and tightly coupled face-to-face collaboration. Without appropriate system mediation, however, transitioning between these phases requires high navigation and coordination efforts. In this paper, we present an interaction system that allows collaborators in virtual reality to seamlessly switch between different collaboration models known from related work. To this end, we present collaborators with functionalities that let them work on individual sub-tasks in different virtual locations, consult each other using asymmetric interaction patterns while keeping their current location, and temporarily or permanently join each other for face-to-face interaction. We evaluated our methods in a user study with 32 participants working in teams of two. Our quantitative results indicate that delegating the target selection process for a long-distance teleport significantly improves placement accuracy and decreases task load within the team. Our qualitative user feedback shows that our system can be applied to support flexible collaboration. In addition, the proposed interaction sequence received positive evaluations from teams with varying VR experiences.

» Show BibTeX

@ARTICLE{10568966,
author={Bimberg, Pauline and Zielasko, Daniel and Weyers, Benjamin and Froehlich, Bernd and Weissker, Tim},
journal={IEEE Transactions on Visualization and Computer Graphics},
title={Come Look at This: Supporting Fluent Transitions between Tightly and Loosely Coupled Collaboration in Social Virtual Reality},
year={2024},
volume={},
number={},
pages={1-17},
keywords={Collaboration;Virtual environments;Navigation;Task analysis;Virtual reality;Three-dimensional displays;Teleportation;Virtual Reality;3D User Interfaces;Multi-User Environments;Social VR;Groupwork;Collaborative Interfaces},
doi={10.1109/TVCG.2024.3418009}}





Poster: Travel Speed, Spatial Awareness, And Implications for Egocentric Target-Selection-Based Teleportation - A Replication Design


Daniel Zielasko, Tim Weissker, Doug Bowman
Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology

Virtual travel in Virtual Reality experiences is common, offering users the ability to explore expansive virtual spaces. Various interfaces exist for virtual travel, with speed playing a crucial role in user experience and spatial awareness. Teleportation-based interfaces provide instantaneous transitions, whereas continuous and semi-continuous methods vary in speed and control. Prior research by Bowman et al. highlighted the impact of travel speed on spatial awareness, demonstrating that instantaneous travel can lead to user disorientation. However, additional cues, such as visual target selection, can aid in reorientation. This study replicates and extends Bowman’s experiment, investigating the influence of travel speed and visual target cues on spatial orientation.




On the Computation of User Placements for Virtual Formation Adjustments during Group Navigation


Tim Weissker, Matthis Franzgrote, Torsten Wolfgang Kuhlen, Tim Gerrits
2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)

Several group navigation techniques enable a single navigator to control travel for all group members simultaneously in social virtual reality. A key aspect of this process is the ability to rearrange the group into a new formation to facilitate the joint observation of the scene or to avoid obstacles on the way. However, the question of how users should be distributed within the new formation to create an intuitive transition that minimizes disruptions of ongoing social activities is currently not explored. In this paper, we begin to close this gap by introducing four user placement strategies based on mathematical considerations, discussing their benefits and drawbacks, and sketching further novel ideas to approach this topic from different angles in future work. Our work, therefore, contributes to the overarching goal of making group interactions in social virtual reality more intuitive and comfortable for the involved users.
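One intuition behind such placement strategies is to minimize how far each group member appears to be displaced when the formation changes. The Python sketch below illustrates this idea with a brute-force assignment that minimizes the summed travel distance between old user positions and the slots of the new formation; it is a hypothetical example and not one of the four strategies proposed in the paper.

import itertools
import math

def total_distance(users, slots, assignment):
    """Sum of Euclidean distances if users[i] is placed at slots[assignment[i]]."""
    return sum(math.dist(users[i], slots[assignment[i]]) for i in range(len(users)))

def min_displacement_assignment(users, slots):
    """Brute-force over all assignments; feasible for the small group sizes
    typical of social VR (a Hungarian-algorithm solver scales to larger groups)."""
    best, best_cost = None, float("inf")
    for perm in itertools.permutations(range(len(slots)), len(users)):
        cost = total_distance(users, slots, perm)
        if cost < best_cost:
            best, best_cost = perm, cost
    return best, best_cost

# Usage: three users assigned to the three slots of a side-by-side formation.
users = [(0.0, 0.0), (1.0, 0.0), (2.0, 1.0)]
slots = [(5.0, 0.0), (6.0, 0.0), (7.0, 0.0)]
print(min_displacement_assignment(users, slots))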

» Show BibTeX

@INPROCEEDINGS{10536250,
author={Weissker, Tim and Franzgrote, Matthis and Kuhlen, Torsten and Gerrits, Tim},
booktitle={2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
title={On the Computation of User Placements for Virtual Formation Adjustments During Group Navigation},
year={2024},
volume={},
number={},
pages={396-402},
keywords={Three-dimensional displays;Navigation;Conferences;Virtual reality;Human factors;User interfaces;Task analysis;Human-centered computing—Human computer interaction (HCI)—Interaction paradigms—Virtual reality;Human-centered computing—Interaction design—Interaction design theory, concepts and paradigms},
doi={10.1109/VRW62533.2024.00077}}





Try This for Size: Multi-Scale Teleportation in Immersive Virtual Reality


Tim Weissker, Matthis Franzgrote, Torsten Wolfgang Kuhlen
IEEE Transactions on Visualization and Computer Graphics

The ability of a user to adjust their own scale while traveling through virtual environments enables them to inspect tiny features being ant-sized and to gain an overview of the surroundings as a giant. While prior work has almost exclusively focused on steering-based interfaces for multi-scale travel, we present three novel teleportation-based techniques that avoid continuous motion flow to reduce the risk of cybersickness. Our approaches build on the extension of known teleportation workflows and suggest specifying scale adjustments either simultaneously with, as a connected second step after, or separately from the user’s new horizontal position. The results of a two-part user study with 30 participants indicate that the simultaneous and connected specification paradigms are both suitable candidates for effective and comfortable multi-scale teleportation with nuanced individual benefits. Scale specification as a separate mode, on the other hand, was considered less beneficial. We compare our findings to prior research and publish the executable of our user study to facilitate replication and further analyses.
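A common way to realize such scale adjustments is to rescale the tracking space while repositioning it so that the user's head lands exactly at the chosen target. The Python sketch below shows this generic formulation under the assumption that the head's world position equals the tracking-space origin plus the uniformly scaled head offset; it is an illustrative simplification, not the paper's published implementation.

def teleport_with_scale(head_offset, target, new_scale):
    """Return the new tracking-space origin and uniform scale such that the
    user's head (tracked at head_offset within the tracking space) ends up
    at the chosen target position after the teleport.

    Assumed model: world_head = origin + new_scale * head_offset, solved for
    origin. A real implementation would additionally handle orientation and
    keep the user's feet on the ground plane."""
    origin = tuple(t - new_scale * h for t, h in zip(target, head_offset))
    return origin, new_scale

# Usage: teleport to (10, 0.085, 5) while shrinking to ant size (scale 0.05),
# with the head tracked 0.3 m off-center and 1.7 m above the tracking origin.
print(teleport_with_scale((0.3, 1.7, 0.0), (10.0, 1.7 * 0.05, 5.0), 0.05))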

» Show BibTeX

@ARTICLE{10458384,
author={Weissker, Tim and Franzgrote, Matthis and Kuhlen, Torsten},
journal={IEEE Transactions on Visualization and Computer Graphics},
title={Try This for Size: Multi-Scale Teleportation in Immersive Virtual Reality},
year={2024},
volume={30},
number={5},
pages={2298-2308},
keywords={Teleportation;Navigation;Virtual environments;Three-dimensional displays;Visualization;Cybersickness;Collaboration;Virtual Reality;3D User Interfaces;3D Navigation;Head-Mounted Display;Teleportation;Multi-Scale},
doi={10.1109/TVCG.2024.3372043}}





StudyFramework: Comfortably Setting up and Conducting Factorial-Design Studies Using the Unreal Engine


Jonathan Ehret, Andrea Bönsch, Janina Fels, Sabine Janina Schlittmeier, Torsten Wolfgang Kuhlen
To be presented at Open Access Tools (OAT) and Libraries for Virtual Reality Workshop at IEEE Virtual Reality 2024

Setting up and conducting user studies is fundamental to virtual reality research. Yet, these studies are often developed from scratch, which is time-consuming and especially hard and error-prone for novice developers. In this paper, we introduce the StudyFramework, a framework specifically designed to streamline the setup and execution of factorial-design VR-based user studies within the Unreal Engine, significantly enhancing the overall process. We elucidate core concepts such as setup, randomization, the experimenter view, and logging. After utilizing our framework to set up and conduct their respective studies, 11 study developers provided valuable feedback through a structured questionnaire. This feedback was generally positive, highlighting the framework's simplicity and usability, and is discussed in detail.
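To make the kind of bookkeeping such a framework automates more concrete, the Python sketch below generates the full-factorial condition set and a reproducible per-participant condition order; the factor names and the seeding scheme are illustrative assumptions and do not reflect the StudyFramework's actual Unreal Engine API.

import itertools
import random

def factorial_conditions(factors):
    """Full crossing of all factor levels, e.g. a 2 x 3 design yields 6 conditions."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in itertools.product(*(factors[n] for n in names))]

def participant_order(conditions, participant_id):
    """Reproducible per-participant shuffle: the same ID always yields the same
    condition order, which simplifies logging, debugging, and re-runs."""
    rng = random.Random(participant_id)
    order = list(conditions)
    rng.shuffle(order)
    return order

# Usage
factors = {"locomotion": ["teleport", "steering"],
           "visualization": ["none", "static", "animated"]}
conditions = factorial_conditions(factors)
print(participant_order(conditions, participant_id=7))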

» Show BibTeX

@InProceedings{Ehret2024a,
author={Ehret, Jonathan and Bönsch, Andrea and Fels, Janina and
Schlittmeier, Sabine J. and Kuhlen, Torsten W.},
booktitle={2024 IEEE Conference on Virtual Reality and 3D User Interfaces
Abstracts and Workshops (VRW): Workshop "Open Access Tools and Libraries for
Virtual Reality"},
title={StudyFramework: Comfortably Setting up and Conducting
Factorial-Design Studies Using the Unreal Engine},
year={2024}
}





Audiovisual Coherence: Is Embodiment of Background Noise Sources a Necessity?


Jonathan Ehret, Andrea Bönsch, Isabel Sarah Schiller, Carolin Breuer, Lukas Aspöck, Janina Fels, Sabine Janina Schlittmeier, Torsten Wolfgang Kuhlen
To be presented at Workshop on Virtual Humans and Crowds in Immersive Environments (VHCIE) at IEEE Virtual Reality 2024

Exploring the synergy between visual and acoustic cues in virtual reality (VR) is crucial for elevating user engagement and perceived (social) presence. We present a study exploring the necessity and design impact of background sound source visualizations to guide the design of future soundscapes. To this end, we immersed n = 27 participants using a head-mounted display (HMD) within a virtual seminar room with six virtual peers and a virtual female professor. Participants engaged in a dual-task paradigm involving simultaneously listening to the professor and performing a secondary vibrotactile task, followed by recalling the heard speech content. We compared three types of background sound source visualizations in a within-subject design: no visualization, static visualization, and animated visualization. Participants’ subjective ratings indicate the importance of animated background sound source visualization for an optimal coherent audiovisual representation, particularly when embedding peer-emitted sounds. However, despite this subjective preference, audiovisual coherence did not affect participants’ performance in the dual-task paradigm measuring their listening effort.

» Show BibTeX

@InProceedings{Ehret2024b,
author={Ehret, Jonathan and Bönsch, Andrea and Schiller, Isabel S. and
Breuer, Carolin and Aspöck, Lukas and Fels, Janina and Schlittmeier, Sabine
J. and Kuhlen, Torsten W.},
booktitle={2024 IEEE Conference on Virtual Reality and 3D User Interfaces
Abstracts and Workshops (VRW): "Workshop on Virtual Humans and Crowds in
Immersive Environments (VHCIE)"},
title={Audiovisual Coherence: Is Embodiment of Background Noise Sources a
Necessity?},
year={2024}
}





German and Dutch Translations of the Artificial-Social-Agent Questionnaire Instrument for Evaluating Human-Agent Interactions


Nele Albers, Andrea Bönsch, Jonathan Ehret, Boleslav A. Khodakov, Willem-Paul Brinkman
ACM International Conference on Intelligent Virtual Agents (IVA ’24)

Enabling the widespread utilization of the Artificial-Social-Agent (ASA) Questionnaire, a research instrument to comprehensively assess diverse ASA qualities while ensuring comparability, necessitates translations beyond the original English source language questionnaire. We thus present Dutch and German translations of the long and short versions of the ASA Questionnaire and describe the translation challenges we encountered. Summative assessments with 240 English-Dutch and 240 English-German bilingual participants show, on average, excellent correlations (Dutch ICC M = 0.82, SD = 0.07, range [0.58, 0.93]; German ICC M = 0.81, SD = 0.09, range [0.58, 0.94]) with the original long version on the construct and dimension level. Results for the short version show, on average, good correlations (Dutch ICC M = 0.65, SD = 0.12, range [0.39, 0.82]; German ICC M = 0.67, SD = 0.14, range [0.30, 0.91]). We hope these validated translations allow the Dutch and German-speaking populations to evaluate ASAs in their own language.

» Show BibTeX

@InProceedings{Albers2024,
author = {Nele Albers and Andrea Bönsch and Jonathan Ehret and Boleslav A. Khodakov and Willem-Paul Brinkman},
booktitle = {ACM International Conference on Intelligent Virtual Agents (IVA '24)},
title = {German and Dutch Translations of the Artificial-Social-Agent Questionnaire Instrument for Evaluating Human-Agent Interactions},
year = {2024},
organization = {ACM},
pages = {4},
doi = {10.1145/3652988.3673928},
}




Late-Breaking Report: VR-CrowdCraft: Coupling and Advancing Research in Pedestrian Dynamics and Social Virtual Reality


Andrea Bönsch, Maik Boltes, Anna Sieben, Torsten Wolfgang Kuhlen
To be presented at Workshop on Virtual Humans and Crowds in Immersive Environments (VHCIE) at IEEE Virtual Reality 2024

VR-CrowdCraft is a newly formed interdisciplinary initiative dedicated to the convergence and advancement of two distinct yet interconnected research fields: pedestrian dynamics (PD) and social virtual reality (VR). The initiative aims to establish foundational workflows for the systematic integration of PD data obtained from real-life experiments, encompassing scenarios ranging from smaller clusters of approximately ten individuals to larger groups comprising several hundred pedestrians, into immersive virtual environments (IVEs), addressing two crucial goals: (1) advancing pedestrian dynamics analysis and (2) advancing virtual pedestrian behavior towards authentically populated IVEs and new PD experiments. The LBR presentation will focus on goal (1).




