Improving Essential Interactions for Immersive Virtual Environments with Novel Hand Gesture Authoring Tools

Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) are on their way into everyday life. The recent emergence of consumer-friendly hardware for accessing these technologies has greatly benefited the community. Research and application examples for AR, VR, and MR can be found in many fields, such as medicine, sports, cultural heritage, teleworking, entertainment, and gaming. Although the technology has existed for decades, immersive applications built on it are still in their infancy. As manufacturers improve accessibility by introducing consumer-grade hardware with natural input modalities such as eye gaze or hand tracking, new opportunities arise, but so do problems and challenges. Researchers strive to develop and investigate new techniques for dynamic content creation and novel interaction techniques, yet which interactions users find intuitive has yet to be determined. A major issue is that the possibilities for easy prototyping and rapid testing of new interaction techniques are limited and largely unexplored. In this thesis, different solutions are proposed to improve gesture-based interaction in immersive environments by introducing gesture authoring tools and developing novel applications. Specifically, hand gestures should become more accessible to people outside this specialised domain. First, a survey is presented that explores one of the largest and most promising application scenarios for AR, VR, and MR, namely remote collaboration. Based on the results of this survey, the thesis focuses on several important issues to consider when developing and creating applications. At its core, the thesis is about rapid prototyping based on panorama images and the use of hand gestures for interaction. To this end, a technique for creating immersive applications with panorama-based virtual environments, including hand gestures, is introduced. A framework to rapidly design, prototype, implement, and create arbitrary one-handed gestures is presented. Based on a user study, the potential of the framework as well as the efficacy and usability of hand gestures are investigated. Next, the potential of hand gestures for locomotion tasks in VR is investigated, and it is analysed how laypeople adapt to hand tracking technology in this context. Lastly, the use of hand gestures for grasping virtual objects is explored and compared to state-of-the-art techniques. Throughout the thesis, different input modalities and techniques are compared in terms of usability, effort, accuracy, task completion time, user rating, and naturalness.

Metadata
Author: Alexander Schäfer
URN: urn:nbn:de:hbz:386-kluedo-72723
DOI: https://doi.org/10.26204/KLUEDO/7272
Advisors: Didier Stricker, Achim Ebert
Document type: Dissertation
Language of publication: English
Date of publication (online): 05.05.2023
Year of first publication: 2023
Publishing institution: Rheinland-Pfälzische Technische Universität Kaiserslautern-Landau
Degree-granting institution: Rheinland-Pfälzische Technische Universität Kaiserslautern-Landau
Date of thesis acceptance: 21.04.2023
Date of publication (server): 08.05.2023
Number of pages: VIII, 189
Departments / organisational units: Kaiserslautern - Fachbereich Elektrotechnik und Informationstechnik
DDC classes: 0 General works, computer science, information science / 004 Computer science
Licence: Creative Commons 4.0 - Attribution, NonCommercial, NoDerivatives (CC BY-NC-ND 4.0)