Unimodal and Multimodal Sensor Fusion for Wearable Activity Recognition

  • This dissertation is inspired by the "Expressiveness of Human Body Movements." People naturally synchronize hand movements, body movements, and facial expressions into a cohesive nonverbal message. Understanding this communication requires measuring and quantifying it in natural settings. The focus of this work is therefore on designing versatile wearable solutions that take the situational context of body actions into account. The dissertation introduces a set of measurement tools to the wearable community, helping to expand the understanding of body movement expressiveness, and it designs experimental scenarios around typical gestures associated with body language. It is important to note that the evaluations in this thesis primarily assess hardware capabilities and do not aim to evoke genuine emotions in participants. The work also proposes a variety of multipositional and multimodal wearable prototypes, on the premise that different sensor positions and multiple sensing modalities help to form a unified perception and understanding of complex situations. Another relevant aspect is recognizing human behavior and activities pervasively. Wearable devices are the most promising option for ubiquitous human activity recognition (HAR). Creating wearable-based HAR solutions that are both small and widely accepted by users presents a significant challenge and requires a multidisciplinary approach, combining expertise in sensor technologies, signal processing, data fusion algorithms, and domain-specific knowledge. One way to gain user acceptance is to deploy the HW/SW systems in the most common wearable accessories on the market. Hence, the designs presented here are based on wristbands, goggles, headwear (helmet and sports cap), and clothing (jacket and gloves).
This work focuses on HW/SW co-design systems for HAR in the context of "Hand Position and Gesture Estimation," "Head and Facial Muscle Movements Recognition," and "Body Postures and Gesture Classification." Considering these three scenarios, the thesis explores customized smart-wearable design with application-specific goals. The decision criteria in this work are based on two factors: how relevant the scenario is to understanding human behavior and how innovative the sensing technology is within the wearable community. Overall, the designs have been tested in various experimental settings, with evaluation based on mimicked gestures. The experiments were designed to test the feasibility of the proposed hardware for solving specific scenarios. The main goal is to provide tools that can be used in the future to understand the "Expressiveness of Human Body Movements" in a ubiquitous way. Nonetheless, the experiments should be extended to include the emotional element of expressions; this is beyond the scope of this dissertation, which focuses on mimicked experiments.

  • Note: Prof. Dr. Karsten Berns is not a reviewer but the chair of the doctoral committee

Metadata
Author: Hymalai Bello
URN: urn:nbn:de:hbz:386-kluedo-85850
DOI: https://doi.org/10.26204/KLUEDO/8585
Advisor: Paul Lukowicz, Kristof Van Laerhoven
Document Type: Doctoral Thesis
Language of publication: English
Date of Publication (online): 2024/12/28
Year of first Publication: 2024
Publishing Institution: Rheinland-Pfälzische Technische Universität Kaiserslautern-Landau
Granting Institution: Rheinland-Pfälzische Technische Universität Kaiserslautern-Landau
Acceptance Date of the Thesis: 2024/12/18
Date of the Publication (Server): 2025/01/02
Page Number: XIII, 131
Faculties / Organisational entities: Kaiserslautern - Fachbereich Informatik
CCS-Classification (computer science): B. Hardware
DDC-Classification: 0 General works, computer science, information science / 004 Computer science
Licence: Creative Commons 4.0 - Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND 4.0)