| Name: | Description: | Size: | Format: |
|---|---|---|---|
| | | 8.12 MB | Adobe PDF |
Abstract
The field of affective computing (AffC) has experienced significant growth, making it challenging to stay up to date with the latest advancements. This surge in interest has likely contributed to a marked rise in the number of systematic reviews or surveys (SRoS) being published across various journals, covering topics such as databases, methods, and general perspectives. This paper provides three key contributions: 1) A comprehensive analysis of the evolution of emotion recognition methods from 2002 to 2024, with particular emphasis on emotional body gesture recognition, documenting a clear transition from traditional machine learning to sophisticated deep learning architectures; 2) Identification and detailed analysis of the most impactful papers (the "cream of the crop") that have shaped body-based AffC methods, revealing that modern approaches increasingly use attention mechanisms, graph-based representations for skeletal data, and advanced spatial-temporal modeling techniques; and 3) A systematic categorization and analysis of emotion recognition methods across architectural types (machine learning, deep learning, and hybrid) and modalities (emotional body gesture recognition, facial emotion recognition, multimodal emotion recognition, and speech emotion recognition), demonstrating the field's progression from unimodal to more robust multimodal approaches. Through an analysis of 10 selected SRoS papers published between 2021 and 2024, which collectively reference 292 papers, this study reveals critical challenges, including the limited availability of large-scale body-based emotional databases, the computational demands of modern architectures, and cross-database generalization issues.
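As a concrete illustration of the graph-based skeletal representations and spatial-temporal modeling mentioned in contribution 2, the following is a minimal sketch of an ST-GCN-style block in PyTorch. The class names, layer widths, joint count, and the identity adjacency placeholder are illustrative assumptions and do not correspond to any specific method covered by the survey.

```python
# Minimal sketch (illustrative, not from the surveyed papers) of a
# spatial-temporal graph convolution pipeline for skeleton-based emotion recognition.
import torch
import torch.nn as nn


class SpatialTemporalGraphBlock(nn.Module):
    """One ST-GCN-style block: spatial graph convolution over joints,
    followed by temporal convolution over frames."""

    def __init__(self, in_channels, out_channels, adjacency, temporal_kernel=9):
        super().__init__()
        # Normalized adjacency matrix of the skeleton graph (joints x joints).
        self.register_buffer("A", adjacency)
        self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        pad = (temporal_kernel - 1) // 2
        self.temporal = nn.Conv2d(out_channels, out_channels,
                                  kernel_size=(temporal_kernel, 1),
                                  padding=(pad, 0))
        self.relu = nn.ReLU()

    def forward(self, x):
        # x: (batch, channels, frames, joints)
        x = self.spatial(x)
        # Aggregate features from neighbouring joints via the adjacency matrix.
        x = torch.einsum("nctv,vw->nctw", x, self.A)
        return self.relu(self.temporal(x))


class EmotionClassifier(nn.Module):
    """Stacks graph blocks and pools over frames and joints to predict an emotion label."""

    def __init__(self, adjacency, num_classes=7):
        super().__init__()
        self.block1 = SpatialTemporalGraphBlock(3, 64, adjacency)   # 3 = (x, y, z) per joint
        self.block2 = SpatialTemporalGraphBlock(64, 128, adjacency)
        self.head = nn.Linear(128, num_classes)

    def forward(self, x):
        x = self.block2(self.block1(x))
        x = x.mean(dim=(2, 3))          # global average pool over frames and joints
        return self.head(x)


# Usage on random data: 8 clips, 3 coordinates, 60 frames, 25 joints.
joints = 25
A = torch.eye(joints)                   # placeholder adjacency; a real skeleton graph goes here
model = EmotionClassifier(A, num_classes=7)
clip = torch.randn(8, 3, 60, joints)
logits = model(clip)                    # -> (8, 7) emotion scores
```

In practice, the adjacency matrix would encode the actual bone connectivity of the chosen skeleton format, and the number of output classes would match the emotion labels of the target database.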
Keywords
Affective computing; Body-based emotion recognition
Publisher
Institute of Electrical and Electronics Engineers
