Project Details

Multimodal recognition of affect over the course of a tutorial learning experiment

Subject Area Human Factors, Ergonomics, Human-Machine Systems
Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term from 2018 to 2022
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 394413900
 
Tutorial systems in a learning environment should support users in reaching their specific learning goals and respond to their level of knowledge and individual abilities in a pedagogically sound manner. This requires a comprehensive user model that takes into account the user's prior knowledge and interaction history as well as their current affective state. Our research project focuses on an interactive learning task in which the user is supported by a tutorial system component. Neurophysiological data (fMRI, EEG) and psychophysiological data (ECG, skin conductance, respiration), as well as behavioral data (keystroke dynamics, facial expressions), are recorded synchronously in order to determine changing affective and cognitive user states through multimodal data analysis. The primary goals of the project are the interdisciplinary analysis and interpretation of the interplay between the partially weak signals of the individual modalities, and the development and optimization of classifiers for affect recognition.
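One common way to combine partially weak signals from several modalities is decision-level (late) fusion, where each modality's classifier emits class probabilities that are then merged. The following is a minimal illustrative sketch only, not the project's actual method; the affect labels, modality names, scores, and weights are hypothetical.

```python
# Hypothetical sketch of late (decision-level) fusion for multimodal
# affect recognition. All labels, scores, and weights are illustrative.

AFFECT_STATES = ["engaged", "confused", "frustrated", "bored"]

def fuse_scores(per_modality_scores, weights=None):
    """Combine per-modality class probabilities by weighted averaging.

    per_modality_scores: dict mapping modality name -> dict of
        state -> probability (each modality's scores sum to 1).
    weights: optional dict modality -> weight; defaults to uniform.
    Returns a dict mapping state -> fused probability.
    """
    if weights is None:
        weights = {m: 1.0 for m in per_modality_scores}
    total = sum(weights[m] for m in per_modality_scores)
    fused = {s: 0.0 for s in AFFECT_STATES}
    for modality, scores in per_modality_scores.items():
        w = weights[modality] / total
        for state in AFFECT_STATES:
            fused[state] += w * scores.get(state, 0.0)
    return fused

# Example: weak, partially conflicting evidence from three modalities.
scores = {
    "eeg":        {"engaged": 0.4, "confused": 0.3, "frustrated": 0.2, "bored": 0.1},
    "ecg":        {"engaged": 0.3, "confused": 0.4, "frustrated": 0.2, "bored": 0.1},
    "keystrokes": {"engaged": 0.2, "confused": 0.5, "frustrated": 0.2, "bored": 0.1},
}
fused = fuse_scores(scores)
best_state = max(fused, key=fused.get)  # -> "confused"
```

Late fusion is only one option; feature-level (early) fusion, which concatenates per-modality features before classification, is the usual alternative when the modalities are sampled at compatible rates.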
DFG Programme Research Grants
 
 
