Anolli, L., Mantovani, F., Mortillaro, M., Vescovo, A., Agliati, A., Confalonieri, L., et al. (2005). A multimodal database as a background for emotional synthesis, recognition and training in e-learning systems. In J. Tao, T. Tan, & R. W. Picard (Eds.), Affective Computing and Intelligent Interaction: First International Conference, ACII 2005, Beijing, China, October 22-24, 2005, Proceedings (pp. 566-573). Springer. https://doi.org/10.1007/11573548_73
A multimodal database as a background for emotional synthesis, recognition and training in e-learning systems
Anolli, Luigi Maria; Mantovani, Fabrizia; Agliati, Alessia; Realdon, Olivia; Zurloni, Valentino
2005
Abstract
This paper presents a multimodal database developed within the EU-funded project MYSELF. The project aims at developing an e-learning platform endowed with affective computing capabilities for the training of relational skills through interactive simulations. The database includes data from 34 participants concerning physiological parameters, vocal nonverbal features, facial expression and posture. Ten different emotions were considered (anger, joy, sadness, fear, contempt, shame, guilt, pride, frustration and boredom), ranging from primary to self-conscious emotions of particular relevance in learning processes and interpersonal relationships. Preliminary results and analyses are presented, together with directions for future work.