Computational methods for a customised positive mood-supporting system based on multi-sensorial stimuli

Rabaioli, C
2024

Abstract

This project proposes a positive mood-supporting application based on multi-sensorial stimuli. Its first goal is to study the differences, relations, and interactions between emotion and mood, in order to understand how emotion recognition methods and eliciting stimuli can be adapted to the mood domain. This is a fundamental step in building a system that detects the user’s mood and monitors it over time while supporting a positive mood through appropriate multi-sensorial stimuli. Two further intermediate steps are therefore essential: i) building a multimodal mood detection framework that combines wearable devices with short questionnaires; and ii) defining which audiovisual characteristics a multi-sensorial stimulus should exhibit to strengthen its positive mood-support effect, exploiting different learning models. A profiling questionnaire is administered the first time a user accesses the system. Multimodal mood detection then takes place, combining an Ecological Momentary Assessment (EMA) of the current mood with environmental, behavioural, and physiological data derived from smartphones and wearable devices. Once the user’s current mood is determined, the system automatically selects appropriate audiovisual stimuli to improve the mood, if needed. After a mood-support session, the user’s feedback is collected to improve the effectiveness of the system and refine the set of stimuli for that specific user. A case study will be conducted to assess the efficacy of the system in real-world applications. In particular, an automotive scenario will be developed, exploiting virtual reality stimuli during a driving simulation.
Type: poster + paper
Keywords: mood detection, multi-sensorial stimuli, multimodal approach, hand-crafted features
Language: English
Event: AIxIA Doctoral Consortium 2024
Event year: 2024
Editors: Bacciu, D; Donadello, I
Published in: Proceedings of the AIxIA Doctoral Consortium 2024 co-located with the 23rd International Conference of the Italian Association for Artificial Intelligence (AIxIA 2024)
Year: 2024
URL: https://ceur-ws.org/Vol-3914/short81.pdf
Access: open
Rabaioli, C. (2024). Computational methods for a customised positive mood-supporting system based on multi-sensorial stimuli. In Proceedings of the AIxIA Doctoral Consortium 2024 co-located with the 23rd International Conference of the Italian Association for Artificial Intelligence (AIxIA 2024).
Files in this record:
Rabaioli-2024-CEUR-VoR.pdf

Open access

Attachment type: Publisher’s Version (Version of Record, VoR)
License: publisher-specific open access license
Size: 806.6 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/540563