Dalle Nogare, L., & Proverbio, A. M. (2023). Emojis vs. facial expressions: an electrical neuroimaging study on perceptual recognition. Social Neuroscience, 18(1), 46–64. https://doi.org/10.1080/17470919.2023.2203949
Emojis vs. facial expressions: an electrical neuroimaging study on perceptual recognition
Proverbio, Alice Mado
2023
Abstract
The aim of this study was to investigate the neural underpinnings and the time course of emoji recognition by recording event-related potentials in 51 participants engaged in a categorization task involving an emotional word paradigm. Forty-eight happy, sad, surprised, disgusted, fearful, and angry emojis, and as many facial expressions, were used as stimuli. Behavioral data showed that emojis were recognized faster and more accurately (92.7%) than facial expressions displaying the same emotions (87.35%). Participants were better at recognizing happy, disgusted, and sad emojis, and happy and angry faces. Fear was difficult to recognize in both faces and emojis. The N400 response was larger to incongruently primed emojis and faces, while the opposite was observed for the P300 component. However, both N400 and P300 occurred considerably later in response to faces than to emojis. The emoji-related N170 component (150–190 ms) discriminated stimulus affective content, similarly to the face-related N170, but its neural generators did not include the fusiform face area; they instead included the occipital face area (OFA), involved in processing face details, and object-related areas. Both faces and emojis activated the limbic system and the orbitofrontal cortex, supporting anthropomorphization. The schematic nature of emojis might facilitate the classification of their emotional content.