Perception of bimodal expressions of human emotional states: mechanisms of integration

 
Abstract

The study examines patterns in the perception of unimodal and multimodal expressions of human affective states. Its objective is to identify functional mechanisms of cross-modal integration of the vocal and facial components of bimodal (voice + face) expressions. According to the hypothesis, the type of cross-modal relation in the perception of human affective states is determined by the content (category) of the target emotion. The perception of 14 unimodal and bimodal expressions of the same emotional states, portrayed by actor-sitters, was compared. The research was conducted on the platform of the Geneva Emotion Recognition Test (GERT). The experiment consisted of three series: in the first, observers were shown short audio-video recordings of the sitters' affective states; in the second, the same videos without sound; in the third, vocal intonation without a video image. An independent group of participants (72 women aged 22—27 ± 5.6) took part in each series; their task was to identify the sitter's emotional state. Identification accuracy and the dynamics of the structure of categorical fields were analyzed. Five functional mechanisms of cross-modal integration were identified, along with their correspondence to the content of the emotions, their valence and degree of arousal, and the structure of the categorical field. The phenomenon of coherence between unimodal expressions of the same affective content is described, and the specific organization of bimodal categorical fields is revealed. The results indicate that cross-modal integration of unimodal states is not limited to the direct interaction of the sensory systems involved (visual and acoustic); what matters is the subject (categorical) content and the logic of combining diverse impressions of a person's state.

General Information

Keywords: interpersonal perception, multimodality, bimodal expressions of affective states, vocal expressions, facial expressions, accuracy of expression and identification of emotions, structure of categorical fields, cross-modal integration

Journal rubric: Psychology of Perception

Article type: scientific article

DOI: https://doi.org/10.17759/exppsy.2025180201

Funding. The work was supported by the Russian Science Foundation (RSF), project No. 24-18-00904, “Mechanisms of perception of the emotional state of a person in the processes of non-verbal communication”.

Acknowledgements. The author is grateful to E.V. Suvorova and A.V. Malionok for their assistance in data collection.

Received 05.05.2025


For citation: Barabanschikov, V.A. (2025). Perception of bimodal expressions of human emotional states: mechanisms of integration. Experimental Psychology (Russia), 18(2), 7–33. (In Russ.). https://doi.org/10.17759/exppsy.2025180201

© Barabanschikov V.A., 2025

License: CC BY-NC 4.0

References

  1. Андреева, Е.В. (2013). Феномен кроссмодальных взаимодействий: современное состояние проблемы в зарубежной психологии. Российский научный журнал, 1, 219—225.
    Andreeva, E.V. (2013). The phenomenon of cross-modal interactions: Current state of the problem in foreign psychology. Russian Scientific Journal, 1, 219—225. (In Russ.).
  2. Барабанщиков, В.А. (2009). Восприятие выражений лица. М.: Институт психологии РАН.
    Barabanshchikov, V.A. (2009). Perception of facial expressions. Moscow: Institute of Psychology, RAS. (In Russ.).
  3. Барабанщиков, В.А. (2012a). Психологические механизмы восприятия выражений лица. В: В.А. Барабанщиков, А.А. Демидов, Д.А. Дивеев (Ред.), Лицо человека как средство общения (с. 13—31). М.: Когито-Центр.
    Barabanshchikov, V.A. (2012a). Psychological mechanisms of facial expression perception. In: V.A. Barabanshchikov, A.A. Demidov, D.A. Diveev (Eds.), The human face as a means of communication (pp. 13—31). Moscow: Cogito-Center. (In Russ.).
  4. Барабанщиков, В.А. (2012б). Экспрессии лица и их восприятие. М.: Институт психологии РАН.
    Barabanshchikov, V.A. (2012b). Facial expressions and their perception. Moscow: Institute of Psychology, RAS. (In Russ.).
  5. Барабанщиков, В.А. (2016). Динамика восприятия выражений лица. М.: Когито-Центр.
    Barabanshchikov, V.A. (2016). Dynamics of facial expression perception. Moscow: Cogito-Center. (In Russ.).
  6. Барабанщиков, В.А., Королькова, О.А. (2020). Восприятие экспрессий «живого» лица. Экспериментальная психология, 13(3), 55—73.
    Barabanshchikov, V.A., Korolkova, O.A. (2020). Perception of expressions of the “live” face. Experimental Psychology (Russia), 13(3), 55—73. (In Russ.).
  7. Барабанщиков, В.А., Суворова, Е.В. (2020). Оценка эмоционального состояния человека по его видеоизображению. Экспериментальная психология, 13(4), 4—24. https://doi.org/10.17759/exppsy.2020130401
    Barabanshchikov, V.A., Suvorova, E.V. (2020). Human Emotional State Assessment Based on a Video Portrayal. Experimental Psychology (Russia), 13(4), 4—24. (In Russ.). https://doi.org/10.17759/exppsy.2020130401
  8. Барабанщиков, В.А., Суворова, Е.В. (2021). Оценка мультимодальных экспрессий лица в лаборатории и онлайн. В: К.И. Ананьева, В.А. Барабанщиков, А.А. Демидов (Ред.), Лицо человека в контекстах природы, технологий и культуры (с. 310—322). М.: Когито-Центр.
    Barabanshchikov, V.A., Suvorova, E.V. (2021). Assessment of multimodal facial expressions in the laboratory and online. In: K.I. Ananyeva, V.A. Barabanshchikov, A.A. Demidov (Eds.), The Human Face in Contexts of Nature, Technology, and Culture (pp. 310—322). Moscow: Cogito-Center. (In Russ.).
  9. Барабанщиков, В.А., Суворова, Е.В. (2022). Гендерный фактор в распознавании эмоционального состояния человека по его аудио-видеоизображениям. Российский психологический журнал, 19(2), 6—20. https://doi.org/10.17759/rpj.2022190201
    Barabanshchikov, V.A., Suvorova, E.V. (2022). The gender factor in recognizing a person's emotional state from their audio and video images. Russian Psychological Journal, 19(2), 6—20. (In Russ.). https://doi.org/10.17759/rpj.2022190201
  10. Барабанщиков, В.А., Суворова, Е.В. (2022). Индивидуальные формы выражения и идентификация мультимодальных динамических состояний человека. Познание и переживание, 3(2), 6—35. https://doi.org/10.17759/kogito.202203220
    Barabanshchikov, V.A., Suvorova, E.V. (2022). Individual forms of expression and identification of multimodal dynamic human states. Cognition and Experience, 3(2), 6—35. (In Russ.). https://doi.org/10.17759/kogito.202203220
  11. Барабанщиков, В.А., Суворова, Е.В. (2023). Выражение и восприятие мультимодальных эмоциональных состояний. Национальный психологический журнал, 17(3), 106—127. https://doi.org/10.11621/npj.2023.0311
    Barabanshchikov, V.A., Suvorova, E.V. (2023). Expression and perception of multimodal emotional states. National Psychological Journal, 17(3), 106—127. (In Russ.). https://doi.org/10.11621/npj.2023.0311
  12. Барабанщиков, В.А., Суворова, Е.В. (2024). Восприятие подвижного лица как образующей мультимодальных аффективных состояний. Экспериментальная психология, 17(4), 4—27. https://doi.org/10.17759/exppsy.2024170401
    Barabanschikov, V.A., Suvorova, E.V. (2024). Vivid Face Perception as a Constructive Component of Multimodal Affective States. Experimental Psychology (Russia), 17(4), 4—27. (In Russ.). https://doi.org/10.17759/exppsy.2024170401
  13. Барабанщиков, В.А., Суворова, Е.В., Малионок, А.В. (2024). Восприятие просодической образующей мультимодальных аффективных состояний. Экспериментальная психология, 17(3), 30—51. https://doi.org/10.17759/exppsy.2024170303
    Barabanschikov, V.A., Suvorova, E.V., Malionok, A.V. (2024). Perception of the Prosodic Formative of Multimodal Affective States. Experimental Psychology (Russia), 17(3), 30—51. (In Russ.). https://doi.org/10.17759/exppsy.2024170303
  14. Baart, M., Vroomen, J. (2018). Recalibration of vocal affect by a dynamic face. Experimental Brain Research, 236(7), 1911—1918.
  15. Bänziger, T., Mortillaro, M., Scherer, K.R. (2012). Introducing the Geneva Multimodal expression corpus for experimental research on emotion perception. Emotion, 12(5), 1161. https://doi.org/10.1037/a0025827
  16. Dael, N., Goudbeek, M., Scherer, K.R. (2013). Perceived gesture dynamics in nonverbal expression of emotion. Perception, 42(6), 642—657. https://doi.org/10.1068/p7364
  17. Dael, N., Mortillaro, M., Scherer, K.R. (2012). Emotion expression in body action and posture. Emotion, 12(5), 1085. https://doi.org/10.1037/a0025737
  18. Dolan, R.J., Morris, J.S., de Gelder, B. (2001). Crossmodal binding of fear in voice and face. Proceedings of the National Academy of Sciences, 98(17), 9465—9470. https://doi.org/10.1073/pnas.171288598
  19. Gelder, B. de, Vroomen, J. (2000). The Perception of Emotions by Ear and by Eye. Cognition and Emotion, 14(3), 289—311. https://doi.org/10.1080/026999300378824
  20. Massaro, D.W., Egan, P.B. (1996). Perceiving affect from the voice and the face. Psychonomic Bulletin & Review, 3(2), 215—221. https://doi.org/10.3758/BF03212421
  21. Mehu, M., Mortillaro, M., Bänziger, T., Scherer, K.R. (2012). Reliable facial muscle activation enhances recognizability and credibility of emotional expression. Emotion, 12(4), 701—715. https://doi.org/10.1037/a0026717
  22. Mortillaro, M., Mehu, M., Scherer, K.R. (2011). Subtly different positive emotions can be distinguished by their facial expressions. Social Psychological and Personality Science, 2(3), 262—271.
  23. Scherer, K.R. (1986). Vocal affect expression: a review and a model for future research. Psychological bulletin, 99(2), 143—165. https://doi.org/10.1037/0033-2909.99.2.143
  24. Scherer, K.R. (2005). What are emotions? And how can they be measured? Social science information, 44(4), 695—729. https://doi.org/10.1177/0539018405058216
  25. Scherer, K.R., Scherer, U. (2011). Assessing the ability to recognize facial and vocal expressions of emotion: Construction and validation of the Emotion Recognition Index. Journal of Nonverbal Behavior, 35, 305—326.
  26. Schirmer, A., Adolphs, R. (2017). Emotion perception from face, voice, and touch: comparisons and convergence. Trends in cognitive sciences, 21(3), 216—228.
  27. Schlegel, K., Fontaine, J.R., Scherer, K.R. (2017). The nomological network of emotion recognition ability. European Journal of Psychological Assessment, 35(3), 352—363.
  28. Schlegel, K., Grandjean, D., Scherer, K.R. (2012). Emotion recognition: Unidimensional ability or a set of modality-and emotion-specific skills? Personality and Individual Differences, 53(1), 16—21.
  29. Schlegel, K., Scherer, K.R. (2018). The nomological network of emotion knowledge and emotion understanding in adults: evidence from two new performance-based tests. Cognition and Emotion, 32(8), 1514—1530. https://doi.org/10.1080/02699931.2017.1414687

Information About the Authors

Vladimir A. Barabanschikov, Doctor of Psychology, Professor, Director, Institute of Experimental Psychology, Moscow State University of Psychology and Education, Professor, Moscow Institute of Psychoanalysis, Moscow, Russian Federation, ORCID: https://orcid.org/0000-0002-5084-0513, e-mail: vladimir.barabanschikov@gmail.com
