Identification of whole and partial facial expressions of human multimodal emotional states

 

Abstract

Context and relevance. Within the cognitive-communicative approach, the mechanisms by which multimodal expressions of human affective states are perceived are studied. The article examines how bimodal emotional states are perceived when the face is fully or only partially visible. Objective: a comparative analysis of how accurately bimodal (face + voice) expressions of emotion are recognized under conditions of a fully and a partially open face. Methods and materials. The experiment is based on the Russian version of the Geneva Emotion Recognition Test with added facial occlusions. It comprises three series with 40 participants each, aged 18—52. Participants were sequentially shown, on a monitor screen, short (3—5 s) audio-video clips of key affective states expressed by specially trained professional actors. In the first series the actors' faces were fully visible; in the second and third, the upper or lower half of the face was covered with a rectangular mask. Participants identified the emotional state of each actor by touching the corresponding button on the interactive scale of the Geneva Emotion Wheel. Results. New data were obtained on the accuracy of recognizing bimodal expressions under full and partial facial exposure, on the relationship between the two, and on their dependence on the content of the emotion and the location of the occlusion. The study shows that for the perception of bimodal expressions the popular maxim "the whole is greater than the part" has limited validity, since it ignores the specifics of each unimodal expression and their interrelations. The content of the emotion and the occluded facial area play an essential role.
Four types of multimodal part—whole relations were identified, covering different groups of affective categories: "the whole is greater than each of the parts" (36% of states), "the whole is equal to each of the parts" (21%), "the whole is greater than one of the parts" (14%), and "the whole is less than one of the parts" (28%). In 57% of cases, identification accuracy for expressions of the upper and lower halves of the same face is indistinguishable; where the values diverge, expressions of the upper half are perceived more accurately. Conclusions. Under multimodal conditions, each half of the face has its own expressive resources, which are closely linked to the potential of the face as a whole. The "paradoxical" effects observed when perceiving expressions of the whole face and its parts arise from the cross-modal integration of the facial and prosodic components of bimodal states.

General Information

Keywords: interpersonal perception, multimodality, bimodal expressions of affective states, vocal and facial expressions, identification of expressions of a whole and partially open face, cross-modal integration

Journal rubric: Psychology of Perception

Article type: scientific article

DOI: https://doi.org/10.17759/exppsy.2026190101

Funding. The work was supported by the Russian Science Foundation (RSF), project No. 24-18-00904, "Mechanisms of perception of the emotional state of a person in the processes of non-verbal communication".

Received 04.03.2026

Revised 10.03.2026

Accepted

Published

For citation: Barabanschikov, V.A., Suvorova, E.V., Borisova, A.S. (2026). Identification of whole and partial facial expressions of human multimodal emotional states. Experimental Psychology (Russia), 19(1), 4–21. (In Russ.). https://doi.org/10.17759/exppsy.2026190101

© Barabanschikov V.A., Suvorova E.V., Borisova A.S., 2026

License: CC BY-NC 4.0


Information About the Authors

Vladimir A. Barabanschikov, Doctor of Psychology, Professor, Director, Institute of Experimental Psychology, Moscow State University of Psychology and Education, Professor, Moscow Institute of Psychoanalysis, Moscow, Russian Federation, ORCID: https://orcid.org/0000-0002-5084-0513, e-mail: vladimir.barabanschikov@gmail.com

Ekaterina V. Suvorova, Research Associate, Institute of Experimental Psychology, Moscow State University of Psychology and Education, PhD Student, Moscow Institute of Psychoanalysis, Moscow, Russian Federation, ORCID: https://orcid.org/0000-0001-8834-2037, e-mail: esresearch@yandex.ru

Anna S. Borisova, Master in Psychology, Institute of Experimental Psychology, Moscow State University of Psychology and Education, Moscow, Russian Federation, ORCID: https://orcid.org/0009-0000-8415-8582, e-mail: boorisova.anna@gmail.com

Contribution of the authors

Vladimir A. Barabanschikov — idea, concept, substantive analysis and interpretation of the results.

Ekaterina V. Suvorova — literature review; adaptation of the stimulus material to the experimental requirements; planning and supervision of the research; visualization of the results.

Anna S. Borisova — recruitment of participants; technical implementation; running the experiment; data collection and analysis; statistical data processing.

All authors participated in the discussion of the results and approved the final text of the manuscript.

Conflict of interest

The authors declare no conflict of interest.

Ethics statement

The study was reviewed and approved by the Ethical Commission of the Institute of Experimental Psychology of the Moscow State University of Psychology and Education (report no. 11, 2025/12/05).
