Part-Whole Perception of Audiovideoimages of Multimodal Emotional States of a Person


Abstract

The patterns of part-whole perception of multimodal dynamic emotional states of people unfamiliar to observers are studied. Audio-video clips of fourteen key emotional states expressed by specially trained actors were presented in random order to two groups of observers. In one group (N = 96, mean age 34, SD 9.4 years), each audio-video image was shown in full; in the other (N = 78, mean age 25, SD 9.6 years), it was divided into two parts of equal duration: from the beginning to the notional middle (a short phonetic pause) and from the middle to the end of the exposure. The stimulus material contained facial expressions, gestures, head and eye movements, and changes in the body position of the actors, who voiced pseudolinguistic utterances accompanied by affective intonation. The accuracy of identification and the structure of categorical fields were evaluated depending on the modality and the form (whole/part) of exposure of the affective states. After the exposure of each audio-video image, observers were asked to choose, from the presented list of emotions, the one that best corresponded to what they had seen. According to the data obtained, the accuracy of identifying the emotions in the initial and final fragments of the audio-video images practically coincides but is significantly lower than with full exposure. Functional differences in the perception of fragmented audio-video images of the same emotional states are revealed. The modes of transition from the initial stage to the final one and the conditions affecting the relative speed of the perceptual process are shown. The uneven formation of the information basis of multimodal expressions and the heterochronous perceptogenesis of the actors' emotional states are demonstrated.
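The central comparison reported above, identification accuracy for whole exposures versus initial and final fragments, can be illustrated with a minimal sketch. The snippet below is purely hypothetical: the data layout, the file name responses.csv, the column names, and the use of a chi-square test are assumptions made for illustration and do not reproduce the authors' actual analysis.

```python
# Hypothetical sketch: comparing emotion-identification accuracy between
# whole audio-video exposures and their initial/final fragments.
# Assumed data layout (not from the article): one row per trial with columns
# "condition" in {"whole", "initial", "final"}, "target_emotion", "response".
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("responses.csv")  # hypothetical file name
df["correct"] = df["response"] == df["target_emotion"]

# Identification accuracy for each exposure condition
accuracy = df.groupby("condition")["correct"].mean()
print(accuracy)

# Chi-square test on correct/incorrect counts across the three conditions
counts = pd.crosstab(df["condition"], df["correct"])
chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```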

General Information

Keywords: multimodal dynamic expressions, whole and parts of audio-video images of expressions, expression and identification of emotions, perceptogenesis of multimodal affective states

Journal rubric: Psychology of Perception

Article type: scientific article

DOI: https://doi.org/10.17759/exppsy.2022150401

Funding. The work was supported by the Russian Science Foundation (RSF), project No. 18-18-00350-П, “Perception in the structure of nonverbal communication”.

Received: 15.10.2022

Accepted:

For citation: Barabanschikov V.A., Suvorova E.V. Part-Whole Perception of Audiovideoimages of Multimodal Emotional States of a Person. Eksperimental'naâ psihologiâ = Experimental Psychology (Russia), 2022. Vol. 15, no. 4, pp. 4–21. DOI: 10.17759/exppsy.2022150401. (In Russ., abstr. in Engl.)


Information About the Authors

Vladimir A. Barabanschikov, Doctor of Psychology, Professor, Director, Institute of Experimental Psychology, Moscow State University of Psychology and Education, Dean of the Faculty of Psychology, Moscow Institute of Psychoanalysis, Moscow, Russia, ORCID: https://orcid.org/0000-0002-5084-0513, e-mail: vladimir.barabanschikov@gmail.com

Ekaterina V. Suvorova, Research Associate, Institute of Experimental Psychology, Moscow State University of Psychology and Education, PhD Student, Moscow Institute of Psychoanalysis, Moscow, Russia, ORCID: https://orcid.org/0000-0001-8834-2037, e-mail: esresearch@yandex.ru
