Experimental Psychology (Russia)
2025. Vol. 18, no. 4, 64–79
doi:10.17759/exppsy.2025180404
ISSN: 2072-7593 / 2311-7036 (online)
Perception of emotional state of a person based on the transformed video recordings of the body
Abstract
Context and relevance. The perception of emotional states from human body expressions involves the use of both shape and motion (kinematic) information. Despite a significant number of studies, it remains unclear what minimum amount of motion and shape information is sufficient for emotion recognition, and how motion and shape are related in the perception of different emotion categories. Objective: to determine the extent to which excluding shape and motion information hinders the recognition of emotional body expressions. Hypothesis. Compared to full-color video clips, the accuracy of emotion recognition from dynamic dot patterns will decrease; a further reduction in the number of frames will lead to an even greater decrease in accuracy. Methods and materials. The study involved 233 participants (209 women and 24 men, aged 22 to 64 years, median 37 years). The stimuli consisted of dynamic black dots on a white background perceived as human body movements expressing happiness, fear, disgust, excitement, disappointment, boredom, and a neutral state. The stimuli were shown to different groups of participants with either smooth (23 frames/s) or stroboscopic (2 frames/s) motion. Participants determined the emotional state of the posers. The obtained data were compared with the results of a previous study that used full-color video fragments. Results. Stroboscopic presentation reduces the recognition accuracy of all expressions except happiness and the neutral state and also changes the structure of the categorical fields of emotion perception. The main contribution to the perception of happiness is made by the shape and texture of the human image, as well as the context of the situation. For expressions of disgust and disappointment, the role of dynamic information increases. The assessment of fear depends on both static and dynamic features and is based primarily on body rather than facial expressions. The perception of boredom and excitement is primarily associated with kinematic patterns. For recognizing a neutral state, the key role is played not by static or dynamic features but by information about facial microexpressions. Conclusions. We identified differentiated patterns of the contribution of shape and motion information to the perception of emotional body expressions that depend on the emotion category.
General Information
Keywords: emotion perception, biological motion, body expression, emotional states, non-verbal behavior
Journal rubric: Face Science
Article type: scientific article
DOI: https://doi.org/10.17759/exppsy.2025180404
Funding. The research was supported by the Russian Science Foundation (project No. 24-18-00904 “Mechanisms of person’s emotional state perception in the processes of nonverbal communication” https://rscf.ru/project/24-18-00904/).
Acknowledgements. The author is grateful to E.G. Hoze for assistance in data collection.
Received 10.11.2025
Revised 20.11.2025
For citation: Korolkova, O.A. (2025). Perception of emotional state of a person based on the transformed video recordings of the body. Experimental Psychology (Russia), 18(4), 64–79. (In Russ.). https://doi.org/10.17759/exppsy.2025180404
© Korolkova O.A., 2025
License: CC BY-NC 4.0
References
- Korolkova, O.A. (2024). Perception of Emotional State of a Communicant Based on Information about His Facial and Body Expressions. Experimental Psychology (Russia), 17(4), 28—43. (In Russ.). https://doi.org/10.17759/exppsy.2024170402
- Starostina, E.G., Taylor, G., Quilty, L., Bobrov, A., Moshnyaga, E., Puzyreva, N., Bobrova, M., Ivashkina, M., Krivchikova, M., Shavrikova, E., Bagby, M. (2010). Toronto Alexithymia Scale (20 items): Validation of the Russian version on a sample of medical patients. Social and Clinical Psychiatry, 20(4), 31—38. (In Russ.).
- Ahmed, F., Bari, A.S.M.H., Gavrilova, M.L. (2020). Emotion recognition from body movement. IEEE Access, 8, 11761—11781. https://doi.org/10.1109/ACCESS.2019.2963113
- Alaerts, K., Nackaerts, E., Meyns, P., Swinnen, S. P., Wenderoth, N. (2011). Action and emotion recognition from point light displays: An investigation of gender differences. PLoS ONE, 6(6), 1—9. https://doi.org/10.1371/journal.pone.0020989
- Atkinson, A.P., Dittrich, W.H., Gemmell, A.J., Young, A.W. (2004). Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception, 33(6), 717—746. https://doi.org/10.1068/p5096
- Atkinson, A.P., Tunstall, M.L., Dittrich, W.H. (2007). Evidence for distinct contributions of form and motion information to the recognition of emotions from body gestures. Cognition, 104(1), 59—72. https://doi.org/10.1016/j.cognition.2006.05.005
- Baker, C.L.J., Braddick, O.J. (1985). Temporal properties of the short-range process in apparent motion. Perception, 14, 181—192. https://doi.org/10.1068/p140181
- Bazarevsky, V., Grishchenko, I., Raveendran, K., Zhu, T.L., Zhang, F., Grundmann, M. (2020). BlazePose: On-device Real-time Body Pose tracking. ArXiv, abs/2006.1, 1—4.
- Beintema, J.A., Georg, K., Lappe, M. (2006). Perception of biological motion from limited-lifetime stimuli. Perception & Psychophysics, 68(4), 613—624.
- Dael, N., Goudbeek, M., Scherer, K.R. (2013). Perceived gesture dynamics in nonverbal expression of emotion. Perception, 42(6), 642—657. https://doi.org/10.1068/p7364
- De Gelder, B., Poyo Solanas, M. (2022). What postures communicate. Enfance, 3(3), 353—365. https://doi.org/10.3917/enf2.223.0353
- Fourati, N., Pelachaud, C. (2018). Perception of Emotions and Body Movement in the Emilya Database. IEEE Transactions on Affective Computing, 9(1), 90—101. https://doi.org/10.1109/TAFFC.2016.2591039
- Gärdenfors, P., Warglien, M. (2012). Using Conceptual Spaces to Model Actions and Events. Journal of Semantics, 29(4), 487—519. https://doi.org/10.1093/jos/ffs007
- Giese, M.A., Poggio, T. (2003). Neural mechanisms for the recognition of biological movements. Nature Reviews Neuroscience, 4(3), 179—192. https://doi.org/10.1038/nrn1057
- Ikeda, E., Destler, N., Feldman, J. (2025). The role of dynamic shape cues in the recognition of emotion from naturalistic body motion. Attention, Perception, and Psychophysics, 87(2), 604—618. https://doi.org/10.3758/s13414-024-02990-8
- Bidet-Ildei, C., Benahmed, O., Bouidaine, D., Francisco, V., Decatoire, A. (2024). SmartDetector: Automatic and vision-based approach to point-light display generation for human action perception. Behavior Research Methods, 56, 8349—8361.
- Johansson, G. (1973). Visual perception of biological motion and a model for its analysis. Perception & Psychophysics, 14(2), 201—211. https://doi.org/10.3758/BF03212378
- Mather, G., Radford, K., West, S. (1992). Low-Level Visual Processing of Biological Motion. Proceedings of the Royal Society B: Biological Sciences, 249, 149—155. https://doi.org/10.1098/rspb.1992.0097
- McDonnell, R., Newell, F., O’Sullivan, C. (2007). Smooth Movers: Perceptually Guided Human Motion Simulation. In: D. Metaxas & J. Popovic (Eds.), SCA ’07: Proceedings of the 2007 ACM SIGGRAPH/Eurographics symposium on Computer animation (pp. 259—269).
- Melzer, A., Shafir, T., Tsachor, R.P. (2019). How Do We Recognize Emotion From Movement? Specific Motor Components Contribute to the Recognition of Each Emotion. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.01389
- O’Reilly, H., Pigat, D., Fridenson, S., Berggren, S., Tal, S., Golan, O., Bölte, S., Baron-Cohen, S., Lundqvist, D. (2016). The EU-Emotion Stimulus Set: A validation study. Behavior Research Methods, 48(2), 567—576. https://doi.org/10.3758/s13428-015-0601-4
- Presti, P., Ruzzon, D., Galasso, G.M., Avanzini, P., Caruana, F., Vecchiato, G. (2022). The Avatar’s Gist: How to Transfer Affective Components From Dynamic Walking to Static Body Postures. Frontiers in Neuroscience, 16, 1—14. https://doi.org/10.3389/fnins.2022.842433
- R Core Team (2024). R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. URL: https://www.r-project.org/
- Runeson, S., Frykholm, G. (1983). Kinematic specification of dynamics as an informational basis for person-and-action perception: Expectation, gender recognition, and deceptive intention. Journal of Experimental Psychology: General, 112(4), 585—615. https://doi.org/10.1037/0096-3445.112.4.585
- Shafir, T., Tsachor, R.P., Welch, K.B. (2016). Emotion regulation through movement: Unique Sets of Movement Characteristics are Associated with and Enhance Basic Emotions. Frontiers in Psychology, 6, 1—15. https://doi.org/10.3389/fpsyg.2015.02030
- Smekal, V., Poyo Solanas, M., Fraats, E.I.C., de Gelder, B. (2024). Differential contributions of body form, motion, and temporal information to subjective action understanding in naturalistic stimuli. Frontiers in Integrative Neuroscience, 18, 1—12. https://doi.org/10.3389/fnint.2024.1302960
- Vangeneugden, J., Peelen, M.V., Tadin, D., Battelli, L. (2014). Distinct Neural Mechanisms for Body Form and Body Motion Discriminations. The Journal of Neuroscience, 34(2), 574—585. https://doi.org/10.1523/JNEUROSCI.4032-13.2014
- Venture, G., Kadone, H., Zhang, T., Grèzes, J., Berthoz, A. (2014). Recognizing Emotions Conveyed by Human Gait. International Journal of Social Robotics, 6, 621—632. https://doi.org/10.1007/s12369-014-0243-1
- Vu, H.A., Yamazaki, Y., Dong, F., Hirota, K. (2012). Emotion Recognition based on Human Gesture and Speech Information using RT Middleware. In: IEEE International Conference on Fuzzy Systems (pp. 787—791).
Conflict of interest
The author declares no conflict of interest.
Ethics statement
The study was reviewed and approved by the Ethics Committee of Moscow State University of Psychology and Education (report no. 2, 2025/04/25).