“Oddball” Electroencephalogram/Evoked Potential Paradigm for Identifying a Person’s Psycho-Emotional State

Abstract

Assessment of evoked potentials (EPs) using electroencephalography (EEG) is a classic method for determining a person's response to different types of stimuli. The literature describes EPs that serve as specific markers of emotional perception. To date, many stimulus databases have been developed and validated for use in EEG/EP paradigms, among which images of human faces with emotional expressions stand out. Perception of this type of stimulus may have its own specificity: for example, it may occur faster than the perception of other classes of images, since a face is a more biologically significant signal. In this review we examine the features of using affective images in the oddball paradigm, focusing on human faces with emotional expressions. The oddball paradigm belongs to the family of EEG/EP paradigms but has several distinctive features. Its advantages are, first, a higher sensitivity compared with other paradigms involving the presentation of emotional images and, second, the possibility of analyzing, during passive presentation of visual stimuli, the rapid automatic reactions that, according to previous studies, accompany the perception of faces. Facial expressions may therefore prove to be the most effective images in the oddball EEG/EP paradigm. Data obtained with this paradigm are presented; they show differences in both the amplitude and the spatial components of the EPs associated with different facial expressions (happy versus angry).
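For illustration only (this is not the authors' experimental protocol), the sketch below shows how a trial sequence for a passive visual oddball design of this kind might be generated: frequent "standard" stimuli (for example, neutral faces) interleaved with rare "deviant" stimuli (for example, happy or angry faces). The 400-trial length, the 20% deviant probability, and the no-consecutive-deviants constraint are assumptions chosen for the example.

```python
import random


def make_oddball_sequence(n_trials=400, p_deviant=0.2, seed=1):
    """Generate a pseudo-random oddball trial sequence.

    'standard' marks the frequent stimulus (e.g. a neutral face);
    'deviant' marks the rare emotional expression (e.g. happy or angry).
    Two deviants are never shown in a row, so every deviant is preceded
    by at least one standard (an assumed, but common, design constraint).
    """
    rng = random.Random(seed)
    sequence = []
    for _ in range(n_trials):
        after_standard = not sequence or sequence[-1] == "standard"
        is_deviant = after_standard and rng.random() < p_deviant
        sequence.append("deviant" if is_deviant else "standard")
    return sequence


if __name__ == "__main__":
    trials = make_oddball_sequence()
    print(trials.count("deviant"), "deviants out of", len(trials), "trials")
```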

General Information

Keywords: electroencephalogram, evoked potentials, emotions, oddball paradigm

Journal rubric: Neurosciences and Cognitive Studies

Article type: scientific article

DOI: https://doi.org/10.17759/jmfp.2024130201

Funding. The article was prepared within the framework of the HSE University project “Mirror Laboratories”.

Received: 02.05.2024

Accepted:

For citation: Blagovechtchenski E.D., Pomelova E.D., Popyvanova A.V., Koriakina M.M., Lukov M.Yu., Bartseva K.V. “Oddball” Electroencephalogram/Evoked Potential Paradigm for Identifying a Person’s Psycho-Emotional State [Elektronnyi resurs]. Sovremennaia zarubezhnaia psikhologiia = Journal of Modern Foreign Psychology, 2024. Vol. 13, no. 2, pp. 10–21. DOI: 10.17759/jmfp.2024130201. (In Russ., abstr. in Engl.)

Information About the Authors

Evgenii D. Blagovechtchenski, PhD in Biology, Senior Researcher, Centre for Cognition and Decision Making, National Research University Higher School of Economics, Moscow, Russia, ORCID: https://orcid.org/0000-0002-0955-6633, e-mail: eblagovechensky@hse.ru

Ekaterina D. Pomelova, PhD Student, Research Assistant, Centre for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow, Russia, ORCID: https://orcid.org/0000-0003-0420-0221, e-mail: epomelova@hse.ru

Alena V. Popyvanova, PhD Student, Research Assistant, Centre for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Russia, ORCID: https://orcid.org/0000-0002-4413-9421, e-mail: apopyvanova@hse.ru

Maria M. Koriakina, Junior Research Fellow, Centre for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Russia, ORCID: https://orcid.org/0000-0001-6737-550X, e-mail: mkoriakina@hse.ru

Mikhail Y. Lukov, Senior Researcher, Yaroslav-the-Wise Novgorod State University, Veliky Novgorod, Russia, ORCID: https://orcid.org/0009-0002-5430-2170, e-mail: lukov.mi@yandex.ru

Ksenia V. Bartseva, Junior Research Fellow and PhD Student, St. Petersburg State University, St. Petersburg, Russia, ORCID: https://orcid.org/0000-0003-4854-726X, e-mail: bartseva.ksenia@gmail.com
