Engagement in Online Learning: a Pilot Study

 

Abstract

Context and relevance. Research using computer vision methods is a novel, promising, and highly relevant direction in education: it opens opportunities for applying neuroscience to the development of scientifically grounded pedagogical approaches aimed at improving the quality of education. Objective. This study aims to determine the relationship between the visual design features of educational video content and the cognitive and emotional components of engagement. Hypothesis. The psychophysiological components of students' cognitive and emotional engagement while viewing educational video content (in the context of online learning) are positively associated with a polylogic format of material presentation in video lectures, the use of concrete examples, and the presence of questions addressed by the lecturer to the audience. Methods and materials. While participants viewed video lectures and completed subsequent tasks, their electrodermal activity and gaze movements were recorded. Electrodermal activity was measured using the NTrend-BIO biobracelet, while gaze coordinates were tracked using the NTrend-ET500 eye tracker. Based on the collected data, standard metrics of emotional engagement, valence changes, attention, and interest were calculated using the "Neurobarometer" software package (developed by AO "Neurotrend"). Results. The experiment revealed a relationship between cognitive engagement and the visual design features of educational video content. Attention metrics were significantly higher among participants who watched video lectures with questions addressed directly to them. Emotional engagement metrics were significantly higher during task completion than during video viewing. Eye-tracking metrics further demonstrated that respondents focused more on video sequences featuring a single lecturer or slides than on dialogic/polylogic formats with multiple instructors. Conclusions.
The pilot study provided preliminary data suggesting that engagement levels are influenced by specific visual design features of educational video content. These findings can be taken into account when developing an "ideal model" for online courses.
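The abstract notes that attention and emotional-engagement metrics were derived from eye-tracking and electrodermal recordings. As a purely illustrative sketch of how such proxies can be computed (the actual "Neurobarometer" algorithms are proprietary; the data formats, the 0.05 µS amplitude threshold, and the formulas below are hypothetical, not the authors' method):

```python
# Illustrative proxies only: these are common, simplified formulas,
# not the proprietary "Neurobarometer" metrics used in the study.

def attention_share(fixations, aoi):
    """Fraction of total fixation time spent inside an area of interest.

    fixations: list of (x, y, duration_ms) tuples;
    aoi: bounding box (x0, y0, x1, y1) in the same screen coordinates.
    """
    total = sum(d for _, _, d in fixations)
    if total == 0:
        return 0.0
    x0, y0, x1, y1 = aoi
    inside = sum(d for x, y, d in fixations
                 if x0 <= x <= x1 and y0 <= y <= y1)
    return inside / total


def scr_peaks_per_minute(eda, fs):
    """Coarse arousal proxy: count upward skin-conductance deflections
    whose amplitude exceeds a fixed threshold (0.05 µS here, an
    arbitrary illustrative value), normalized per minute.

    eda: list of conductance samples (µS); fs: sampling rate (Hz).
    """
    peaks = 0
    rising_from = None  # conductance level where the current rise began
    for prev, cur in zip(eda, eda[1:]):
        if cur > prev:
            if rising_from is None:
                rising_from = prev
        else:
            # Rise ended: count it as a response if large enough.
            if rising_from is not None and prev - rising_from >= 0.05:
                peaks += 1
            rising_from = None
    minutes = len(eda) / fs / 60
    return peaks / minutes if minutes > 0 else 0.0
```

Here `attention_share` would quantify how much fixation time falls on a region such as the lecturer or a slide, and `scr_peaks_per_minute` counts skin-conductance responses as a rough index of emotional arousal.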

General Information


Journal rubric: Educational Psychology

Article type: scientific article

DOI: https://doi.org/10.17759/pse.2025300404

Funding. The study was supported by the TSU Development Program ("Priority 2030") NU 2.3.1.23 IG.

Supplemental data. Datasets available from https://doi.org/10.48612/MSUPE/ezg5-na5u-u6h2

Received 28.10.2024

Revised 10.02.2025

Accepted

Published

For citation: Gorchakova, O.Yu., Filkina, A.V., Larionova, A.V., Tolstova, M.A. (2025). Engagement in Online Learning: a Pilot Study. Psychological Science and Education, 30(4), 56–68. (In Russ.). https://doi.org/10.17759/pse.2025300404

© Gorchakova O.Yu., Filkina A.V., Larionova A.V., Tolstova M.A., 2025

License: CC BY-NC 4.0

References

  1. Barannikov, K.A., Ananin, D.P., Strikun, N.G., Alkanova, O.N., Bayzarov, A.Ye. (2023). Hybrid Learning: Russian and International Practice. Educational Studies Moscow, (2), 33–69. https://doi.org/10.17323/1814-9545-2023-2-33-69
  2. Gritsova, O.A., Tissen, E.V. (2021). Quality Assessment of Online Learning in Regional Higher Education Systems. Economy of Region, 17(3), 929–943. https://doi.org/10.17059/ekon.reg.2021-3-15
  3. Kasatkina, D.A., Kravchenko, A.M., Kupriyanov, R.B., Nekhorosheva, E.V. (2020). Automatic Engagement Detection in Education: Critical Review. Journal of Modern Foreign Psychology, 9(3), 59–68. https://doi.org/10.17759/jmfp.2020090305
  4. Klimenskikh, M.V., Lebedeva, Yu.V., Maltsev, A.V., Savelyev, V.V. (2019). Psychological Factors of Online Learning Efficiency of Students. Perspectives of Science and Education, 42(6), 312–321. https://doi.org/10.32744/pse.2019.6.26
  5. Koroleva, M.V., Luzhin, A.O. (2020). Method for Analyzing Emotional Perception of Audiovisual Content among a User Group. Patent RF, no. 2723732. (In Russ.)
  6. Latanov, A.V., Anisimov, V.N., Boiko, L.A., Galkina, N.V. (2020). Method for Assessing Voluntary Attention Based on Oculomotor Indicators and Amplitude-Frequency Characteristics of Electroencephalogram. Patent RF, no. 2722447. (In Russ.)
  7. Uvarov, A.Yu. (2018). The AI Technologies in Education. Informatics and Education, (4), 14–22.
  8. Aslan, S., Diner, I., Cataltepe, Z., Esme, A. A., Ferens, R., Kamhi, G., Dundar, O., Oktay, E., Soysal, C., Yener, M. (2014). Learner engagement measurement and classification in 1:1 learning. 13th International Conference on Machine Learning and Applications (ICMLA), 545–552. https://doi.org/10.1109/ICMLA.2014.111
  9. Bond, M., Bedenlier, S. (2019). Facilitating student engagement through educational technology: Towards a conceptual framework. Journal of Interactive Media in Education, 1(1), 1–14. https://doi.org/10.5334/jime.528
  10. Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., Kerres, M. (2020). Mapping research in student engagement and educational technology in higher education: A systematic evidence map. International Journal of Educational Technology in Higher Education, 17(1), 1–30. https://doi.org/10.1186/s41239-019-0176-8
  11. Booth, B.M., Bosch, N., D’Mello, S.K. (2023). Engagement detection and its applications in learning: A tutorial & selective review. Proceedings of the IEEE, 111(10), 1398–1422. https://doi.org/10.1109/JPROC.2023.3309560
  12. Buhl-Wiggers, J., Kjærgaard, A., Munk, K. (2023). A scoping review of experimental evidence on face-to-face components of blended learning in higher education. Studies in Higher Education, 48(1), 151–173. https://doi.org/10.1080/03075079.2022.2123911
  13. Buscher, G., Dengel, A., Elst, L.V. (2008). Eye movements as implicit relevance feedback. Extended Abstracts Proceedings of the 2008 Conference on Human Factors in Computing Systems, 2991–2996. https://doi.org/10.1145/1358628.1358796
  14. Dewan, M., Murshed, M., Lin, F. (2019). Engagement detection in online learning: A review. Smart Learning Environments, 6(1), 1–20. https://doi.org/10.1186/s40561-018-0080-z
  15. Fredricks, J.A., Blumenfeld, P.C., Paris, A.H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059
  16. Fredricks, J.A., McColskey, W. (2012). The measurement of student engagement: A comparative analysis of various methods and student self-report instruments. In S.L. Christenson et al. (Eds.), Handbook of Research on Student Engagement (pp. 763–782). Boston, MA: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_37
  17. Fredricks, J.A. (2022). The measurement of student engagement: Methodological advances and comparison of new self-report instruments. In J.A. Fredricks et al. (Eds.), Handbook of Research on Student Engagement (pp. 597–616). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-031-07853-8_29
  18. Krithika, L.B., Lakshmi, P.G.G. (2016). Student emotion recognition system (SERS) for e-learning improvement based on learner concentration metric. Procedia Computer Science, 85, 767–776. https://doi.org/10.1016/j.procs.2016.05.264
  19. Mayer, R.E. (2021). Evidence-based principles for how to design effective instructional videos. Journal of Applied Research in Memory and Cognition, 10(2), 229–240.
  20. Moore, M.G. (1989). Editorial: Three types of interaction. American Journal of Distance Education, 3(2), 1–7. https://doi.org/10.1080/08923648909526659
  21. Noetel, M., Griffith, S., Delaney, O., Sanders, T., Parker, P.D., del Pozo Cruz, B., Lonsdale, C. (2021). Video improves learning in higher education: A systematic review. Review of Educational Research, 91(2), 204–236.
  22. Pomplun, M., Sunkara, S. (2003). Pupil dilation as an indicator of cognitive workload in human-computer interaction. Proceedings of the International Conference on HCI, 1–5. Available at: https://api.semanticscholar.org/CorpusID:1052200
  23. Raina, S., Bernard, L., Taylor, B., Kaza, S. (2016). Using eye-tracking to investigate content skipping: A study on learning modules in cybersecurity. IEEE Conference on Intelligence and Security Informatics (ISI), 261–266. https://doi.org/10.1109/ISI.2016.7745486
  24. Sharma, P., Joshi, S., Gautam, S., Filipe, V., Reis, M. (2019). Student engagement detection using emotion analysis, eye tracking and head movement with machine learning. arXiv:1909.12913, 9 p. https://doi.org/10.48550/arXiv.1909.12913
  25. Skaramagkas, V., Giannakakis, G., Ktistakis, E., et al. (2023). Review of eye tracking metrics involved in emotional and cognitive processes. IEEE Reviews in Biomedical Engineering, 16, 260–277. https://doi.org/10.1109/RBME.2021.3066072
  26. Symonds, J., Kaplan, A., Upadyaya, K., Salmela-Aro, K., Torsney, B., Eccles, J. (2019). Momentary student engagement as a dynamic developmental system. PsyArXiv. https://doi.org/10.31234/osf.io/fuy7p
  27. Tarnowski, P., Kołodziej, M., Majkowski, A., Rak, R.J. (2020). Eye-tracking analysis for emotion recognition. Computational Intelligence and Neuroscience, 2020, 1–13. https://doi.org/10.1155/2020/2909267
  28. Wu, C., Cha, J.S., Sulek, J.E., et al. (2019). Eye-tracking metrics predict perceived workload in robotic surgical skills training. Human Factors: The Journal of the Human Factors and Ergonomics Society, 62(8). https://doi.org/10.1177/0018720819874544
  29. R Core Team (2022). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. Available at: http://www.R-project.org/ (accessed 25.12.2023)

Information About the Authors

Olesya Y. Gorchakova, Junior Researcher at the Center for Cognitive Research and Neuroscience, National Research Tomsk State University, Tomsk, Russian Federation, ORCID: https://orcid.org/0000-0002-7571-0360, e-mail: avendus@mail.ru

Alexandra V. Filkina, Candidate of Science (Psychology), Research Fellow, Center for the Sociology of Education, Institute of Education, National Research Tomsk State University, Tomsk, Russian Federation, ORCID: https://orcid.org/0000-0001-7026-7059, e-mail: lexia@inbox.ru

Anastasia V. Larionova, Candidate of Science (Psychology), Tomsk State University, Tomsk, Russian Federation, ORCID: https://orcid.org/0000-0002-8523-2913, e-mail: anpavlar@mail.ru

Maria A. Tolstova, Candidate of Science (Philology), Associate Professor, Head of the Center for Cognitive Research and Neuroscience, National Research Tomsk State University, Tomsk, Russian Federation, ORCID: https://orcid.org/0009-0008-6442-0860, e-mail: tolstova_11@mail.ru

Contribution of the authors

Gorchakova O.Yu. — development of the research idea; conceptualization of the hypothesis; development of the research methodology; writing and formatting the manuscript; research planning; participation in the interpretation of results

Filkina A.V. — theoretical analysis and literature review; annotation of sources; participation in the development of research instruments; discussion of results; manuscript editing

Larionova A.V. — application of statistical data processing methods; interpretation of cognitive and emotional engagement metrics; writing the “Methods and Materials” section; contribution to the formulation of conclusions

Tolstova M.A. — conducting the experiment; data collection using an eye tracker and a biobracelet; data analysis in the “Neurobarometer” software package; preparation of data visualization

All authors participated in the discussion of the results and approved the final text of the manuscript.

Conflict of interest

The authors declare no conflict of interest.

Ethics statement

The study was reviewed and approved by the Ethics Committee of Tomsk State University of Psychology and Education (report no. 230711_A4_16, 11.07.2023).
