Using Process Data of Task Performance in Creative Thinking Assessment



Creative thinking is an important skill in the modern world, and its assessment with modern digital tools is an increasingly complex methodological task. Including process data of task performance in the assessment model of creative thinking is a promising direction that becomes possible in computer-based testing. Such data allow the processes of creative thinking to be considered in dynamics, which makes the assessment of students' creativity more accurate and multifaceted. The purpose of the study was to determine whether process data of task performance can be used in evaluating creative thinking with an assessment tool in a digital environment. The paper presents an analysis of the work of 823 fourth-grade students who, while completing the assignment, created images in a closed simulation environment designed to assess creative and critical thinking. The process data were analyzed using N-grams of various lengths. As a result, the action sequences of students with different levels of creative thinking were compared, and different test-taker strategies were identified in the creative thinking task as compared with the critical thinking task. Together with information about the level of creativity based on analysis of the created product, process data of task performance improve the understanding of how tasks function through the prism of the task execution process. They also represent a step toward more detailed feedback that can be provided as part of testing.
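The core of the analysis described above is the extraction of N-grams (contiguous subsequences of actions) from each test taker's event log. A minimal sketch of this step is shown below; the action labels are illustrative placeholders, not the actual event codes of the assessment platform or the authors' pipeline.

```python
from collections import Counter

def ngrams(actions, n):
    """Return all contiguous n-grams from a sequence of logged actions."""
    return [tuple(actions[i:i + n]) for i in range(len(actions) - n + 1)]

# Hypothetical action log for one test taker (labels are illustrative).
log = ["draw", "erase", "draw", "erase", "draw", "submit"]

# Count bigrams; frequent short sequences can then be compared across
# groups of students with different levels of creative thinking.
bigram_counts = Counter(ngrams(log, 2))
print(bigram_counts[("draw", "erase")])  # the "draw" -> "erase" transition occurs twice
```

Comparing such frequency profiles across score groups is one common way to identify distinct behavioral strategies in process data.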

General Information

Keywords: creative thinking, process data, computer testing, N-grams

Journal rubric: Developmental Psychology

Article type: scientific article


Funding. The reported study was funded by the Ministry of Education and Science of the Russian Federation (project number 075-15-2022-325 from 25.04.2022).

Received: 15.06.2023


For citation: Tarasov S.V. Using Process Data of Task Performance in Creative Thinking Assessment. Psikhologicheskaya nauka i obrazovanie = Psychological Science and Education, 2023. Vol. 28, no. 4, pp. 63–80. DOI: 10.17759/pse.2023280404. (In Russ., abstr. in Engl.)



Information About the Authors

Sergei V. Tarasov, MA in Psychology, Research Assistant, Institute of Education / Center for Psychometrics and Measurement in Education, National Research University Higher School of Economics, Moscow, Russia
