Spatial Localization of Digital Sound in Scientific Experiment and Practice



Sound localization in space is an important component of auditory perception, involved in segregating concurrent sound streams, perceiving speech in noise, and organizing auditory images. Research over the past century has shown that sound localization relies on interaural differences in the intensity and arrival time of sound waves; spectral distortions produced by the anatomy of the pinnae, head, and torso; dynamic cues (listener head movements); and other factors. However, some scientific and methodological issues, primarily related to the perception of natural sounds and the ecological validity of studies, remain unresolved. The development of digital audio techniques has also opened new areas of research, including the processing of sound to convey spatial information over headphones (achieved using the head-related transfer function, HRTF) and the creation of auditory interfaces. The tasks facing researchers in these areas are to improve the perception of spatial information (by manipulating sound characteristics, cues, or training) and to create sound events that are perceived as object-related, i.e., inextricably linked to the goal of the operator's activity. The perceived quality of events methodology, which makes it possible to identify which properties of the auditory image become most important in human activity and which physical properties of the event they correspond to, can help solve these tasks and increase the ecological validity of research.
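The interaural time difference mentioned above can be illustrated with the classic Woodworth spherical-head approximation — a simplified textbook model (cf. the spherical-head model of Algazi et al.), not the method of any particular study cited here; the head radius and speed of sound below are assumed typical values:

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference (in seconds) for a distant
    source at the given azimuth, using the Woodworth spherical-head model:
    ITD = (a / c) * (theta + sin(theta)), for azimuths from 0 to 90 degrees."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source straight ahead produces no time difference; a source directly
# to one side (90 degrees) produces the maximum ITD, on the order of
# 0.6-0.7 ms for an average adult head.
for azimuth in (0, 30, 60, 90):
    print(azimuth, round(woodworth_itd(azimuth) * 1e6), "microseconds")
```

Such sub-millisecond delays, together with interaural intensity differences and HRTF spectral cues, are the raw material that binaural rendering over headphones must reproduce.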

General Information

Keywords: auditory perception, sound localization, head-related transfer function (HRTF), auditory interfaces, sonification, perceived quality of events

Journal rubric: Psychology of Digital Reality

Article type: scientific article


Funding. The study was funded by State assignment no. 0138-2023-0006.

Received: 03.05.2023


For citation: Razvaliaeva A.Y., Nosulenko V.N. Spatial Localization of Digital Sound in Scientific Experiment and Practice. Eksperimental'naâ psihologiâ = Experimental Psychology (Russia), 2023. Vol. 16, no. 2, pp. 20–35. DOI: 10.17759/exppsy.2023160202. (In Russ., abstr. in Engl.)


References

  1. Basul I.A., Obelets V.S. Opyt registratsii HRTF v reverberatsionnykh usloviyakh [Registering HRTF in reverberant conditions]. XIV Vserossiiskaya mul'tikonferentsiya po problemam upravleniya (MKPU-2021): materialy XIV mul'tikonferentsii [Proceedings of the 14th All-Russian multidisciplinary conference on the issues of management] (Divnomorskoe, Gelendzhik, 27 September — 2 October, 2021): in 4 vols. Rostov-na-Donu; Taganrog: Publ. Yuzhnyi federal'nyi universitet, 2021. Vol. 3, pp. 26—28. (In Russ.).
  2. Blauert J. Prostranstvennyi slukh [Spatial Hearing: The Psychophysics of Human Sound Localization]. Moscow: Svyaz', 1979. 220 p. (In Russ.).
  3. Zavalova N.D., Lomov B.F., Ponomarenko V.A. Obraz v sisteme psikhicheskoi regulyatsii deyatel'nosti [Image in the system of mental regulation of activity]. Moscow: Nauka, 1986. 174 p. (In Russ.).
  4. Nosulenko V.N. Psikhologiya slukhovogo vospriyatiya [Psychology of auditory perception]. Moscow: Nauka, 1988. 216 p. (In Russ.).
  5. Nosulenko V.N. “Ekologizatsiya” psikhoakusticheskogo issledovaniya: osnovnye napravleniya [Ecologization of psychoacoustic studies: The main approaches]. In V.N. Nosulenko (ed.). Problemy ekologicheskoi psikhoakustiki [Issues of Ecological Psychoacoustics]. Moscow: IPAN, 1991, pp. 8—27. (In Russ.).
  6. Nosulenko V.N. Psikhofizika vospriyatiya estestvennoi sredy. Diss. dokt. psikhol. nauk. [Psychophysics of the Perception of the Natural Environment. Dr. Sci. (Psychology) diss.]. Moscow: IP RAN, 2004. 323 p. (In Russ.).
  7. Nosulenko V.N. Psikhofizika vospriyatiya estestvennoi sredy. Problema vosprinimaemogo kachestva [Psychophysics of the Perception of the Natural Environment. The Perceived Quality Problem]. Moscow: IP RAN, 2007. 400 p. (In Russ.).
  8. Nosulenko V.N. Zvuk v interfeisakh vzaimodeistviya cheloveka i tekhniki [Sound in the interfaces for human-machine interaction]. In V.I. Panov (ed.). Ekopsikhologicheskie issledovaniya-6: ekologiya detstva i psikhologiya ustoichivogo razvitiya [Ecopsychological Studies-6: Ecology of Childhood and Psychology of Stable Development]. Moscow: FGBNU “Psikhologicheskii institut RAO”; Kursk: Universitetskaya kniga, 2020, pp. 155—159. (In Russ.).
  9. Nosulenko V.N. Integration Issues of Qualitative and Quantitative Methods in Psychological Research. Experimental Psychology, 2021. Vol. 14 (3), pp. 4—16. DOI:10.17759/exppsy.2021140301. (In Russ., abstr. in Engl.).
  10. Nosulenko V.N., Basul I.A., Zybin E.Yu., Lelikov M.A. Prostranstvennoe razdelenie informatsii v samoletnom peregovornom ustroistve [Spatial separation of information in the aircraft communication device]. Izvestiya YuFU. Tekhnicheskie nauki = Izvestiya SFedU. Engineering Sciences, 2021, no. 7, pp. 109—119. DOI:10.18522/2311-3103-2021-7-109-119 (In Russ.).
  11. Nosulenko V.N., Starikova I.V. Sravnenie kachestva zvuchaniya muzykal'nykh fragmentov, razlichayushchikhsya sposobom kodirovaniya zapisi [Comparison of sounding of musical fragments that differ in the way of encoding]. Eksperimental'naya psikhologiya = Experimental Psychology, 2009. Vol. 2 (3), pp. 19—34. (In Russ., abstr. in Engl.).
  12. Nosulenko V.N., Kharitonov A.N. Zhizn' sredizvukov. Psikhologicheskie rekonstruktsii [Life among Sounds. Psychological Reconstructions]. Moscow: Institut psikhologii RAN Publ., 2018. 422 p. (In Russ.).
  13. Razvalyaeva A.Yu. Sonifikatsiya kak sredstvo neverbal'noi kommunikatsii: klassifikatsiya metodov i sposoby primeneniya [Sonification as a means of nonverbal communication: Classification of methods and applications]. Istoriya, sovremennost' i perspektivy razvitiya psikhologii v sisteme Rossiiskoi Akademii nauk: Materialy Mezhdunarodnoi yubileinoi nauchnoi konferentsii, posvyashchennoi 50-letiyu sozdaniya Instituta psikhologii RAN (g. Moskva, 16—18 noyabrya 2022 g.) [History, the present and the perspectives of the development of psychology in the Russian Academy of Sciences: Proceedings of the International anniversary scientific conference celebrating 50 years since the establishment of the Institute of Psychology]. Moscow: IP RAN, 2022, pp. 238—240. DOI:10.38098/conf_22_0451 (In Russ.).
  14. Strutt J.W. (Baron Rayleigh). Teoriya zvuka [The Theory of Sound]. Vol. 2. Moscow; Leningrad: OGIZ, Gostekhizdat, 1944. 477 p. (In Russ.).
  15. Cherry C. O binaural'nom vospriyatii zvukov [On the binaural perception of sounds]. In G.D. Smirnov (ed.). Teoriya svyazi v sensornykh sistemakh [Communication theory in sensory systems]. Moscow: Mir, 1964, pp. 321—337. (In Russ.).
  16. Algazi V.R., Avendano C., Duda R.O. Estimation of a spherical-head model from anthropometry. Journal of the Audio Engineering Society, 2001. Vol. 49 (6), pp. 472—478.
  17. Bălan O., Moldoveanu A., Moldoveanu F., Morar A., Ivaşcu S. Perceptual feedback training for improving spatial acuity and resolving front-back confusion errors in virtual auditory environments. 40th International Conference on Telecommunications and Signal Processing (TSP) (Barcelona, Spain, 5—7 July 2017). Brno: Brno University of Technology, 2017, pp. 334—337. DOI:10.1109/TSP.2017.8075999
  18. Baldwin C.L. Auditory Cognition and Human Performance: Research and Applications. Boca Raton, FL: CRC Press, 2012. 314 p.
  19. Bilinski P., Ahrens J., Thomas M.R.P., Tashev I.J., Platt J.C. HRTF magnitude synthesis via sparse representation of anthropometric features. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (Florence, Italy, 4—9 May 2014). Piscataway, NJ: IEEE, 2014, pp. 4468—4472.
  20. Bouchara T., Bara T.-G., Weiss P.-L., Guilbert A. Influence of vision on short-term sound localization training with non-individualized HRTF. Proceedings of the EAA Spatial Audio Signal Processing Symposium (Paris, France, 6—7 September 2019). Paris: Sorbonne Université, 2019, pp. 55—60. DOI:10.25836/sasp.2019.04
  21. Bregman A.S. Auditory streaming: competition among alternative organizations. Perception and Psychophysics, 1978. Vol. 23 (5), pp. 391—398. DOI:10.3758/BF03204141
  22. Bronkhorst A.W. The cocktail party phenomenon: A review of research on speech intelligibility in multiple-talker conditions. Acta Acustica united with Acustica, 2000. Vol. 86 (1), pp. 119—128.
  23. Brungart D.S., Simpson B.D. Cocktail party listening in a dynamic multitalker environment. Perception & Psychophysics, 2007. Vol. 69 (1), pp. 71—99.
  24. Brungart D.S., Simpson B.D. Design, validation, and in-flight evaluation of an auditory attitude indicator based on pilot-selected music. Proceedings of the 14th International Conference on Auditory Display (Paris, France, 24—27 June, 2008). Atlanta, GA: Georgia Institute of Technology, ICAD, 2008, pp. 24—27.
  25. Brungart D.S., Simpson B.D., Dallman R.C., Romigh G., Yasky R., Raquet J. A comparison of head-tracked and vehicle-tracked virtual audio cues in an aircraft navigation task. Proceedings of the 13th International Conference on Auditory Display (Montreal, Canada, 26—29 June, 2007). Atlanta, GA: Georgia Institute of Technology, ICAD, 2007, pp. 32—37.
  26. Gelfand S.A. Hearing: An Introduction to Psychological and Physiological Acoustics. 5th Edition. London: Informa Healthcare, 2010. 311 p.
  27. Geronazzo M., Peruch E., Prandoni F., Avanzini F. Applying a single-notch metric to image-guided head-related transfer function selection for improved vertical localization. Journal of the Audio Engineering Society, 2019. Vol. 67 (6), pp. 414—428. DOI:10.17743/jaes.2019.0010
  28. Green D.M. An Introduction to Hearing. Hillsdale, NJ: Lawrence Erlbaum, 1976. 353 p.
  29. Gulick W.L. Hearing: Physiology and Psychophysics. New York: Oxford University Press, 1971. 258 p.
  30. Hansberger J.T., Peng C., Blakely V., Meacham S., Cao L., Diliberti N. A multimodal interface for virtual information environments. In J.Y.C. Chen and G. Fragomeni (eds.) Virtual, Augmented and Mixed Reality. Multimodal Interaction. HCII 2019. Lecture Notes in Computer Science, vol. 11574. Cham: Springer, 2019, pp. 59—70.
  31. Jiang J., Xie B., Mai H., Liu L., Yi K., Zhang C. The role of dynamic cue in auditory vertical localization. Applied Acoustics, 2019. Vol. 146, pp. 398—408. DOI:10.1016/j.apacoust.2018.12.002
  32. Kearney G., Gorzel M., Rice H., Boland F. Distance perception in interactive virtual acoustic environments using first and higher order ambisonic sound fields. Acta Acustica united with Acustica, 2012. Vol. 98 (1), pp. 61—71. DOI:10.3813/AAA.918492
  33. Larsen C.H., Lauritsen D.S., Larsen J.J., Pilgaard M., Madsen J.B. Differences in human audio localization performance between a HRTF- and a non-HRTF audio system. AM'13: Proceedings of the 8th Audio Mostly Conference (Piteå, Sweden, 18—20 September, 2013). New York: ACM Press, 2013, pp. 1—8. DOI:10.1145/2544114.2544118
  34. Lee G.W., Kim H.K. Personalized HRTF modeling based on deep neural network using anthropometric measurements and images of the ear. Applied Sciences, 2018. Vol. 8 (11), p. 2180. DOI:10.3390/app8112180
  35. Letowski T.R., Letowski S.T. Auditory Spatial Perception: Auditory Localization. Report ARL-TR-6016. U.S. Army Research Laboratory, 2012. 163 p.
  36. Lotto A., Holt L. Psychology of auditory perception. Wiley Interdisciplinary Reviews: Cognitive Science, 2011. Vol. 2 (5), pp. 479—489. DOI:10.1002/wcs.123
  37. Mendonça C., Campos G., Dias P., Vieira J., Ferreira J.P., Santos J.A. On the improvement of localization accuracy with non-individualized HRTF-based sounds. Journal of the Audio Engineering Society, 2012. Vol. 60 (10), pp. 821—830.
  38. Nicol R. Représentation et perception des espaces auditifs virtuels: Mémoire d'Habilitation à Diriger des Recherches [Representation and perception of virtual auditory spaces: Habilitation thesis]. Le Mans; Laval: Université du Maine, 2010. 287 p. (In French).
  39. Nosulenko V. Problems of ecological psychoacoustics. Proceedings of the Sixth Annual Meeting of the International Society for Psychophysics. Würzburg, 1990, pp. 135—139.
  40. Oberem J., Richter J.G., Setzer D., Seibold J., Koch I., Fels J. Experiments on localization accuracy with non-individual and individual HRTFs comparing static and dynamic reproduction methods [Electronic resource]. bioRxiv, 2020. Available at: (Accessed 26.09.2022)
  41. Parise C.V., Spence C. Audiovisual crossmodal correspondences and sound symbolism: A study using the implicit association test. Experimental Brain Research, 2012. Vol. 220 (3—4), pp. 319—333. DOI:10.1007/s00221-012-3140-6
  42. Pastore M.T., Zhou Y., Yost W.A. Cross-modal and cognitive processes in sound localization. In J. Blauert, J. Braasch (eds.) The Technology of Binaural Understanding. Modern Acoustics and Signal Processing. Cham: Springer, 2020, pp. 315—350. DOI:10.1007/978-3-030-00386-9_12
  43. Rajendran V.G., Gamper H. Spectral manipulation improves elevation perception with non-individualized head-related transfer functions. The Journal of the Acoustical Society of America, 2019. Vol. 145 (3), pp. EL222—EL228. DOI:10.1121/1.5093641
  44. Simpson B.D., Brungart D.S., Dallman R.C., Yasky R.J., Romigh G.D. Flying by ear: Blind flight with a music-based artificial horizon. Proceedings of the Human Factors and Ergonomics Society 52nd Annual Meeting (New York City, USA, 22—26 September, 2008). Santa Monica, CA: Human Factors & Ergonomics Society, 2008, pp. 6—9.
  45. Stitt P., Picinali L., Katz B.F.G. Auditory accommodation to poorly matched non-individual spectral localization cues through active learning. Scientific Reports, 2019. Vol. 9, p. 1063. DOI:10.1038/s41598-018-37873-0
  46. Terai K., Kakuhari I. HRTF calculation with less influence from 3-D modeling error: Making a physical human head model from geometric 3-D data. Acoustical Science and Technology, 2003. Vol. 24 (5), pp. 333—334. DOI:10.1250/ast.24.333
  47. Towers J., Burgess-Limerick R., Riek S. Improving 3-D audio localisation through the provision of supplementary spatial audio cues. The Ergonomics Open Journal, 2012. Vol. 5 (1), pp. 1—9. DOI:10.2174/1875934301205010001
  48. Warusfel O. Listen HRTF database [Electronic resource]. Available at: (Accessed 05.2023).
  49. Wenzel E.M., Miller J.D., Abel J.S. Sound Lab: A real-time, software-based system for the study of spatial hearing. Proceedings of the 108th Convention of the Audio Engineering Society (Paris, France, 19—22 February, 2000). New York: Audio Engineering Society, 2000. P. 5140.
  50. Wright B.A., Zhang Y. A review of learning with normal and altered sound-localization cues in human adults: Revisión del aprendizaje en adultos con claves de localización Sonora normales o alteradas. International Journal of Audiology, 2006. Vol. 45 (S1), pp. 92—98. DOI:10.1080/14992020600783004
  51. Zahorik P., Bangayan P., Sundareswaran V., Wang K., Tam C. Perceptual recalibration in human sound localization: Learning to remediate front-back reversals. The Journal of the Acoustical Society of America, 2006. Vol. 120 (1), pp. 343—359. DOI:10.1121/1.2208429
  52. Zhang W., Samarasinghe P.N., Chen H., Abhayapala T.D. Surround by sound: A review of spatial audio recording and reproduction. Applied Sciences, 2017. Vol. 7 (5), pp. 532—539. DOI:10.3390/app7050532
  53. Ziemer T., Nuchprayoon N., Schultheis H. Psychoacoustic sonification as user interface for human-machine interaction. International Journal of Informatics Society, 2020. Vol. 12 (1), pp. 3—16.

Information About the Authors

Anna Y. Razvaliaeva, PhD in Psychology, Researcher, Laboratory of Cognitive Processes and Mathematical Psychology, Institute of Psychology of the Russian Academy of Sciences, Moscow, Russia, ORCID:, e-mail:

Valeriy N. Nosulenko, Doctor of Psychology, Chief Researcher, Laboratory of Cognitive Processes and Mathematical Psychology, Institute of Psychology, Russian Academy of Sciences, Chief Researcher, Institute of Experimental Psychology, Moscow State University of Psychology and Education, Moscow, Russia, ORCID:, e-mail:


