Whom We Trust More: AI-driven vs. Human-driven Economic Decision-Making

Abstract

AI, as a new direction in the study of human-computer interaction, calls for a fresh look at trust as a phenomenon. In our study, we examine trust in the context of economic behavior. The study was conducted in two stages. At the first stage, through interviews, we identified the main factors of trust and mistrust in AI, as well as the specific factors of trust in AI in economic decisions. We also identified a subjective indicator of the level of trust in an advisor's recommendations: the participant's economic activity when performing the recommended action. At the second stage, an experiment was carried out. Participants were asked to play a stock exchange game, the goal of which was to make money by buying and selling shares. During the game they had the option to ask for advice: for the experimental group the advisor was an AI, while for the control group it was a human (an expert in trading). Analysis of 800 economic decisions showed that economic activity during the game was higher among participants in the control group, who followed the advice of the human (t = 3.646, p < 0.001). The study yielded three main conclusions: 1) the level of trust in advice in an economic decision can be expressed as economic activity; 2) the level of trust in an economic recommendation depends on whether the recommendation is made by a human or by an AI; 3) specific factors of trust in economic decisions were identified: the individualized nature of the advice and the speed of the requested solution.
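The reported group comparison (t = 3.646, p < 0.001) corresponds to an independent-samples t-test on economic activity. Below is a minimal sketch of such a comparison, assuming hypothetical per-participant activity scores for the human-advised and AI-advised groups; the variable names and simulated data are illustrative only and are not the authors' data.

```python
# Illustrative sketch: two-sample t-test comparing economic activity between
# the human-advised (control) and AI-advised (experimental) groups.
# The data below are simulated for demonstration; the study's actual data
# are not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant "economic activity" scores, e.g. how many
# recommended actions a participant actually carried out during the game.
human_advised = rng.normal(loc=12.0, scale=3.0, size=40)  # control group
ai_advised = rng.normal(loc=9.5, scale=3.0, size=40)      # experimental group

# Classic independent-samples (Student's) t-test with equal variances assumed.
t_stat, p_value = stats.ttest_ind(human_advised, ai_advised)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```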

General Information

Keywords: trust, artificial intelligence, economic behavior, decision support systems (DSS)

Journal rubric: Psychology of Digital Reality

Article type: scientific article

DOI: https://doi.org/10.17759/exppsy.2023160206

Received: 10.12.2021

Accepted:

For citation: Vinokurov F.N., Sadovskaya E.D. Whom We Trust More: AI-driven vs. Human-driven Economic Decision-Making. Eksperimental'naâ psihologiâ = Experimental Psychology (Russia), 2023. Vol. 16, no. 2, pp. 87–100. DOI: 10.17759/exppsy.2023160206. (In Russ., abstr. in Engl.)

Information About the Authors

Fedor N. Vinokurov, PhD in Psychology, Senior Researcher, Department of Social Psychology, Lomonosov Moscow State University, Moscow, Russia, ORCID: https://orcid.org/0000-0001-8302-374X, e-mail: VinokurovFN@my.msu.ru

Ekaterina D. Sadovskaya, Postgraduate, Department of Social Psychology, Lomonosov Moscow State University, Moscow, Russia, ORCID: https://orcid.org/0000-0002-7530-0097, e-mail: ed.sadovskaya@gmail.com
