Human decision-making biases in the moral dilemmas of autonomous vehicles

Frank, Chrysochou, Mitkidis, & Ariely (2019). Human decision-making biases in the moral dilemmas of autonomous vehicles. Scientific Reports, 9(1), 13080.

Abstract

The development of artificial intelligence has led researchers to study the ethical principles that should guide machine behavior. The challenge in building machine morality based on people's moral decisions, however, is accounting for the biases in human moral decision-making. In seven studies, this paper investigates how people's personal perspectives and decision-making modes affect their decisions in the moral dilemmas faced by autonomous vehicles. Moreover, it determines the variations in people's moral decisions that can be attributed to the situational factors of the dilemmas. The reported studies demonstrate that people's moral decisions, regardless of the presented dilemma, are biased by their decision-making mode and personal perspective. Under intuitive moral decisions, participants shift more towards a deontological doctrine by sacrificing the passenger instead of the pedestrian. In addition, once the personal perspective is made salient, participants preserve the lives of that perspective, i.e., the passenger shifts towards sacrificing the pedestrian, and vice versa. These biases in people's moral decisions underline the social challenge in the design of a universal moral code for autonomous vehicles. We discuss the implications of our findings and provide directions for future research.

Links

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6739396
http://www.ncbi.nlm.nih.gov/pubmed/31511560
http://dx.doi.org/10.1038/s41598-019-49411-7
