Moral psychology of nursing robots: Exploring the role of robots in dilemmas of patient autonomy
Abstract
Artificial intelligences (AIs) are widely used in tasks ranging from transportation to healthcare and the military, but it is not yet known how people prefer them to act in ethically difficult situations. In five studies (an anthropological field study, n = 30, and four experiments, total n = 2150), we presented people with vignettes in which a human or an advanced robot nurse is ordered by a doctor to forcefully medicate an unwilling patient. Participants were more accepting of a human nurse's than a robot nurse's forceful medication of the patient, and more accepting of (human or robot) nurses who respected patient autonomy than of those who followed the orders to forcefully medicate (Study 2). The findings were robust against the perceived competence of the robot (Study 3), moral luck (whether the patient lived or died afterwards; Study 4), and command chain effects (Study 5; fully automated supervision or not). Thus, people prefer robots capable of disobeying orders in favour of abstract moral principles such as valuing personal autonomy. Our studies fit into a new era of research in which moral psychological phenomena no longer reflect only interactions between people, but also between people and autonomous AIs.