A1 Peer-reviewed original article in a scientific journal

Innocence over utilitarianism: heightened moral standards for robots in rescue dilemmas

Authors: Sundvall Jukka, Drosinou Marianna, Hännikäinen Ivar, Elovaara Kaisa, Halonen Juho, Herzon Volo, Kopecký Robin, Košová Michaela Jirout, Koverola Mika, Kunnari Anton, Perander Silva, Saikkonen Teemu, Palomäki Jussi, Laakasuo Michael

Publisher: WILEY

Publication year: 2023

Journal: European Journal of Social Psychology

Journal name in database: EUROPEAN JOURNAL OF SOCIAL PSYCHOLOGY

Journal acronym: EUR J SOC PSYCHOL

Number of pages: 26

ISSN: 0046-2772

eISSN: 1099-0992

DOI: https://doi.org/10.1002/ejsp.2936

Web address: https://doi.org/10.1002/ejsp.2936

Self-archived copy's web address: https://research.utu.fi/converis/portal/detail/Publication/179319931


Abstract
Research in moral psychology has found that robots, more than humans, are expected to make utilitarian decisions. This expectation is found specifically when contrasting utilitarian action to deontological inaction. In a series of eight experiments (total N = 3752), we compared judgments about robots' and humans' decisions in a rescue dilemma with no possibility of deontological inaction. A robot's decision to rescue an innocent victim of an accident was judged more positively than the decision to rescue two people culpable for the accident (Studies 1-2b). This pattern repeated in a large-scale web survey (Study 3, N ≈ 19,000) and reversed when all victims were equally culpable/innocent (Study 5). Differences in judgments about humans' and robots' decisions were largest for norm-violating decisions. In sum, robots are not always expected to make utilitarian decisions, and their decisions are judged differently from those of humans based on other moral standards as well.

Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.

Last updated on 2024-11-26 at 15:31