The journal Science has published an article on the behavior of self-driving cars in situations where harm to humans is unavoidable. These are the cases in which the AI faces a choice: hit a group of pedestrians crossing the road, which would likely kill them, or swerve into a ditch with fatal consequences for the driver.
No matter how “smart” the car is, any choice it makes will inevitably entail casualties. The autopilot has no clear answer to what it should do in such situations: apparently, the decision is left to the algorithm’s developers, who may well make the choice in favor of the consumer.
However, deciding whose life matters more in an emergency — that of the pedestrians or that of the driver and passengers — is difficult even for people themselves. The study showed that people take a utilitarian approach to such questions of ethics and morality.
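The utilitarian rule described here can be reduced to a very simple decision procedure: pick whichever action is expected to cost the fewest lives. The sketch below is purely illustrative; the action names and casualty counts are hypothetical, not taken from the study.

```python
# Hypothetical sketch of a utilitarian choice rule: the autopilot
# selects the action with the fewest expected deaths.

def choose_action(options):
    """Given a mapping of action -> expected deaths, return the
    action that minimizes expected deaths."""
    return min(options, key=lambda action: options[action])

# Illustrative scenario: ten pedestrians on the road versus
# one driver if the car swerves into the ditch.
options = {"stay_on_road": 10, "swerve_into_ditch": 1}
print(choose_action(options))  # swerve_into_ditch
```

As the survey results below show, the tension is not in computing this minimum but in whether buyers accept riding in a car that applies it to them.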
Thus, 76% of respondents agreed that a self-driving car should swerve aside to avoid the deaths of ten pedestrians. But when those same respondents were asked whether they would be willing to ride in a vehicle programmed that way, approval dropped by a third.
“Most people want to live in a world where cars minimize casualties,” said MIT professor and study co-author Iyad Rahwan. “But everybody wants their own car to protect them at all costs.”