I think I’ve avoided this topic long enough. The questions the industry is often faced with are: How will the vehicle decide whom to kill: the driver or other people? Will the vehicle have the knowledge to decide which people’s lives are more valuable? Should the vehicle aim for the cyclist with a helmet (who is protected) instead of the cyclist without one? To date, Germany is the only country whose government has taken a stance on this topic. As stated in this article, the country’s transportation minister outlined three rules as a starting point for future laws:
- Property damage always takes precedence over personal injury.
- There must be no classification of people, for example, based on their size, age, and the like.
- If anything goes wrong, the manufacturer is liable.
Interestingly, automakers and technology developers have been pretty quiet on this issue. I’m not surprised. I believe the media and academics are making this into a much larger issue than it actually is (see some examples of the hype here, here, and here). When was the last time you were driving and faced a decision to kill yourself versus someone else? And if you are among the very small group that has faced that situation, how did you make the decision? Was there even time to make a decision?
MIT has developed a Moral Machine to gather a “human perspective on moral decisions made by machine intelligence, such as self-driving cars.” The reality is that the developers working on driverless vehicles are refining the technology to deal with pedestrians, cyclists, bad weather, poor pavement markings, and construction sites. These challenges are everyday occurrences, yet our society has put far more emphasis on the moral decisions. Is that emphasis necessary?