I read an article today about the future ethics of self-driving cars. MIT has been running a survey to find out what people think a machine should do when it is confronted with two impossible choices. The Moral Machine survey presents a series of scenarios in which a car has to decide, for example, between killing its passengers or pedestrians. Over 2.5 million people from 130 countries have taken the test. Aside from some classic findings (crash into old people rather than children, choose the group with the fewest people, save women over men), it's interesting that people favour certain social categories over others, or law-abiding people over rebels.

Although the test (and its findings) is interesting, I don't think this is the right way to solve the problem of self-driving cars. If the car has to choose between two bad outcomes, then rather than trying to find a "right" decision it should let fate decide: the car randomly picks a course of action. Sometimes even our machines need to accept their limits.
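As a minimal sketch of that idea (the function and maneuver names below are made up for illustration, not any real vehicle API):

    import random

    def decide_maneuver(candidate_maneuvers):
        # Every remaining option is equally bad, so instead of
        # ranking lives we let chance pick one.
        return random.choice(candidate_maneuvers)

    # Example: two equally unacceptable outcomes remain after all
    # safe options have been ruled out.
    options = ["swerve_towards_pedestrians", "stay_course_towards_barrier"]
    print(decide_maneuver(options))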