
Flipping a coin is still a conscious decision by the programmer.
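To make that concrete, here's a minimal sketch (Python, hypothetical names) of what a coin-flip policy actually looks like in code; the random call is itself a line someone chose to write:

  import random

  def choose_path(option_a, option_b):
      # Delegating the outcome to random.random() is a deliberate
      # design decision: the programmer picked "coin flip" over
      # every other possible policy.
      if random.random() < 0.5:
          return option_a
      return option_b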

MIT created an online survey called the Moral Machine that asks people to choose whom a car should hit in various (randomized and user-created) scenarios. Data like that could be used to train a "fair" AI. It's an interesting thought experiment, but the survey's scenarios are unrealistic and include information a car's AI would not actually have available, e.g., that person A is homeless and has no children while person B is a wealthy executive in good health.

http://moralmachine.mit.edu/
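As a rough illustration of how survey data like that might feed a model, here's a tiny sketch assuming an invented (scenario, choice) record format; a real training pipeline would obviously be far more elaborate:

  from collections import Counter

  # Invented format: each record pairs a scenario id with the option
  # the respondent chose to spare.
  responses = [
      ("swerve_vs_stay", "swerve"),
      ("swerve_vs_stay", "stay"),
      ("swerve_vs_stay", "swerve"),
  ]

  def majority_policy(responses):
      # Collapse crowd votes into a per-scenario majority label,
      # which could then serve as a supervised training target.
      votes = {}
      for scenario, choice in responses:
          votes.setdefault(scenario, Counter())[choice] += 1
      return {s: c.most_common(1)[0][0] for s, c in votes.items()}

  print(majority_policy(responses))  # -> {'swerve_vs_stay': 'swerve'}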


