
The people setting the parameters may have hidden agendas. Or even not-so-hidden ones, like "make me rich at anyone's expense".


I think a much more likely cause would be biased data.

If you trained against a dataset of all police stops in my state (CT) to predict which vehicles should be stopped to maximize drug finds, you'd be including data from multiple large departments that have been under federal investigation for racial discrimination (and were found guilty).

I don't think the hidden agenda would be intentional, but there would be a hidden agenda.
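Here's a minimal sketch of that mechanism, using synthetic data (purely illustrative, not the actual CT dataset, and all rates are made up). Drugs are only *found* when a vehicle is stopped, so a department that stops one group far more often generates far more positive labels for that group even if the true carry rate is identical. A model trained on "drugs found" then recommends stopping that group more, reproducing the discrimination as if it were predictive signal:

  import numpy as np
  from sklearn.linear_model import LogisticRegression

  rng = np.random.default_rng(0)
  n = 100_000

  group = rng.integers(0, 2, n)           # 1 = over-policed group (illustrative)
  carrying = rng.random(n) < 0.05         # true carry rate identical for both groups

  stop_rate = np.where(group == 1, 0.40, 0.05)  # biased stop decisions in the data
  stopped = rng.random(n) < stop_rate
  drugs_found = carrying & stopped        # finds are only observed on stops

  X = group.reshape(-1, 1).astype(float)
  model = LogisticRegression().fit(X, drugs_found)

  # Positive coefficient on group membership: the model "learns" that stopping
  # group 1 maximizes finds, even though carry rates were equal by construction.
  print("coef on group:", model.coef_[0][0])
  print("predicted find prob, group 0 vs 1:",
        model.predict_proba([[0.0], [1.0]])[:, 1])

Nobody had to intend anything here; the selection bias in who gets searched does all the work.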



