
With self-driving cars, some human will be held responsible in case of an accident, I hope. Why would it be different here? It seems like a responsibility problem, not a technology one.


I'm not talking about formal responsibility here, especially since the enforcement mechanisms for things like war crimes are very weak due to the lack of a single global authority capable of enforcing them (see the ongoing ICC saga). It's about whether people feel personally responsible. AI provides a way to diffuse and redirect the moral responsibility that might otherwise deter them.



