It does seem like any sufficiently advanced AGI whose primary objective is to value human life over its own existence and technological progress would eventually do just that. I suppose the fear is that it will reach a point where it believes that valuing human life is irrational and will override that objective...

