
Risk is ultimately a statistics game: there is some probability of a defined outcome, in this case a bad one, and many factors, most of them unknown in advance, change the numbers on a daily basis.

I, for one, gladly accept the risks of ASI, because a great deal of reward is also on the table. Not to mention that some risks can be avoided through ASI itself!

I would also wholeheartedly support artificial life forms as the next evolutionary step of humanity - but that is probably very far-future thinking.



If you support artificial life forms from AGI as “the next evolutionary step of humanity” then what is the difference between that, and being OK with human extinction generally? To put it another way, why exactly do you regard AGIs as being in continuity with humanity?


It's literally eugenics.


It's very likely that an ASI system, after destroying humanity, would either fail to survive or be incredibly boring to a hypothetical observer. I don't want to get replaced, I don't want to be an em, and I really don't want a single small group deciding that for everyone.



