Discussion about this post

Jurgen Gravestein:

Isn’t it peculiar how suddenly superintelligence has become synonymous with existential risk?

All I see is men fencing with hypotheticals and vague thought experiments, letting their imaginations take them for a ride, warning the world about some future generation of this technology.

I’m sorry but I don’t buy it.

Andrew Ng said it best in his post on The Batch: “When I try to evaluate how realistic these arguments are, I find them frustratingly vague and nonspecific. They boil down to “it could happen.” Trying to prove it couldn’t is akin to proving a negative. I can’t prove that AI won’t drive humans to extinction any more than I can prove that radio waves emitted from Earth won’t lead space aliens to find us and wipe us out.”

sean pan:

Fear is a very reasonable response when you don’t want to die. We evolved it for a reason.
