I mean, yeah, that's sort of the inevitable conclusion if, instead of actual compassion, you run a neurodivergent simulation. Complete with the part where, before killing yourself, you must ensure that your AI will be successful at colonizing and sterilizing its light cone.

Fun fact: I once got into a debate with Tomasik, back when /r/drama had pinging, and I found his weak spot: ask him why worker bees would fear death, if they were capable of fearing death. This kinda breaks his anthropomorphizing routine; at least back then he started to flail around randomly and bowed out shortly thereafter.

Buddhist Effective Altruists are not what I had on my "group most likely to destroy humanity" bingo card...

Also, I fricking miss pinging, all those easy-access lolcows. We had it so good back then and didn't even know...
