Here Lies Humanity, Dead by Fancy Autocomplete

Adam Becker and I discuss everything that annoyed us about Eliezer Yudkowsky's new book

Adam did try to warn me.

Last month, The Atlantic published Adam Becker’s scathing review of Eliezer Yudkowsky and Nate Soares’s new book, If Anyone Builds It, Everyone Dies. (“The Useful Idiots of AI Doomsaying.”) It’s a wonderfully pointed review of a book that deserved it.

I decided to read the book anyway. I know, I know… I should stop doing this. But livetweeting hate-read reaction threads on Bluesky has kind of become a thing for me. Everyone needs hobbies.

Adam is the author of More Everything Forever: AI Overlords, Space Empires, and Silicon Valley’s Crusade to Control the Fate of Humanity, which is probably the best book I’ve read this year. (I reviewed it here.)

He’s also a friend. So, in lieu of turning my Bluesky review thread into a book review post, I figured it’d be more fun to record a video conversation with Adam where we discussed the book.

The recording is above. This is a new format for me. I thought it was fun.

Some highlights:

  • Rationalism is to Silicon Valley as Scientology is to Hollywood. I think this is my best joke on the topic. Adam doesn’t think it’s funny, because it’s just plain true. Repeat exposure to the LessWrong discussion boards is functionally indistinguishable from mercury poisoning. (Rationalism isn’t exactly a cult, but it’s not not a cult, y’know?…)

  • The basic problem is in part 1, where they can’t define intelligence, but just sort of handwave it away, “trust me, bro”-style. I know academic discourse can be exhausting, but defining your terms and evaluating their limitations is actually pretty important. Otherwise you end up building intellectual sandcastles and believing you’ve poured concrete.

  • At least it is short. If you’re going to hate-read a book, hate-read a short book. Eliezer Yudkowsky’s editors tried their best. Ah, well, nevertheless.

  • Adam imagines an alternate timeline, a better world, where Yudkowsky just becomes the mid-grade scifi author he clearly wants to be. I suspect, even in that world, he still goes down the L. Ron Hubbard path and starts his own cult. 🤷

  • There’s a blustering style to the book which is very discussion-board coded. As Adam puts it, it has the feel of a precocious teenager who never grew out of their worst habits and just decided “this must be how the world works.”

  • Yudkowsky gets futurism wrong in a way that drives both of us just completely bonkers. Come on, Eliezer. You’ve been playing this game for several decades. Please learn the goddamn rules.

And, if you haven’t yet, you should buy Adam’s book. It’s really good.
