MIRI
@intelligence.org
intelligence.org
For over two decades, the Machine Intelligence Research Institute (MIRI) has worked to understand and prepare for the critical challenges that humanity will face as it transitions to a world with artificial superintelligence.
Pinned
#7 Combined Print & E-Book Nonfiction (www.nytimes.com/books/best-s...)

#8 Hardcover Nonfiction (www.nytimes.com/books/best-s...)
“If Anyone Builds It, Everyone Dies” was recently added to the New Yorker's “The Best Books of the Year So Far” list!

newyorker.com/best-books-2...
October 31, 2025 at 2:30 AM
“If Anyone Builds It, Everyone Dies” coauthor Nate Soares recently chatted with Major Garrett on @cbsnews.com.
New book argues superhuman AI puts humans on path to extinction
Nate Soares, the co-author of "If Anyone Builds It, Everyone Dies," argues in his new book that if any company builds an artificial superintelligence, it would end in human extinction. He joins "The…
www.youtube.com
October 31, 2025 at 1:29 AM
Reposted by MIRI
@hankgreen.bsky.social rarely does interviews or 30+ min long videos.

His latest video, an hour+ long interview with Nate Soares about “If Anyone Builds It, Everyone Dies,” is a banger. My new favorite!

www.youtube.com/watch?v=5CKu...
October 30, 2025 at 8:52 PM
In the Bay Area? Come join Nate Soares, in conversation with Semafor Tech Editor Reed Albergotti, about Nate's NYT bestselling book “If Anyone Builds It, Everyone Dies.”

🗓️ Tuesday Oct 28 @ 7:30pm at Manny’s in SF.

Get your tickets:
Nate Soares - If Anyone Builds It, Everyone Dies
Nate Soares discusses the scramble to create superhuman AI that has us on a path to extinction. But it’s not too late to change course.
www.eventbrite.com
October 24, 2025 at 10:09 PM
Academy Award-winning director Kathryn Bigelow is reading “If Anyone Builds It, Everyone Dies.”

From an interview in The Guardian by Danny Leigh: www.theguardian.com/film/2025/oc...
October 18, 2025 at 3:12 PM
“The book uses parables, very well told, to argue that evolutionary processes are not predictable, at least not easily. [...] I came away far more concerned than I had been before opening the book.”

www.forbes.com/sites/billco...
October 18, 2025 at 1:18 AM
Reposted by MIRI
Today’s episode of The Ezra Klein Show.

The researcher Eliezer Yudkowsky argues that we should be very afraid of artificial intelligence’s existential risks.
www.nytimes.com/2025/10/15/o...

youtu.be/2Nn0-kAE5c0?...
How Afraid of the AI Apocalypse Should We Be? | The Ezra Klein Show
YouTube video by The Ezra Klein Show
youtu.be
October 15, 2025 at 1:40 PM
Reposted by MIRI
Michael talks with Nate Soares, co-author of "If Anyone Builds It, Everyone Dies", on the risks of advanced artificial intelligence. Soares argues that humanity must treat AI risk as seriously as pandemics or nuclear war.
Hear the #bookclub #podcast 🎧📖 https://loom.ly/w1hBbWM
October 15, 2025 at 8:30 PM
Reposted by MIRI
🎙️ w/ Nate Soares on his and E. Yudkowsky’s book *If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All.* @intelligence.org

Why mitigating existential AI risk should be a top global priority, the problem of pointing minds, a treaty to ban the race to superintelligence, and more.
EP 327 Nate Soares on Why Superhuman AI Would Kill Us All - The Jim Rutt Show
Jim talks with Nate Soares about his and Eliezer Yudkowsky's book If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All.
www.jimruttshow.com
October 16, 2025 at 1:29 PM
😮 Whoopi Goldberg recommends “If Anyone Builds It, Everyone Dies” on The View!
October 15, 2025 at 10:46 PM
Great event last week at @politicsprose.bsky.social (The Wharf) in DC, with “If Anyone Builds It, Everyone Dies” coauthor Nate Soares.

Many thanks to all those who attended, and to @jonatomic.bsky.social, Director of Global Risk at FAS, for the great conversation.
October 3, 2025 at 12:05 AM
Happening tonight!
🗓️ Next Friday Sept 26th in DC at @politicsprose.bsky.social (The Wharf)

Join us for a conversation between co-author Nate Soares and @jonatomic.bsky.social, Director of Global Risk at the Federation of American Scientists.

Audience Q&A, book signing, and more:
politics-prose.com/nate-soares
September 26, 2025 at 7:31 PM
#7 Combined Print & E-Book Nonfiction (www.nytimes.com/books/best-s...)

#8 Hardcover Nonfiction (www.nytimes.com/books/best-s...)
September 24, 2025 at 11:00 PM
Reposted by MIRI
This was a great event. Really enjoyed chatting with Joel and Ollie on the first panel.

Thanks @scientistsorg.bsky.social and @futureoflife.org for putting this event together.
Dear diary, we had a great time on the Hill last week with our friends at @futureoflife.org

We kicked off our AGI x Global Risk day with remarks from @repbillfoster.bsky.social, @reptedlieu.bsky.social, and John Bailey — setting the stage for a day of bold dialogue on the future of AGI 🌎
September 23, 2025 at 6:06 PM
Reposted by MIRI
I think my favorite interview Eliezer and Nate have done so far for the book has been for the Making Sense podcast with Sam Harris.

Unfortunately the full episode is for subscribers only.

Fortunately, as a subscriber, I can share the full thing 🙂
Sam Harris | #434 - Can We Survive AI?
Sam Harris speaks with Eliezer Yudkowsky and Nate Soares about their new book, If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI.
samharris.org
September 22, 2025 at 8:48 PM
“[...] everyone with an interest in the future has a duty to read what he and Soares have to say.”
September 22, 2025 at 1:24 PM
In the last couple of days “If Anyone Builds It, Everyone Dies” co-authors Eliezer Yudkowsky and Nate Soares have appeared live on @cnn.com, ABC News Live, and Bannon's War Room!

📺 Full segments below starting with @cnn.com:
next.frame.io/share/c0b240...
September 20, 2025 at 8:59 PM
Reposted by MIRI
"If Anyone Builds It, Everyone Dies". It's a shocking headline. How well does it hold up? Today I review.

peterwildeford.substack.com/p/if-we-buil...
If We Build AI Superintelligence, Do We All Die?
If you're not at least a little doomy about AI, you're not paying attention
peterwildeford.substack.com
September 18, 2025 at 1:57 PM
🗓️ Next Friday Sept 26th in DC at @politicsprose.bsky.social (The Wharf)

Join us for a conversation between co-author Nate Soares and @jonatomic.bsky.social, Director of Global Risk at the Federation of American Scientists.

Audience Q&A, book signing, and more:
politics-prose.com/nate-soares
September 18, 2025 at 2:42 PM
Reposted by MIRI
AI researcher Eliezer Yudkowsky warned superintelligent AI could threaten humanity by pursuing its own goals over human survival.
Forget woke chatbots — an AI researcher says the real danger is an AI that doesn't care if we live or die
www.businessinsider.com
September 16, 2025 at 10:41 AM
Reposted by MIRI
Eliezer Yudkowsky, one of the most fascinating people in A.I., is the author of a new book on why artificial intelligence will kill us all. On "Hard Fork," he made the case for why A.I. development should be shut down now, long before we reach superintelligence. nyti.ms/46kniYD
September 13, 2025 at 7:08 PM
Reposted by MIRI
AI has drives and behaviors that nobody asked for and nobody wanted—which may prove to be disastrous, Eliezer Yudkowsky and Nate Soares write.
AI Is Grown, Not Built
Nobody knows exactly what an AI will become. That’s very bad.
bit.ly
September 16, 2025 at 3:45 PM
“If Anyone Builds It, Everyone Dies” has hit the shelves!

New blurbs and media appearances, plus reading group support and ways to help with the book launch!

intelligence.org/2025/09/16/i...
September 16, 2025 at 10:00 PM
UK hardcover edition also looks sharp in person 😍

Just over a week until the UK release on Sept. 18th!
September 10, 2025 at 4:39 PM
🎧 Audiobook sneak peek from @hachetteaudio.bsky.social

From the book's intro: “We open many of the chapters with parables: stories that, we hope, will help convey some points more simply than otherwise.”

Here's a snippet of one of them from the beginning of Chapter 1.

(Extended clip in 🧵)
September 3, 2025 at 2:36 AM