MIRI
@intelligence.org
For over two decades, the Machine Intelligence Research Institute (MIRI) has worked to understand and prepare for the critical challenges that humanity will face as it transitions to a world with artificial superintelligence.
Pinned
#7 Combined Print & E-Book Nonfiction (www.nytimes.com/books/best-s...)

#8 Hardcover Nonfiction (www.nytimes.com/books/best-s...)
Final Update: From ~$450k earlier today, we’re now down to just over $250k left in unclaimed matching funds!

4 hours left to go, and by golly it looks like we’ve got a real shot at securing all the matching.

Thanks everyone! Happy New Year 🎉
Donations to MIRI before Jan 1 are high-leverage. We’ve got ~$1.6M in 1:1 matching from SFF, over half of which has yet to be claimed!

This is real counterfactual matching: whatever doesn’t get matched by the end of Dec 31, we don’t get. 🧵
MIRI's 2025 Fundraiser - Machine Intelligence Research Institute
MIRI is running its first fundraiser in six years, targeting $6M. The first $1.6M raised will be matched 1:1 via an SFF grant. Fundraiser ends at midnight on Dec 31, 2025. Support our efforts to impro...
intelligence.org
January 1, 2026 at 4:06 AM
With just under 7 hours to go, we’re now down below $300k of unclaimed matching funds!
January 1, 2026 at 1:20 AM
Update 2: We’re down to ~$450k left of unclaimed matching funds, with just over 12 hours to go!

Thanks to all those who stepped up in the last couple of days to close the gap by ~$500k. ❤️
Donations to MIRI before Jan 1 are high-leverage. We’ve got ~$1.6M in 1:1 matching from SFF, over half of which has yet to be claimed!

This is real counterfactual matching: whatever doesn’t get matched by the end of Dec 31, we don’t get. 🧵
MIRI's 2025 Fundraiser - Machine Intelligence Research Institute
MIRI is running its first fundraiser in six years, targeting $6M. The first $1.6M raised will be matched 1:1 via an SFF grant. Fundraiser ends at midnight on Dec 31, 2025. Support our efforts to impro...
intelligence.org
December 31, 2025 at 7:48 PM
PSA: It’s worth reaching out to old donors, because sometimes this happens 🙂
December 30, 2025 at 6:44 PM
Update: We’ve received over $250k since this was posted.

~$700k in matching funds remaining.
Donations to MIRI before Jan 1 are high-leverage. We’ve got ~$1.6M in 1:1 matching from SFF, over half of which has yet to be claimed!

This is real counterfactual matching: whatever doesn’t get matched by the end of Dec 31, we don’t get. 🧵
MIRI's 2025 Fundraiser - Machine Intelligence Research Institute
MIRI is running its first fundraiser in six years, targeting $6M. The first $1.6M raised will be matched 1:1 via an SFF grant. Fundraiser ends at midnight on Dec 31, 2025. Support our efforts to impro...
intelligence.org
December 30, 2025 at 6:41 PM
And of course, to everyone who’s already donated, including the >100 first-time MIRI donors who gave during the fundraiser, thank you!

If you’re looking for other ways to help, sharing this thread or quote-posting it with why you chose to support us would mean a lot.
December 29, 2025 at 10:55 PM
If you’d like to help us secure as much of the remaining matching funds as we can, we’d be grateful for your support.

(If you don’t see your preferred donation method, including cryptocurrencies, reach out to us at development@intelligence.org.)
Donate - Machine Intelligence Research Institute
Support MIRI’s research. Find out if your employer will match donations! Donate using ACH, PayPal, digital currency. MIRI is a 501(c)(3) nonprofit.
intelligence.org
December 29, 2025 at 10:55 PM
Why is this real counterfactual matching?

The funds come from a Survival and Flourishing Fund matching pledge—not a traditional grant.

You can learn more about SFF’s matching pledges here:
Matching Pledges | Survival and Flourishing Fund
survivalandflourishing.fund
December 29, 2025 at 10:55 PM
Donations to MIRI before Jan 1 are high-leverage. We’ve got ~$1.6M in 1:1 matching from SFF, over half of which has yet to be claimed!

This is real counterfactual matching: whatever doesn’t get matched by the end of Dec 31, we don’t get. 🧵
MIRI's 2025 Fundraiser - Machine Intelligence Research Institute
MIRI is running its first fundraiser in six years, targeting $6M. The first $1.6M raised will be matched 1:1 via an SFF grant. Fundraiser ends at midnight on Dec 31, 2025. Support our efforts to impro...
intelligence.org
December 29, 2025 at 10:55 PM
“If Anyone Builds It, Everyone Dies” was recently added to The New Yorker’s “The Best Books of the Year So Far” list!

newyorker.com/best-books-2...
October 31, 2025 at 2:30 AM
“If Anyone Builds It, Everyone Dies” coauthor Nate Soares recently chatted with Major Garrett on @cbsnews.com.
New book argues superhuman AI puts humans on path to extinction
Nate Soares, the co-author of "If Anyone Builds It, Everyone Dies," argues in his new book that if any company builds an artificial superintelligence, it would end in human extinction. He joins "The…
www.youtube.com
October 31, 2025 at 1:29 AM
Reposted by MIRI
@hankgreen.bsky.social rarely does interviews or 30+ minute videos.

His latest video, an hour+ long interview with Nate Soares about “If Anyone Builds It, Everyone Dies,” is a banger. My new favorite!

www.youtube.com/watch?v=5CKu...
October 30, 2025 at 8:52 PM
In the Bay Area? Come join Nate Soares, in conversation with Semafor Tech Editor Reed Albergotti, about Nate's NYT bestselling book “If Anyone Builds It, Everyone Dies.”

🗓️ Tuesday Oct 28 @ 7:30pm at Manny’s in SF.

Get your tickets:
Nate Soares - If Anyone Builds It, Everyone Dies
Nate Soares discusses the scramble to create superhuman AI that has us on a path to extinction. But it’s not too late to change course.
www.eventbrite.com
October 24, 2025 at 10:09 PM
Academy Award-winning director Kathryn Bigelow is reading “If Anyone Builds It, Everyone Dies.”

From an interview in The Guardian by Danny Leigh: www.theguardian.com/film/2025/oc...
October 18, 2025 at 3:12 PM
“The book uses parables, very well told, to argue that evolutionary processes are not predictable, at least not easily. [...] I came away far more concerned than I had been before opening the book.”

www.forbes.com/sites/billco...
October 18, 2025 at 1:18 AM
Reposted by MIRI
Today’s episode of The Ezra Klein Show.

The researcher Eliezer Yudkowsky argues that we should be very afraid of artificial intelligence’s existential risks.
www.nytimes.com/2025/10/15/o...

youtu.be/2Nn0-kAE5c0?...
How Afraid of the AI Apocalypse Should We Be? | The Ezra Klein Show
YouTube video by The Ezra Klein Show
youtu.be
October 15, 2025 at 1:40 PM
Reposted by MIRI
Michael talks with Nate Soares, co-author of "If Anyone Builds It, Everyone Dies", on the risks of advanced artificial intelligence. Soares argues that humanity must treat AI risk as seriously as pandemics or nuclear war.
Hear the #bookclub #podcast 🎧📖 https://loom.ly/w1hBbWM
October 15, 2025 at 8:30 PM
Reposted by MIRI
🎙️ w/ Nate Soares on his and E. Yudkowsky’s book *If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All.* @intelligence.org

Why mitigating existential AI risk should be a top global priority, the problem of pointing minds, a treaty to ban the race to superintelligence, and more.
EP 327 Nate Soares on Why Superhuman AI Would Kill Us All - The Jim Rutt Show
Jim talks with Nate Soares about his and Eliezer Yudkowsky’s book If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All.
www.jimruttshow.com
October 16, 2025 at 1:29 PM
About the book:
If Anyone Builds It, Everyone Dies
The race to superhuman AI risks extinction, but it's not too late to change course.
ifanyonebuildsit.com
October 15, 2025 at 10:46 PM
😮 Whoopi Goldberg recommends “If Anyone Builds It, Everyone Dies” on The View!
October 15, 2025 at 10:46 PM
Great event last week at @politicsprose.bsky.social (The Wharf) in DC, with “If Anyone Builds It, Everyone Dies” coauthor Nate Soares.

Many thanks to all those who attended, and to @jonatomic.bsky.social, Director of Global Risk at FAS, for the great conversation.
October 3, 2025 at 12:05 AM
Happening tonight!
🗓️ Next Friday Sept 26th in DC at @politicsprose.bsky.social (The Wharf)

Join us for a conversation between co-author Nate Soares and @jonatomic.bsky.social, Director of Global Risk at the Federation of American Scientists.

Audience Q&A, book signing, and more:
politics-prose.com/nate-soares
September 26, 2025 at 7:31 PM