athenaaktipis.bsky.social
@athenaaktipis.bsky.social
Professor at ASU, host/producer of Zombified Media & author of The Cheating Cell and A Field Guide to the Apocalypse.
I love that I was still able to sneak in some exercise and cross-country skiing while in Davos! The scenery was insanely beautiful.

#Davos #WEF #WEF2026 🎿
February 2, 2026 at 9:35 PM
Los Angeles friends and colleagues - we are doing a private showing of our in-development Surviving AI show in LA this weekend, hot off the Davos train! (Yes, we literally did the last Surviving AI show on a train at Davos, more on that later.)
Surviving AI · Luma
Surviving AI is a comedy show that blends humor with thoughtful exploration of how humans relate to artificial intelligence and how we might build a more…
luma.com
January 30, 2026 at 12:16 AM
AI is becoming social, but current evaluations miss a key dimension: real cooperation.

In my new Substack post I explain why developing benchmarks for AI cooperation and friendship is essential if we want systems that genuinely contribute to human flourishing.

substack.com/home/post/p-...
Why We Need Benchmarks For AI Cooperation And Friendship
How engagement-optimized systems exploit social cues, and why cooperation science must shape the next generation of AI
substack.com
January 27, 2026 at 10:35 PM
Excited to share that I’ll be in Davos this week for the World Economic Forum.
January 17, 2026 at 4:48 PM
From dodging cows on snowy backroads to proving that humans help each other in a crisis, my Reed Magazine feature dives into the science of cooperation and catastrophe - and why fun matters in wild times.
How to Thrive in Wild Times
According to cooperation theorist Athena Aktipis ’02, the one thing we really need? Each other.
www.reed.edu
January 12, 2026 at 9:04 PM
Reminder that abstract submissions for EvoCancer close tomorrow.

If you’re working at the intersection of cancer, evolution, and ecology, this is a great community to plug into.

Details here: www.evocancer.com
ISEEC
The International Society for Evolution, Ecology and Cancer
www.evocancer.com
January 9, 2026 at 6:51 PM
Reposted
As I did last year, I thought I'd start the new year by highlighting five writers I enjoy reading, along with a recent post from each that grabbed my attention.

Happy 2026!

www.futureofbeinghuman.com/p/five-voice...

@athenaaktipis.bsky.social @athenedonald.bsky.social @bnerlich.bsky.social
Five voices worth reading in 2026
As I did last year, I thought I'd highlight five writers I enjoy reading, along with a recent post from each that grabbed my attention.
www.futureofbeinghuman.com
January 1, 2026 at 3:08 PM
Since it costs a lot to win
And even more to lose
You and me bound to spend some time
Wondering what to choose

These lines from the Grateful Dead’s “Deal” have been on my mind as the new year begins - not as a metaphor about gambling or fate, but as a statement about choice under uncertainty.
January 2, 2026 at 6:19 PM
For most of our evolutionary history, alcohol looked nothing like today’s Bud Lights, Margaritas, or White Claws.
December 30, 2025 at 5:19 PM
I’m delighted to share that I’ll be hosting at Human Tech Week in SF this spring.
December 29, 2025 at 6:32 PM
Join me Tuesday with authors J. Arvid Agren and Manus M. Patten for the launch of their book The Paradox of the Organism: Adaptation and Internal Conflict.
December 14, 2025 at 8:30 PM
The AI safety conversation focuses on alignment: making powerful systems do what we want. But what if that’s the wrong question? What if the issue isn’t controlling one system, but cultivating healthy ecosystems?
December 11, 2025 at 10:01 PM
You’re sitting down to eat some greasy pizza - what sounds good to drink with it? If you’re thinking about reaching for a beer, you’re not alone. Beer and pizza go together like… well, beer and pizza.
November 30, 2025 at 7:16 PM
Right now the AI ecosystem looks like fast colonization of a new habitat: resources suddenly opened up, competition intense, survival uncertain, and the whole landscape reshaped faster than anyone can track.
November 26, 2025 at 8:34 PM
The AI safety conversation focuses on alignment: making powerful systems do what we want. But what if that’s the wrong question? What if the issue isn’t controlling one system, but cultivating healthy ecosystems?

substack.com/home/post/p-...
November 23, 2025 at 9:31 PM
The full Human Energy Global Salon Series “Alignment for a Major Evolutionary Transition: The Future of Humanity and AI” is now on YT:

www.youtube.com/watch?v=5Kmv...

We had an incredible discussion and hope you weigh in as well! 👇
Alignment for a Major Evolutionary Transition: The Future of Humanity and AI
YouTube video by Human Energy
www.youtube.com
November 20, 2025 at 7:14 PM
AI systems - especially self-training systems - are built on recursion. They update themselves based on feedback. They generate variation. They propagate internal patterns across training cycles. In other words, they provide everything evolution needs to get going.
November 18, 2025 at 5:55 PM
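A minimal sketch of the point in the post above, assuming a toy self-updating loop (every function, name, and parameter here is hypothetical, not any real training pipeline): a system that copies its current parameters, perturbs them, and keeps whatever scores best on a feedback signal already has variation, heredity, and selection.

```python
# Hypothetical toy example: a self-training loop with the three
# evolutionary ingredients named above.
import random

def mutate(params):
    # Variation: each cycle perturbs the current internal patterns.
    return [p + random.gauss(0, 0.1) for p in params]

def feedback(params):
    # Stand-in for whatever feedback signal the system optimizes;
    # here, a toy score that is highest near zero.
    return -sum(p * p for p in params)

def self_training_loop(params, cycles=100, offspring=20):
    for _ in range(cycles):
        # Heredity: candidates inherit (copy) the current parameters,
        # so internal patterns propagate across training cycles.
        candidates = [mutate(params) for _ in range(offspring)]
        # Selection: the candidate scoring best on the feedback signal
        # seeds the next cycle.
        params = max(candidates, key=feedback)
    return params

if __name__ == "__main__":
    print(self_training_loop([random.uniform(-1, 1) for _ in range(5)]))
```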
I talked to Human Energy about the topics I’ll be discussing this evening during the Global Salon Series panel.

Check out the link to join the conversation, which starts at 7 pm PST!

www.youtube.com/live/73om6Pw...
November 11, 2025 at 11:33 PM
I'm thrilled to share that I'll be joining the Global Salon Series tonight in San Francisco for a panel discussion titled “Alignment for a Major Evolutionary Transition: The Future of Humanity and AI.”

Livestream starts at 7 pm PST: youtube.com/live/73om6Pw...
Human Energy Global Salon – San Francisco | Alignment for a Major Evolutionary Transition
Join us live on the Human Energy YouTube channel as part of our Global Salon Series: the San Francisco Salon — “Alignment for a Major Evolutionary Transition.” This immersive evening convenes leading...
youtube.com
November 11, 2025 at 4:56 PM
I’ve spent my career studying cooperation and looking for general principles that apply across systems.

Lately, I’ve been turning my attention to AI, thinking about the ways that cooperation science can help us navigate what is probably the most critical challenge of our time.
November 11, 2025 at 4:51 PM
I started a Substack called Not For Peer Review. I’ve filled my notebooks with ideas, many of which have inspired my articles and books. But there are so many ideas I haven’t published - and, if I’m being honest with myself, ones I know I will never publish through the traditional peer review system.
November 11, 2025 at 2:59 AM
I’m honored to share that I’ll be joining the Global Salon Series this Tuesday in San Francisco for a panel discussion titled “Alignment for a Major Evolutionary Transition: The Future of Humanity and AI.”
November 9, 2025 at 6:22 PM
💞 Friendship isn’t about keeping score. It’s about helping when it counts.

In our new article for The Conversation, @jessicadayers.bsky.social & I explore why true friendship works like risk-pooling, not exchange.

Read here 👉 doi.org/10.64628/AAI...

#Friendship #Psychology
Friendships aren’t just about keeping score – new psychology research looks at why we help our friends when they need it
Friendship isn’t a tit-for-tat balance sheet, but that’s how researchers have traditionally defined it. New studies are refining the model to be less about transactions and truer to real life.
doi.org
October 11, 2025 at 3:46 PM
🧠🔥 Join us at #ZAMM2025!
Got a theory too wild for peer review? A study too ambitious to fund?
Pitch it at Extreme Experiments & Unhinged Hypotheses — 5-min lightning talks where imagination rules.
Submit here 👉 docs.google.com/forms/d/e/1F...
October 8, 2025 at 4:47 PM
🚨 Call for Wild Ideas 🚨
Got a theory too bold for a grant? A study so extreme it borders on sci-fi? Pitch it at Extreme Experiments & Unhinged Hypotheses—5-min lightning talks at #ZAMM2025.
Submit here 👉 docs.google.com/forms/d/e/1F...
More info: zombiemed.org
Extreme Experiments & Unhinged Hypotheses
Have you ever dreamed of running an experiment without limits? Do you have a theory so wild it breaks the boundaries of conventional thinking? Bring it to Extreme Experiments & Unhinged Hypotheses—a l...
docs.google.com
September 28, 2025 at 7:38 PM