Yasaman Bahri
yasamanbb.bsky.social
Research Scientist @ Google DeepMind. AI + physics. Prev Ph.D. @ UC Berkeley.

https://sites.google.com/view/yasamanbahri/home/
In our new preprint, we explain how some salient features of representational geometry in language modeling originate from a single principle: translation symmetry in the statistics of data.

arxiv.org/abs/2602.150...

With Dhruva Karkada, Daniel Korchinski, Andres Nava, & Matthieu Wyart.
Symmetry in language statistics shapes the geometry of model representations
Although learned representations underlie neural networks' success, their fundamental properties remain poorly understood. A striking example is the emergence of simple geometric structures in LLM rep...
arxiv.org
February 19, 2026 at 4:20 AM
Reposted by Yasaman Bahri
Dhruva Karkada, Daniel J. Korchinski, Andres Nava, Matthieu Wyart, Yasaman Bahri: Symmetry in language statistics shapes the geometry of model representations https://arxiv.org/abs/2602.15029 https://arxiv.org/pdf/2602.15029 https://arxiv.org/html/2602.15029
February 17, 2026 at 6:35 AM
Reposted by Yasaman Bahri
How do diverse context structures reshape representations in LLMs?
In our new work, we explore this via representational straightening. We find that LLMs are like a Swiss Army knife: they select different computational mechanisms, reflected in different representational structures. 1/
February 4, 2026 at 2:54 AM
Reposted by Yasaman Bahri
Why isn’t modern AI built around principles from cognitive science or neuroscience? I'm starting a Substack (infinitefaculty.substack.com/p/why-isnt-m...) by writing down my thoughts on that question, as part of a first series of posts giving my current views on the relation between these fields. 1/3
Why isn’t modern AI built around principles from cognitive science?
First post in a series on cognitive science and AI
infinitefaculty.substack.com
December 16, 2025 at 3:40 PM
I'll be missing NeurIPS this year, but we have two conference papers on the dynamics of learning and the structure of data in language modeling, a new direction I'm excited about: arxiv.org/abs/2502.09863 and arxiv.org/abs/2505.18651.
December 4, 2025 at 7:01 PM
Reposted by Yasaman Bahri
Very excited to lead this new @simonsfoundation.org collaboration on the physics of learning and neural computation to develop powerful tools from physics, math, CS, stats, neuro and more to elucidate the scientific principles underlying AI. See our website for more: www.physicsoflearning.org
August 18, 2025 at 5:48 PM
I'm looking forward to giving a talk tomorrow morning at the ICML workshop on High-Dimensional Learning Dynamics (HiDL) sites.google.com/view/hidimle.... Come by at 9 am!
Workshop on High-dimensional Learning Dynamics
18 July, ICML 2025 Vancouver, BC, Canada
sites.google.com
July 18, 2025 at 5:25 AM
Excited to be at the APS March Meeting this year! @apsphysics.bsky.social

I'll be giving a talk in the Tuesday afternoon session MAR-J58, Physics of Learning & Adaptation I.
March 18, 2025 at 7:11 PM
Reposted by Yasaman Bahri
A wide range of insightful and inspiring talks today at #ML4PS @neuripsconf.bsky.social , including @yasamanbb.bsky.social , Thea Aarrestad, and @annalenakofler.bsky.social
December 16, 2024 at 12:25 AM