Xianhui He
@xianhuihe.bsky.social
何贤辉 Oxford DPhil student at the Staresina lab, crazy about brains🧠 and volleyball🏐
Thanks to my supervisor @bstaresina.bsky.social and my coauthors @philippbuchel.bsky.social @simonfsoubeyrand.bsky.social Janina Klingspohr @mskehl.bsky.social for their kind help! For more details, please check our preprint! www.biorxiv.org/content/10.1... 9/9
June 16, 2025 at 9:33 AM
In sum: Our research shows that sequence learning reshapes our neural representations to be more predictive. And sleep, especially deep sleep, is crucial for transforming these representations to be more abstract. This process helps us update our internal world model based on external experience. 8/9
FINDING 3: So, what's the driving force behind this transformation? **Sleep**! Specifically, deep sleep (slow-wave sleep). We found that participants who got more slow-wave sleep after learning had stronger and more abstract successor representations. 🧠💤 7/9
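To illustrate the kind of across-participant analysis behind a finding like this, here is a minimal sketch in Python (made-up data and variable names, not our actual pipeline): correlate each participant's slow-wave-sleep time with their successor-representation strength.

```python
import numpy as np
from scipy.stats import spearmanr

# Illustrative across-participant analysis (toy data, not the paper's).
# Question: do participants with more slow-wave sleep (SWS) show stronger
# successor-representation (SR) evidence after the nap?

rng = np.random.default_rng(2)
n_subjects = 30

sws_minutes = rng.uniform(10, 60, n_subjects)  # hypothetical SWS duration per subject
sr_strength = 0.02 * sws_minutes + rng.normal(0, 0.3, n_subjects)  # toy SR scores

# Rank correlation is robust to outliers and monotonic non-linearity
rho, p = spearmanr(sws_minutes, sr_strength)
print(f"SWS vs. SR strength: rho={rho:.2f}, p={p:.3f}")
```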
FINDING 2: Even more interestingly, using representational similarity analysis (RSA) with a deep neural network, we found that the successor representation wasn't just a faint copy of the original image. After learning, it became more abstract and "high-level," shifting from simple visual features to the core concept of the image. 6/9
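For readers curious how RSA works, here is a toy sketch (illustrative only; the data, layer choice, and distance metric are assumptions, not our exact pipeline): build a representational dissimilarity matrix (RDM) from neural patterns and one from DNN activations, then correlate the two.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Illustrative RSA sketch (toy example, not the paper's pipeline).
# Idea: if neural patterns and a DNN layer represent the images similarly,
# their representational dissimilarity matrices (RDMs) should correlate.

rng = np.random.default_rng(0)
n_images, n_channels, n_units = 10, 64, 512

eeg_patterns = rng.standard_normal((n_images, n_channels))  # one pattern per image
dnn_layer = rng.standard_normal((n_images, n_units))        # e.g., a late DNN layer

# Condensed RDMs: pairwise correlation distance between image-specific patterns
rdm_eeg = pdist(eeg_patterns, metric="correlation")
rdm_dnn = pdist(dnn_layer, metric="correlation")

# Spearman correlation between RDMs = RSA score for this layer
rho, p = spearmanr(rdm_eeg, rdm_dnn)
print(f"RSA score: rho={rho:.2f}, p={p:.3f}")
# Comparing scores across early vs. late DNN layers indicates whether the
# neural code is more visual (early layers) or more conceptual (late layers).
```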
FINDING 1: It worked! Even when the sequence was no longer relevant to the task at hand, whenever participants saw image A we could decode information about the successor image B from their brain activity. This confirms the existence of successor representations. 5/9
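A minimal sketch of the decoding logic (toy data and a generic classifier, not our actual analysis): train a classifier on image-evoked patterns, then test how much evidence for successor B appears on trials where A is shown.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy sketch of the decoding logic (all data are random and hypothetical).
# Train a classifier to recognize each image's EEG pattern, then ask whether
# trials showing image A carry evidence for its learned successor, image B.

rng = np.random.default_rng(1)
n_trials, n_features = 200, 64

# Hypothetical localizer data: EEG features + which image was on screen (0=A, 1=B, 2=C)
X_train = rng.standard_normal((n_trials, n_features))
y_train = rng.integers(0, 3, n_trials)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Test trials where image A was shown after sequence learning
X_test_A = rng.standard_normal((50, n_features))
proba = clf.predict_proba(X_test_A)

# Successor evidence: classifier probability assigned to B on A trials.
# Above-chance values would indicate a successor representation of B.
b_idx = list(clf.classes_).index(1)  # column for image B
print("mean P(B | A on screen):", proba[:, b_idx].mean())
```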
To find out, we designed an experiment: participants first learned image sequences. We then recorded their brain activity using high-density EEG, including throughout a 2-hr nap, to see whether their brains would spontaneously activate a representation of the next image. 4/9
Our study focused on three key questions:

1. After learning a sequence (e.g., A→B→C), does our brain, upon seeing A, automatically anticipate its successor (B)?

2. Is this prediction based on concrete visual details or more abstract concepts?

3. What role does sleep play in this process? 3/9
There's a cool theory behind this called "Successor Representation." It suggests our brain doesn't just process the "now" but also maintains a "predictive map," anticipating what comes next based on past experiences. We wanted to understand how this map is drawn and updated. 2/9
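For the mathematically inclined: under a Markov-chain assumption, the successor representation has a closed form, M = Σₜ γᵗ Tᵗ = (I − γT)⁻¹, where T is the transition matrix and γ the discount factor. A minimal Python sketch of this standard textbook construction (not the paper's code):

```python
import numpy as np

# Minimal successor-representation (SR) sketch (illustrative, not the paper's code).
# For a Markov chain with transition matrix T and discount gamma, the SR is
#   M = sum_t gamma^t T^t = (I - gamma * T)^{-1},
# so M[s, s'] is the expected discounted number of future visits to s' from s.

def successor_representation(T: np.ndarray, gamma: float = 0.9) -> np.ndarray:
    n = T.shape[0]
    return np.linalg.inv(np.eye(n) - gamma * T)

# Deterministic sequence A -> B -> C -> A (hypothetical 3-state example)
T = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

M = successor_representation(T, gamma=0.9)
print(M.round(2))  # among future states, row A weights its successor B more than the farther C
```

Each row of M is exactly the "predictive map" for one state: it tells you which states to expect next, weighted by how soon they tend to arrive.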