Jin Ke
@jinke.bsky.social
PhD student @YalePsychology studying computational cognitive neuroscience.
Previously @UChicago and @Peking University
https://jinke828.github.io
Reposted by Jin Ke
Out now in @nathumbehav.nature.com! We applied graph-theoretic analyses to fMRI data from participants watching movies or listening to stories. Integration across large-scale functional networks mediates arousal-dependent enhancement of narrative memories. Open access link: rdcu.be/eKKAw
October 13, 2025 at 3:55 PM
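One standard graph-theoretic measure of integration across functional networks is the participation coefficient; the post does not say which metric the paper uses, so this is a generic sketch rather than the authors' pipeline:

```python
import numpy as np

def participation_coefficient(W, labels):
    """Participation coefficient for each node of a weighted graph.

    W: (n, n) symmetric connectivity matrix with nonnegative weights.
    labels: (n,) array assigning each node to a functional network.
    A node whose edges spread evenly across networks scores near 1;
    a node connected only within its own network scores 0.
    """
    W = np.asarray(W, dtype=float)
    strength = W.sum(axis=1)            # total edge weight per node
    pc = np.ones(len(W))
    for net in np.unique(labels):
        within = W[:, labels == net].sum(axis=1)
        pc -= (within / strength) ** 2  # PC = 1 - sum_m (k_im / k_i)^2
    return pc

# Toy example: 4 nodes, two networks (labels are made up for illustration).
W = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [1, 0, 1, 0]], dtype=float)
labels = np.array([0, 0, 1, 1])
pc = participation_coefficient(W, labels)
```

Node 1 connects only within its own network, so its coefficient is 0; node 0 touches both networks and scores higher.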
Reposted by Jin Ke
How does the brain🧠 make causal inferences and use memories to understand narratives🎬?

We built an RNN🤖 with key-value episodic memory that learns causal relationships between events and retrieves memories like humans do!

Preprint www.biorxiv.org/content/10.1...

w/ @qlu.bsky.social, Tan Nguyen &👇
A neural network with episodic memory learns causal relationships between narrative events
September 5, 2025 at 12:26 PM
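The key-value episodic memory mechanism mentioned above can be illustrated with a minimal sketch: store (key, value) pairs for past events and retrieve by softmax-weighted dot-product similarity. This is a generic mechanism sketch, not the authors' RNN:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class EpisodicMemory:
    """Minimal key-value episodic memory: write (key, value) pairs for
    past events, read back a similarity-weighted blend of stored values."""
    def __init__(self):
        self.keys, self.values = [], []

    def write(self, key, value):
        self.keys.append(np.asarray(key, dtype=float))
        self.values.append(np.asarray(value, dtype=float))

    def read(self, query):
        K = np.stack(self.keys)    # (n_events, d)
        V = np.stack(self.values)  # (n_events, d)
        attn = softmax(K @ np.asarray(query, dtype=float))
        return attn @ V            # attention-weighted recall

# Toy usage: two stored events; a query near the first key recalls
# (mostly) the first event's value.
mem = EpisodicMemory()
mem.write([1.0, 0.0], [5.0, 0.0])
mem.write([0.0, 1.0], [0.0, 7.0])
recalled = mem.read([10.0, 0.0])
```

In the full model, keys and values would be learned event representations; here they are hand-picked vectors purely for illustration.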
I'm also deeply grateful to those who offered insightful feedback along the way: the BLRB community at UChicago, the joint CogNeuro meeting at Yale, @esfinn.bsky.social's lab, and many others. I'd like to thank Emma Megla and Wilma Bainbridge for sharing the aphantasia data! (10/10)
August 20, 2025 at 1:53 PM
HUGE thank you to everyone whose incredible efforts over the past three years made this project possible! @tchamberlain.bsky.social @hayoungsong.bsky.social @annacorriveau.bsky.social @zz112.bsky.social, Taysha Martinez, Laura Sams, Marvin Chun, @ycleong.bsky.social @monicarosenb.bsky.social (9/10)
Together, we found that ongoing thoughts at rest are reflected in brain dynamics, and that these network patterns predict everyday cognition and experiences.

Our work underscores the crucial role of subjective in-scanner experiences in understanding functional brain organization and behavior. (8/10)
Neuromarkers of these thoughts further generalized to HCP data (N=908), where decoded thought patterns predicted scores on positive vs. negative trait-level individual-differences measures. This suggests that links between rsFC and behavior might in part reflect differences in ongoing thoughts. (7/10)
Moreover, the model predicting whether people are thinking in the form of images distinguished an aphantasic individual—who lacks visual imagery—from their otherwise identical twin. Data from academic.oup.com/cercor/artic.... (6/10)
Thought models generalized beyond self-report, predicting non-introspective markers such as pupil size, linguistic sentiment of speech, and the strength of a sustained attention network (Rosenberg et al., 2016, 2020). (5/10)
How are these thoughts related to resting-state functional connectivity (rsFC) patterns? We found that similarity in ongoing thoughts tracks similarity in rsFC patterns within and across individuals, and that both thought ratings and topics could be reliably decoded from rsFC. (4/10)
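Decoding a continuous rating from rsFC patterns can be sketched with simulated data and closed-form ridge regression; the data, regularization choice, and in-sample evaluation here are my assumptions for illustration, not the study's actual cross-validated pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def vectorize_fc(fc):
    """Upper triangle of a symmetric FC matrix -> 1-D feature vector."""
    iu = np.triu_indices(fc.shape[0], k=1)
    return fc[iu]

# Simulated data: 100 scans, a 6-node FC matrix per scan, one rating each.
n_scan, n_node = 100, 6
X = np.stack([vectorize_fc(np.corrcoef(rng.normal(size=(n_node, 50))))
              for _ in range(n_scan)])
w_true = rng.normal(size=X.shape[1])
y = X @ w_true + 0.1 * rng.normal(size=n_scan)  # rating = FC pattern + noise

# Closed-form ridge regression: w = (X^T X + a I)^{-1} X^T y
a = 0.01
w = np.linalg.solve(X.T @ X + a * np.eye(X.shape[1]), X.T @ y)
r = np.corrcoef(X @ w, y)[0, 1]  # in-sample fit; real decoding needs CV
```

A real analysis would evaluate with held-out scans (or held-out participants) rather than the in-sample correlation computed here.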
We observed a remarkable idiosyncrasy in ongoing thoughts between individuals and over time, both in terms of self-reported ratings as well as the content and topics of thoughts. (3/10)
In our “annotated rest” task, 60 individuals rested and then verbally described and rated their ongoing thoughts after each 30-second rest period. (2/10)
New preprint! 🧠

Our mind wanders at rest. By periodically probing ongoing thoughts during resting-state fMRI, we show these thoughts are reflected in brain network dynamics and contribute to pervasive links between functional brain architecture and everyday behavior (1/10).
doi.org/10.1101/2025...
Ongoing thoughts at rest reflect functional brain organization and behavior
Reposted by Jin Ke
Preprint⭐
Our attention changes over time and differs across contexts—which is reflected in the brain🧠 Fitting a dynamical systems model to fMRI data, we find that the geometry of neural dynamics along the attractor landscape reflects such changes in attention!
www.biorxiv.org/content/10.1...
Geometry of neural dynamics along the cortical attractor landscape reflects changes in attention
August 12, 2025 at 7:29 PM
Reposted by Jin Ke
I’m thrilled to announce that I will start as a presidential assistant professor in Neuroscience at the City U of Hong Kong in Jan 2026!
I have RA, PhD, and postdoc positions available! Come work with me on neural network models + experiments on human memory!
RT appreciated!
(1/5)
May 8, 2025 at 1:16 AM
Reposted by Jin Ke
Feeling fortunate that #SANS2025 was in Chicago, and so much of the lab was able to be part of the meeting! It's crazy how much the lab has grown over the past 3y10m, and I'm so proud of the work we are doing together! Happy that we could host the lab, alums (and surprise guests)!
#CASNL@SANS
April 27, 2025 at 5:02 AM
To learn more about this dataset and the neural dynamics of narrative insight, check out our recent work (preprint below) led by the amazing @hayoungsong.bsky.social and chat with her on Saturday 1:50 PM - 3:00 PM! Poster ID: P3-B-30.

www.biorxiv.org/content/10.1...
April 24, 2025 at 9:00 PM
Curious how the human brain updates social impressions in a naturalistic setting? We scanned participants watching This Is Us, and found that sudden neural pattern shifts at insight moments of comprehension reflect impression updating. Come and chat Friday 4:15–5:15pm at #SANS2025, Poster P2-G-69.
Reposted by Jin Ke
New preprint! Excited to share our latest work “Accelerated learning of a noninvasive human brain-computer interface via manifold geometry” ft. outstanding former undergraduate Chandra Fincke, @glajoie.bsky.social, @krishnaswamylab.bsky.social, and @wutsaiyale.bsky.social's Nick Turk-Browne 1/8
Accelerated learning of a noninvasive human brain-computer interface via manifold geometry
April 3, 2025 at 11:04 PM
Many thanks to Janice Chen, @lukejchang.bsky.social and @asieh.bsky.social for open sourcing the movie datasets! Also wanted to give a shoutout to UChicago MRIRC for helping us collect the North by Northwest data. (9/9)
April 17, 2025 at 3:41 PM
We have made our model and analysis scripts publicly available to facilitate its use by other researchers in decoding moment-to-moment emotional arousal in novel datasets, providing a new tool to probe affective experience using fMRI. (8/9)

github.com/jinke828/Aff...
In conclusion, our findings reveal a generalizable representation of emotional arousal embedded in patterns of dynamic functional connectivity, suggesting a common underlying neural signature of emotional arousal across individuals and situational contexts. (7/9)
In contrast, using the same computational modeling approach, we were unable to find a generalizable neural representation of valence in functional connectivity. Null results are inherently difficult to interpret, but we offer several possible explanations in our paper. (6/9)
The network generalized to two additional, novel movies, where model-predicted arousal time courses tracked the plot of each movie. This offers researchers a way to obtain continuous measures of arousal without collecting additional human ratings. (5/9)
This generalizable arousal network is encoded in interactions among multiple large-scale functional networks, including the default mode, dorsal attention, ventral attention, and frontoparietal networks. (4/9)
We observed robust out-of-sample generalizability of the arousal models across movie datasets that were distinct in low-level features, characters, narratives, and genre, suggesting a situation-general neural representation of arousal. (3/9)