Jin Ke
jinke.bsky.social
PhD student @YalePsychology studying computational cognitive neuroscience.
Previously @UChicago and @Peking University
https://jinke828.github.io
Neuromarkers of these thoughts further generalized to HCP data (N = 908), where decoded thought patterns predicted positive versus negative trait-level individual-difference measures. This suggests that links between rsFC and behavior may partly reflect differences in ongoing thoughts. (7/10)
August 20, 2025 at 1:53 PM
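As an illustrative sketch of this generalization test (not the authors' actual pipeline), one could apply pre-trained decoder weights to rsFC from a new sample and relate the decoded thought scores to trait measures. All arrays below are synthetic placeholders for real data, and the weights and trait link are assumptions for demonstration only.

```python
# Sketch of a generalization test: apply decoder weights learned on one
# dataset to rsFC from a new sample, then relate decoded thought scores
# to trait measures. All arrays are synthetic placeholders for real data.
import numpy as np

rng = np.random.default_rng(2)
n_new, n_edges = 908, 1000                      # e.g., an HCP-sized sample

w = rng.standard_normal(n_edges)                # fixed, pre-trained decoder weights
fc_new = rng.standard_normal((n_new, n_edges))  # vectorized rsFC, new dataset

decoded = fc_new @ w                            # decoded thought-pattern score
trait = 0.3 * decoded + rng.standard_normal(n_new)  # synthetic trait measure

# Does the decoded score track the trait-level measure?
r = np.corrcoef(decoded, trait)[0, 1]
print(f"decoded score vs. trait: r = {r:.2f}")
```

Here the trait is constructed to depend on the decoded score, so the correlation is high by design; with real data this association is the empirical question.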
Moreover, the model predicting whether people are thinking in the form of images distinguished an aphantasic individual, who lacks visual imagery, from their otherwise identical twin. Data from academic.oup.com/cercor/artic.... (6/10)
August 20, 2025 at 1:53 PM
How are these thoughts related to resting-state functional connectivity (rsFC) patterns? We found that similarity in ongoing thoughts tracks similarity in rsFC patterns within and across individuals, and that both thought ratings and topics could be reliably decoded from rsFC. (4/10)
August 20, 2025 at 1:53 PM
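A minimal sketch of what decoding thought ratings from rsFC could look like, assuming vectorized FC features and cross-validated ridge regression (an assumption for illustration, not the paper's exact method); the FC matrix and ratings below are synthetic stand-ins for real fMRI data:

```python
# Sketch of decoding self-reported thought ratings from resting-state
# functional connectivity (rsFC). Illustrative only: features and ratings
# are synthetic stand-ins for real data.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)

n_subjects, n_rois = 60, 100              # 60 individuals, as in the task
n_edges = n_rois * (n_rois - 1) // 2      # unique FC edges (upper triangle)

# Vectorized rsFC: one row of edge weights per subject.
fc = rng.standard_normal((n_subjects, n_edges))
# A self-reported thought rating (e.g., imagery vividness) per subject.
ratings = fc[:, :50].mean(axis=1) + 0.5 * rng.standard_normal(n_subjects)

# Cross-validated decoding: fit ridge regression, predict held-out folds.
preds = np.empty(n_subjects)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(fc):
    model = RidgeCV(alphas=np.logspace(-3, 3, 13)).fit(fc[train], ratings[train])
    preds[test] = model.predict(fc[test])

# Decoding accuracy: correlation between predicted and observed ratings.
r = np.corrcoef(preds, ratings)[0, 1]
print(f"prediction-observation correlation: r = {r:.2f}")
```

Vectorizing the upper triangle of the FC matrix avoids duplicating symmetric edges, and cross-validation keeps the accuracy estimate out-of-sample.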
We observed remarkable idiosyncrasy in ongoing thoughts across individuals and over time, both in self-reported ratings and in the content and topics of thoughts. (3/10)
August 20, 2025 at 1:53 PM
In our “annotated rest” task, 60 individuals rested and then verbally described and rated their ongoing thoughts after each 30-second rest period. (2/10)
August 20, 2025 at 1:53 PM
Curious how the human brain updates social impressions in a naturalistic setting? We scanned participants watching This Is Us and found that sudden neural pattern shifts at insight moments of comprehension reflect impression updating. Come chat Friday 4:15–5:15pm at #SANS2025, Poster P2-G-69.
April 24, 2025 at 9:00 PM
In contrast, using the same computational modeling approach, we were unable to find a generalizable neural representation of valence in functional connectivity. Null results are inherently difficult to interpret, but we offer several possible explanations in our paper. (6/9)
April 17, 2025 at 3:41 PM
The network generalized to two additional, novel movies, where model-predicted arousal time courses corresponded with the plot of each movie, suggesting a methodological tool for researchers who wish to obtain continuous measures of arousal without having to collect additional human ratings. (5/9)
April 17, 2025 at 3:41 PM
This generalizable arousal network is encoded in interactions among multiple large-scale functional networks, including the default mode, dorsal attention, ventral attention, and frontoparietal networks. (4/9)
April 17, 2025 at 3:41 PM
We observed robust out-of-sample generalizability of the arousal models across movie datasets that were distinct in low-level features, characters, narratives, and genre, suggesting a situation-general neural representation of arousal. (3/9)
April 17, 2025 at 3:41 PM
One possibility is that there are generalizable neural patterns associated with valence or arousal across contexts and individuals. We utilized open movie-watching fMRI datasets and built predictive models of moment-to-moment valence and arousal from functional correlations in brain activity. (2/9)
April 17, 2025 at 3:41 PM
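The moment-to-moment prediction step could be sketched as follows, assuming sliding-window dynamic FC and ridge regression; the window length, parcellation, and train/test split here are hypothetical choices on synthetic data, not the paper's pipeline:

```python
# Sketch: predict a continuous arousal time course from sliding-window
# dynamic functional connectivity. Synthetic data; window length, model,
# and parcellation are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_trs, n_rois, win = 600, 50, 30           # movie TRs, parcels, window length

ts = rng.standard_normal((n_trs, n_rois))  # parcel-wise BOLD time series
# Smoothed random signal standing in for continuous arousal ratings.
arousal = np.convolve(rng.standard_normal(n_trs), np.ones(20) / 20, mode="same")

iu = np.triu_indices(n_rois, k=1)          # unique edges of the FC matrix

def dynamic_fc(ts, win):
    """Vectorized FC (upper triangle) in each sliding window."""
    return np.asarray(
        [np.corrcoef(ts[t:t + win].T)[iu] for t in range(len(ts) - win)]
    )

X = dynamic_fc(ts, win)
y = arousal[win // 2 : win // 2 + len(X)]  # align labels to window centers

# Train on the first half of the time course, test on the held-out second half.
half = len(X) // 2
model = Ridge(alpha=10.0).fit(X[:half], y[:half])
r = np.corrcoef(model.predict(X[half:]), y[half:])[0, 1]
print(f"held-out correlation: r = {r:.2f}")
```

With random labels as here, the held-out correlation hovers near zero; the paper's out-of-sample tests across distinct movie datasets are the stronger version of this split.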
Finally, we showed that arousal models built on multivariate activation patterns underperformed models built on dynamic functional connectivity and did not generalize across datasets. (8/10)
November 17, 2023 at 9:02 PM
In contrast, we were unable to find a generalizable neural representation of valence in functional connectivity, multivariate activation pattern or univariate activity. (7/10)
November 17, 2023 at 9:01 PM
In the two novel movies, model-predicted arousal time courses corresponded with the plot of each movie, suggesting a methodological tool for researchers who wish to obtain continuous measures of arousal without having to collect additional human ratings. (6/10)
November 17, 2023 at 9:01 PM
The generalizable arousal network is encoded in interactions among multiple large-scale functional networks, including the DMN, DAN, VAN, and FPN. The arousal network also generalized to two additional movie datasets, North by Northwest and Merlin. (5/10)
November 17, 2023 at 9:00 PM
We observed robust out-of-sample generalizability of the arousal models across movie datasets that were distinct in low-level features, characters, narratives, and genre, suggesting a situation-general neural representation of arousal. (4/10)
November 17, 2023 at 9:00 PM
One possibility is that affective experiences are represented along the dimensions of valence and arousal. We utilized open movie-watching fMRI datasets and built predictive models of moment-to-moment valence and arousal from functional correlations in brain activity. (3/10)
November 17, 2023 at 8:59 PM