Christoph Strauch
@cstrauch.bsky.social
Assistant professor @utrechtuniversity.bsky.social studying spatial attention, eye-movements, pupillometry, and more. Co-PI @attentionlab.bsky.social
Reposted by Christoph Strauch
Most of you know @suryagayet.bsky.social as a successful visual-attention researcher. But he also had an active #music career as a #rap artist. And like Jay-Z before him, he has briefly come out of retirement with a new album. Check it out—it's very good! 🎤🎶 open.spotify.com/album/7HrnAB...
Drijfzand
open.spotify.com
November 12, 2025 at 8:25 AM
Very proud of @dkoevoet.bsky.social's latest piece now accepted in Journal of Neuroscience, together with student Vicky Voet, Stefan van der @stigchel.bsky.social (all @attentionlab.bsky.social Utrecht) and Henry Jones & Ed Awh from Uni of Chicago.
Filled with a bunch of extra analyses, this is now accepted in The Journal of Neuroscience @sfn.org! You can have a sneak peek here: www.biorxiv.org/content/10.1...
October 24, 2025 at 11:33 AM
Reposted by Christoph Strauch
If you are looking, or know someone who is looking, please forward this to them!

I'm open to discussing the topic, but of course I also have many ideas!
S-CCS Lab PhD Position

3+2 year 100% TV-L 13 position in '26 - open topic at the intersection of combined EEG-EyeTracking, Statistical Methods, Cognitive Modelling, VR/Mobile EEG, Vision ...

Apply via Max-Planck IMPRS-IS program until 2025-11-16 imprs.is.mpg.de

Read: www.s-ccs.de/philosophy
October 23, 2025 at 11:22 AM
Reposted by Christoph Strauch
Spatial attention and working memory are widely thought to be tightly coupled. Yet, distinct neural activity tracks attentional breadth and WM load.

In a new paper @jocn.bsky.social, we show that pupil size independently tracks breadth and load.

doi.org/10.1162/JOCN...
October 14, 2025 at 2:04 PM
Reposted by Christoph Strauch
Incredible study by Raut et al.: by tracking a single measure (pupil size), you can model slow, large-scale dynamics in neuronal calcium, metabolism, and brain blood oxygen through a shared latent space! www.nature.com/articles/s41...
September 25, 2025 at 8:53 AM
I'll show some (I think) cool stuff about how we can measure the phenomenology of synesthesia in a physiological way at #ECVP: Color II, Atrium Maximum, 9:15, Thursday.

Say hi and show your colleagues that you're one of the dedicated ones by getting up early on the last day!
#ECVP2025 starts with a fully packed room!

I'll show data demonstrating that synesthetic perception is perceptual, automatic, and effortless.
Join my talk (Thursday, early morning, Color II) to learn how the qualia of synesthesia can be inferred from pupil size.
Join and/or say hi!
August 27, 2025 at 9:13 PM
Reposted by Christoph Strauch
A good reason to make it to the early #visualcognition session of today's #ecvp2025 👉 @anavili.bsky.social will talk about how attending to fuzzy bright/dark patches that have faded from awareness (through adaptation-like processes) still affects pupil size! ⚫👀⚪ Paper: dx.doi.org/10.1016/j.co...
August 27, 2025 at 5:45 AM
Reposted by Christoph Strauch
Excited to present at #ECVP2025 - Monday afternoon, Learning & Memory - about how anticipating relevant visual events prepares visual processing for efficient memory-guided visual selection! 🧠🥳
@attentionlab.bsky.social @ecvp.bsky.social

Preprint for more details: www.biorxiv.org/content/10.1...
Anticipated Relevance Prepares Visual Processing for Efficient Memory-Guided Selection
Finding an object typically involves the use of working memory to prioritize relevant visual information at the right time. For example, successfully detecting a highway exit sign is useless when your...
www.biorxiv.org
August 24, 2025 at 1:47 PM
#ECVP2025 starts with a fully packed room!

I'll show data demonstrating that synesthetic perception is perceptual, automatic, and effortless.
Join my talk (Thursday, early morning, Color II) to learn how the qualia of synesthesia can be inferred from pupil size.
Join and/or say hi!
August 24, 2025 at 4:28 PM
Reposted by Christoph Strauch
Eye movements are cheap, right? Not necessarily! 💰 In our review just out in @natrevpsychol.nature.com, Alex Schütz and I discuss the different costs associated with making an eye movement, how these costs affect behaviour, and the challenges of measuring this… rdcu.be/eAm69 #visionscience #vision
A review of the costs of eye movements
Nature Reviews Psychology - Eye movements are the most frequent movements that humans make. In this Review, Schütz and Stewart integrate evidence regarding the costs of eye movements and...
rdcu.be
August 12, 2025 at 10:44 AM
Gaze heatmaps are popular, especially among eye-tracking beginners and in many applied domains. But how many participants should be tested?
It depends, of course, but our guidelines help you navigate this in an informed way.

Out now in BRM (free) doi.org/10.3758/s134...
@psychonomicsociety.bsky.social
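For anyone curious what such a heatmap actually is under the hood, here is a minimal NumPy sketch: a sum of 2D Gaussians centred on fixation locations. All parameters (image size, `sigma`) are illustrative, and real pipelines, including the guidelines in the paper, additionally weigh fixation durations and aggregate across participants.

```python
import numpy as np

def gaze_heatmap(fixations, width, height, sigma=30):
    """Toy gaze heatmap: sum of 2D Gaussians centred on fixations.

    fixations: iterable of (x, y) pixel coordinates; sigma in pixels.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    heat = np.zeros((height, width))
    for fx, fy in fixations:
        # Add one Gaussian bump per fixation
        heat += np.exp(-((xs - fx) ** 2 + (ys - fy) ** 2) / (2 * sigma ** 2))
    return heat / heat.max()  # normalise to [0, 1]

hm = gaze_heatmap([(100, 80), (105, 85), (300, 200)], width=400, height=300)
```

The number of participants you need depends on how stable this map is under resampling, which is exactly what the guidelines help you assess.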
July 29, 2025 at 7:37 AM
Reposted by Christoph Strauch
Jelmer Borst and I are looking for a PhD candidate to build an EEG-based model of human working memory! This is a really cool project that I've wanted to kick off for a while, and I can't wait to see it happen. Please share and I'm happy to answer any Qs about the project!
www.rug.nl/about-ug/wor...
Vacatures bij de RUG
www.rug.nl
July 3, 2025 at 1:29 PM
Reposted by Christoph Strauch
Curious about the visual human brain, a vibrant and collaborative lab, and pursuing a PhD in the heart of Europe? My lab is recruiting for a 3-year PhD position. More details: www.rademakerlab.com/job-add
PhD position — Rademaker lab
www.rademakerlab.com
July 1, 2025 at 6:43 AM
We had a splendid day: great weather, got to wear peculiar/special clothes, and then Alex even defended his PhD (and nailed it!).

Congratulations, Dr. Alex, super proud of your achievements!!!
Thrilled to share that I successfully defended my PhD dissertation on Monday June 16th!

The dissertation is available here: doi.org/10.33540/2960
June 18, 2025 at 2:50 PM
Reposted by Christoph Strauch
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...
June 12, 2025 at 7:21 AM
Reposted by Christoph Strauch
@vssmtg.bsky.social
presentations today!

R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict

R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention
May 18, 2025 at 9:41 AM
Cool new preprint by Damian. Among other findings: pupillometry, EEG & IEMs show that the premotor theory of attention can't be the full story: eye movements are associated with an additional, separable, spatially tuned process compared to covert attention, hundreds of ms before shifts happen.
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?

We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
May 13, 2025 at 8:17 AM
Reposted by Christoph Strauch
Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.
A move you can afford
Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.
buff.ly
May 4, 2025 at 8:21 AM
Reposted by Christoph Strauch
Very cool science here 😎

OK, I might be biased but still, reading out different cognitive strategies by combining EEG data, our reaction time decomposition method, and a deep learning sequence model is super cool!

Congrats Rick for your first preprint 🥳
Pre-print out arxiv.org/abs/2504.10028
If you decide to click on this URL too quickly, you might just skip a cognitive operation! We combine Hidden Multivariate Pattern analysis (HMP) and deep learning (Mamba/State Space Models) to detect cognitive operations at a trial level from EEG data.
Sequence models for by-trial decoding of cognitive strategies from neural data
Understanding the sequence of cognitive operations that underlie decision-making is a fundamental challenge in cognitive neuroscience. Traditional approaches often rely on group-level statistics, whic...
arxiv.org
April 15, 2025 at 10:55 AM
Reposted by Christoph Strauch
Where you look depends on the effort (measured through pupil size) associated with the eye movement (e.g., diagonal eye movements are more effortful than horizontal ones). Proud of this work! Spearheaded by @cstrauch.bsky.social and out in @elife.bsky.social elifesciences.org/digests/9776...
A move you can afford
Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.
elifesciences.org
April 8, 2025 at 7:27 AM
We show that eye movements are selected based on effort minimization, now finally final in @elife.bsky.social
eLife's digest:
elifesciences.org/digests/9776...
& the 'convincing & important' paper:
elifesciences.org/articles/97760

I consider this my coolest ever project!

#VisionScience #Neuroscience
A move you can afford
Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.
elifesciences.org
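The core idea, that cheaper eye movements are chosen more often, can be sketched as a softmax over negative effort costs. This is a toy illustration, not the paper's actual model; the cost values and the inverse-temperature `beta` are made up.

```python
import numpy as np

def saccade_choice_probs(costs, beta=1.0):
    """Softmax over negative effort costs: cheaper saccades are chosen more.

    costs: per-direction effort (e.g. pupil-derived); beta: choice sharpness.
    """
    c = np.asarray(costs, dtype=float)
    w = np.exp(-beta * (c - c.min()))  # subtract min for numerical stability
    return w / w.sum()

# Illustrative: horizontal saccades (cost 1.0) vs oblique ones (cost 2.0)
probs = saccade_choice_probs([1.0, 2.0, 2.0, 1.0])
```

With equal costs this reduces to uniform choice; the steeper the cost differences, the more gaze is drawn to the affordable directions.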
April 7, 2025 at 7:28 PM
Reposted by Christoph Strauch
New preprint!

We present two very large eye tracking datasets of museum visitors (4-81 y.o.!) who free-viewed (n=1248) or searched for a +/x (n=2827) in a single feature-rich image.

We invite you to (re)use the dataset and provide suggestions for future versions 📋

osf.io/preprints/os...
March 28, 2025 at 9:34 AM
Out in Psychophysiology (OA):

Typically, pupillometry struggles with complex stimuli. We introduce a method to study covert attention allocation in complex video stimuli:
effects of top-down attention, bottom-up attention, and pseudoneglect could all be recovered.

doi.org/10.1111/psyp.70036
Psychophysiology | SPR Journal | Wiley Online Library
Previous studies have shown that the pupillary light response (PLR) can physiologically index covert attention, but only with highly simplistic stimuli. With a newly introduced technique that models ....
doi.org
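The logic behind PLR-based attention tagging can be illustrated in a few lines: if covert attention weighs the attended region's luminance more strongly, pupil size should correlate negatively with the luminance at the attended location. This is a didactic sketch with made-up variable names, not the modelling approach from the paper.

```python
import numpy as np

def plr_attention_index(pupil, lum_left, lum_right, attended):
    """Toy covert-attention index from the pupillary light response (PLR).

    pupil: per-trial baseline-corrected pupil size.
    lum_left / lum_right: luminance of the two screen regions per trial.
    attended: 'left' or 'right' per trial.
    Returns the Pearson correlation between attended luminance and pupil
    size; more light at the attended location should shrink the pupil.
    """
    pupil = np.asarray(pupil, dtype=float)
    att_lum = np.where(np.asarray(attended) == 'left', lum_left, lum_right)
    return float(np.corrcoef(att_lum, pupil)[0, 1])

# Synthetic example: attending the bright region yields a smaller pupil
idx = plr_attention_index(
    pupil=[-1, 1, 1, -1],
    lum_left=[1, 0, 1, 0],
    lum_right=[0, 1, 0, 1],
    attended=['left', 'left', 'right', 'right'],
)
```

A strongly negative index suggests the PLR is tracking the covertly attended location; the paper's contribution is making this work for dynamic, feature-rich video rather than simple patches.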
March 21, 2025 at 2:53 PM
Reposted by Christoph Strauch
Presaccadic attention facilitates visual continuity across eye movements. However, recent work may suggest that presaccadic attention doesn't shift upward. What's going on?

Our paper uses the pupil light response to show that presaccadic attention shifts both upward and downward.

doi.org/10.1111/psyp.70047
Psychophysiology | SPR Journal | Wiley Online Library
Dominant theories posit that attentional shifts prior to saccades enable a stable visual experience despite abrupt changes in visual input caused by saccades. However, recent work may challenge this ...
onlinelibrary.wiley.com
March 19, 2025 at 8:28 AM
Reposted by Christoph Strauch
Why academia is sleepwalking into self-destruction. My editorial @brain1878.bsky.social If you agree with the sentiments please repost. It's important for all our sakes to stop the madness
academic.oup.com/brain/articl...
March 6, 2025 at 7:16 PM