Alex J. Hoogerbrugge
@ajhoogerbrugge.bsky.social
Postdoc at University of Manchester | Interested in memory, attention, visual search, eye movements | he/him
Reposted by Alex J. Hoogerbrugge
Synesthetes claim to have sensory experiences, such as seeing color when reading or hearing a (black) number.
But how genuine are these reports and sensations? We introduce a rather direct measure of synesthetic perception: synesthetes’ pupils respond to evoked color as if it were real color! #vision 👁️🎨🧪
November 26, 2025 at 4:40 PM
Reposted by Alex J. Hoogerbrugge
#EenWereldVolDenkers ('A World Full of Thinkers') is a journey through the thought worlds of humans, animals, plants, and AI! Pre-order it now and you will immediately receive the story collection below (Undercurrents) as a free e-book, plus a chance to win a unique, handmade copy! forms.gle/bEJz4Spk2p72... #psychologie #biology #wetenschap
August 8, 2025 at 2:37 PM
Reposted by Alex J. Hoogerbrugge
🚨 PostDoc Opening 🚨 The lab of Klaus Oberauer is looking for a new postdoc, starting at the end of this year or the beginning of next. The research focus is #cognition, #workingmemory, #methods, and #computationalmodeling, or anything in that direction. I cannot emphasize ENOUGH how great it is to work in this lab 🥰🤓
August 6, 2025 at 12:05 PM
Reposted by Alex J. Hoogerbrugge
Gaze heatmaps are popular, especially among eye-tracking beginners and in many applied domains. How many participants should be tested?
It depends, of course, but our guidelines help you navigate this question in an informed way.

Out now in BRM (free) doi.org/10.3758/s134...
@psychonomicsociety.bsky.social
July 29, 2025 at 7:37 AM
Reposted by Alex J. Hoogerbrugge
Jelmer Borst and I are looking for a PhD candidate to build an EEG-based model of human working memory! This is a really cool project that I've wanted to kick off for a while, and I can't wait to see it happen. Please share and I'm happy to answer any Qs about the project!
www.rug.nl/about-ug/wor...
Vacancies at the RUG
www.rug.nl
July 3, 2025 at 1:29 PM
Reposted by Alex J. Hoogerbrugge
Postdoc position open in our #workingmemory lab! See here for more info: www.unige.ch/fapse/womcog...
June 30, 2025 at 11:07 AM
Just added a few thousand new search datasets, bringing the total up to almost 6k museum visitors 😮‍💨

Updated preprint: osf.io/preprints/os...
Data: osf.io/kf4sb/
June 28, 2025 at 10:24 AM
Thrilled to share that I successfully defended my PhD dissertation on Monday June 16th!

The dissertation is available here: doi.org/10.33540/2960
June 18, 2025 at 2:21 PM
Reposted by Alex J. Hoogerbrugge
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...
June 12, 2025 at 7:21 AM
Reposted by Alex J. Hoogerbrugge
I'm excited to announce that my lab's open textbook on Scientific Computing for Cognitive Neuroscience (v1.0) has just gone live! Our goal is to help bridge the gap between the computational skills needed in cognitive neuroscience and typical curricula, which do not yet include them. 1/3
June 9, 2025 at 4:10 PM
Reposted by Alex J. Hoogerbrugge
Attending @vssmtg.bsky.social? Come check out my talk on EEG decoding of preparatory overt and covert attention!

Tomorrow in the Attention: Neural Mechanisms session at 17:15. You can check out the preprint in the meantime:
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?

We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
May 17, 2025 at 6:06 PM
Reposted by Alex J. Hoogerbrugge
We previously showed that affordable eye movements are preferred over costly ones. What happens when salience comes into play?

In our new paper, we show that even when salience attracts gaze, costs remain a driver of saccade selection.

OA paper here:
doi.org/10.3758/s134...
May 16, 2025 at 1:36 PM
Reposted by Alex J. Hoogerbrugge
About time our latest project about time got out!

How do self-paced encoding and retention relate to performance in (working) memory-guided actions?

Find out now in Memory & Cognition: doi.org/10.3758/s134...
(or check the short version below)🧵
The rise and fall of memories: Temporal dynamics of visual working memory - Memory & Cognition
Visual working memory (VWM) is a cognitive system, which temporarily stores task-relevant visual information to enable interactions with the environment. In everyday VWM use, we typically decide how l...
doi.org
May 15, 2025 at 9:56 AM
Reposted by Alex J. Hoogerbrugge
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?

We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
May 13, 2025 at 7:51 AM
Thrilled to share that, as of May 1st, I have started as a postdoc at The University of Manchester!

I will investigate looked-but-failed-to-see (LBFTS) errors in visual search, under the expert guidance of Johan Hulleman and Jeremy Wolfe. Watch this space!
May 7, 2025 at 12:37 PM
Reposted by Alex J. Hoogerbrugge
In our latest paper in @elife.bsky.social, we show that we choose to move our eyes based on effort minimization. Put simply, we prefer affordable over more costly eye movements.

eLife's digest:
elifesciences.org/digests/9776...

The paper:
elifesciences.org/articles/97760

#VisionScience
April 8, 2025 at 8:06 AM
Reposted by Alex J. Hoogerbrugge
We show that eye movements are selected based on effort minimization - finally final in @elife.bsky.social
eLife's digest:
elifesciences.org/digests/9776...
& the 'convincing & important' paper:
elifesciences.org/articles/97760

I consider this my coolest ever project!

#VisionScience #Neuroscience
A move you can afford
Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.
elifesciences.org
April 7, 2025 at 7:28 PM
New preprint!

We present two very large eye-tracking datasets of museum visitors (4-81 y.o.!) who free-viewed (n=1248) or searched for a +/x (n=2827) in a single feature-rich image.

We invite you to (re)use the dataset and provide suggestions for future versions 📋

osf.io/preprints/os...
March 28, 2025 at 9:34 AM
PI @cstrauch.bsky.social dragged me to a “German” conference – turned out to be an absolute blast (and very international)!

Great science, even better people :)
Warmest thanks to everyone for coming to #teap2025. We wish you a smooth trip home!! May the science be with you 🦾
March 12, 2025 at 9:11 PM
Reposted by Alex J. Hoogerbrugge
Our paper on how we use hearing and vision to localize (multimodal) objects is now out in Journal of Experimental Psychology: General (dx.doi.org/10.1037/xge0...)! See @yichen-yuan.bsky.social's thread below for details!

#MultisensoryPerception #MotionTracking #MotionPrediction #WorkingMemory
January 29, 2025 at 12:14 PM
Reposted by Alex J. Hoogerbrugge
🎉 New paper out in JEP:HPP with @andresahakian.bsky.social @suryagayet.bsky.social @chrispaffen.bsky.social and Stefan Van der Stigchel. We asked whether memory traces are formed for items that have not yet been selected for immediate action, while we are actively sampling targets for imminent action.
January 8, 2025 at 10:32 AM
Reposted by Alex J. Hoogerbrugge
New pop-science piece on why pupil size changes are so cool: psyche.co/ideas/the-pu...
Included: an assignment that lets you measure pupil size. In my classes, this replicates Hess & Polt's 1964 effort finding without an eye tracker. Feel free to use it!

#VisionScience #neuroscience #psychology 🧪
Psyche | on the human condition
Psyche is a digital magazine from Aeon Media that illuminates the human condition through psychology, philosophy and the arts.
psyche.co
January 6, 2025 at 12:21 PM
Reposted by Alex J. Hoogerbrugge
In a dynamic world, items appear, disappear, and reappear within seconds. In our latest preprint (now with an additional experiment) we show: the reappearance of maintained items guides the prioritization of non-reappearing memory items.

www.biorxiv.org/content/10.1...
Sensory Input Matching Visual Working Memory Guides Internal Prioritization
Adaptive behavior necessitates the prioritization of the most relevant information in the environment (external) and in memory (internal). Internal prioritization is known to guide the selection of ex...
www.biorxiv.org
December 11, 2024 at 2:03 PM
Reposted by Alex J. Hoogerbrugge
We often rely on the external world rather than fully loading memory, at least when it is easy to access external information.

@tianyingq.bsky.social's new PBR meta-analysis across 28 experiments shows that increases in access cost reliably push people toward internal WM use.

doi.org/10.3758/s134...

#VisionScience 🧪
December 5, 2024 at 8:09 AM
Reposted by Alex J. Hoogerbrugge
Scientists, academics, researchers: We’re excited to share that @altmetric.com is now tracking mentions of your research on Bluesky! 🧪
There are already many articles receiving more attention on Bluesky than on other comparable micro-blogging sites, meaning the academic community and the general public have clearly adopted Bluesky as one of their core places to disseminate and discuss new research.

A Place of Joy.
December 3, 2024 at 2:10 PM