dorottya hetenyi
@dotproduct.bsky.social
PhD student studying visual perception using MEG @ FIL, UCL

curious about how we obtain and process information across time, and how our perception is shaped by our prior knowledge.

loves cats and bikes. 🐈‍⬛
Reposted by dorottya hetenyi
New BBS article w/ @lauragwilliams.bsky.social and Hinze Hogendoorn, just accepted! We respond to a thought-provoking article by @smfleming.bsky.social & @matthiasmichel.bsky.social, and argue that it's premature to conclude that conscious perception is delayed by 350-450ms: bit.ly/4nYNTlb
September 29, 2025 at 7:00 PM
Reposted by dorottya hetenyi
A ✨bittersweet✨ moment – after 5 years at UCL, my final first-author project with @smfleming.bsky.social is ready to read as a preprint! 🥲
Distinct neural representations of perceptual and numerical absence in the human brain: https://doi.org/10.31234/osf.io/zyrdk_v1
July 25, 2025 at 9:23 AM
Reposted by dorottya hetenyi
I said it before and I'll say it again: Cognition is rhythmic
Contents of visual predictions oscillate at alpha frequencies
www.jneurosci.org/content/earl...
#neuroscience
October 21, 2025 at 12:00 PM
Super happy to share my very first first-author paper out in
@sfnjournals.bsky.social! We show content-specific predictions are represented in an alpha rhythm. It’s been a beautiful, inspiring, yet challenging journey.
Huge thanks to everyone, especially @peterkok.bsky.social @jhaarsma.bsky.social
October 21, 2025 at 3:57 PM
Reposted by dorottya hetenyi
From line drawings to scene perception — our new review argues for moving beyond experimenter-driven manipulations toward participant-driven approaches to reveal what’s in our internal models of the visual world. 👁️✍️🛋
royalsocietypublishing.org/doi/10.1098/...
Characterizing internal models of the visual environment | Proceedings of the Royal Society B: Biological Sciences
October 8, 2025 at 9:07 AM
Hellohello #ICON2025! Please come and have a chat with me today at 10.45am about some content-specific alpha fluctuations! 🤓
Finally, on Friday at 10:45, we have Imagine Reality Lab affiliate @dotproduct.bsky.social presenting:

P6.36 | Pre-stimulus Shape Predictions Fluctuate At Alpha Rhythms And Bias Subsequent Perception.

Showing how content-specific pre-stimulus alpha-band oscillations influence perception.
September 19, 2025 at 8:15 AM
Reposted by dorottya hetenyi
At long last, the preprint of our MEG + RIFT study, the final paper from my PhD with @olejensen.bsky.social. We show that strong pre-search alpha oscillations are associated with faster responses in visual search www.biorxiv.org/content/10.1...
@thechbh.bsky.social #neuroskyence
August 30, 2025 at 2:26 PM
Reposted by dorottya hetenyi
Very proud to share this one🥹! We show that personalized signatures of brain activity are heritable and relate to the expression of specific genes. That means my brain-fingerprint is very similar to my twin brother's! #ResearchIsMeSearch🧠 🧬 ♊️
Genetic foundations of interindividual neurophysiological variability
Individual brain activity profiles are shaped by lifelong genetic influences.
July 24, 2025 at 8:20 AM
very cool stuff from brilliant people, an absolute must-read!!🫶
🚨New preprint🚨 out with the dream team @matanmazor.bsky.social @giuliacabbai.bsky.social and @nadinedijkstra.bsky.social!

We report a novel and robust effect across five different datasets: vivid imagery is reported faster than weak imagery.

📝: osf.io/preprints/ps...
July 21, 2025 at 3:34 PM
Hellohello again! Tomorrow at 5:15pm I’m giving a short talk on our latest MEG study about our very well-loved oscillating perceptual predictions @meguki2025.bsky.social. Come by and talk brains!
There are amazing talks and very cool science happening all around! MEGUKI 2025👌
July 16, 2025 at 1:02 PM
Reposted by dorottya hetenyi
Out now @cp-trendscognsci.bsky.social, w/ @akalt.bsky.social & @drmattdavis.bsky.social.

Are sensory sampling rhythms fixed by intrinsically-determined processes, or do they couple to external structure? Here we highlight the incompatibility between these accounts and propose a resolution [1/6]
June 19, 2025 at 11:18 AM
Reposted by dorottya hetenyi
NEW DEADLINE: Friday 20th June 🚨

MEG‑UKI 2025 lands in London (16–18 July)! A 3-day deep dive into the brain—naturalistic neuroscience, OP-MEG, cutting-edge methods, and real-world impact. Keynotes by Dominik Bach & Jamie Ward. Art, abstracts, and more!

Register here: meguk.ac.uk/registration/
June 17, 2025 at 10:38 AM
Reposted by dorottya hetenyi
Finally out in @commsbio.nature.com !
Using MEG and Rapid Invisible Frequency Tagging (RIFT) in a classic visual search paradigm we show that neuronal excitability in V1 is modulated in line with a priority-map-based mechanism to boost targets and suppress distractors!
June 11, 2025 at 8:40 PM
Reposted by dorottya hetenyi
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...
June 12, 2025 at 7:21 AM
Beautiful paper by the brilliant @jhaarsma.bsky.social exploring how false and veridical percepts converge and diverge. Turns out both share content-specific and confidence signals; and hellohello pre-stimulus alpha... what are you up to with false percepts?
June 10, 2025 at 11:17 AM
Reposted by dorottya hetenyi
Looking for a PhD in #psychology or #cogneuro ? I’m recruiting a fully-funded @leverhulme.ac.uk PhD student to join my lab at @birkbeckpsychology.bsky.social. If you’re interested in metacognition, learning, social cognition and culture I’d love you to apply 🧠👇

www.jobs.ac.uk/job/DNJ330/p...
PhD Studentship: Public Communication and Private Confidence at Birkbeck, University of London
June 6, 2025 at 12:07 PM
Reposted by dorottya hetenyi
Our study using layer fMRI to study the direction of communication between the hippocampus and cortex during perceptual predictions is finally out in Science Advances! Predicted-but-omitted shapes are represented in CA2/3 and correlate specifically with deep layers of PHC, suggesting feedback. 🧠🟦
Communication of perceptual predictions from the hippocampus to the deep layers of the parahippocampal cortex
High-resolution neuroimaging reveals stimulus-specific predictions sent from hippocampus to the neocortex during perception.
May 22, 2025 at 1:55 AM
Reposted by dorottya hetenyi
I am so excited to share that our paper 'A neural basis for distinguishing imagination from reality' is now published in @cp-neuron.bsky.social! 🧠✨ See thread below! doi.org/10.1016/j.ne...
June 5, 2025 at 3:05 PM
Reposted by dorottya hetenyi
News story on the UCL Brain Sciences website about our recent paper using 7T fMRI to study the communication between hippocampus and neocortex. www.ucl.ac.uk/brain-scienc... Link to the paper: www.science.org/doi/10.1126/.... #neuroskyence
Researchers reveal how our brains predict what we’re about to see
Researchers in the UCL Queen Square Institute of Neurology find that the hippocampus sends signals to the visual cortex to predict what we are about to see.
June 5, 2025 at 8:22 AM
Reposted by dorottya hetenyi
🚨Excited to share my first preprint from my postdoc with @smfleming.bsky.social! 🚨 We explore how attention shapes simplified mental representations for planning. We show that inductive biases characteristic of attentional selection shape how we plan. Check it out: osf.io/preprints/ps... 🧠🔦🤖
May 22, 2025 at 11:56 AM
Reposted by dorottya hetenyi
New paper out in @plosbiology.org w/ Charlie, @phil-johnson.bsky.social, Ella, and Hinze 🎉

We track moving stimuli via EEG, find evidence that motion is extrapolated across distinct stages of processing + show how this effect may emerge from a simple synaptic learning rule!

tinyurl.com/2szh6w5c
May 23, 2025 at 8:34 PM
This was looooads of fun! Thank you so much everyone for the amazing and interesting chats.🫶
Also @pieterbarkema.bsky.social has a very cool talk on Monday @ 11.15am on some yummy layer-specific fMRI of postdictive perception.🙀 #VSS2025
May 17, 2025 at 1:52 PM
Hellohello! Come say hi today, 3:00-5:00pm, Banyan Breezeway (16.322).
How does the brain hold onto predictions before something even happens? We show that predicted shape info lives in pre-stimulus alpha-band oscillations (10–11Hz), and biases perception without boosting sensitivity.🧠
#VSS2025
Looking forward to #VSS2025! On the first day, @dotproduct.bsky.social will be presenting a poster on how predictions embedded in alpha oscillations modulate perception of noisy stimuli. Come one, come all! Poster 16.322, Friday 3-5pm, Banyan Breezeway. #neuroskyence #visionscience
May 16, 2025 at 12:12 PM
Reposted by dorottya hetenyi
A large-scale collaborative consensus piece on predictive processing is now online arxiv.org/abs/2504.09614 impressively orchestrated by @jeromelecoq.bsky.social
Neural mechanisms of predictive processing: a collaborative community experiment through the OpenScope program
April 15, 2025 at 6:40 AM
Reposted by dorottya hetenyi
Hey cognitive scientists

I asked lovable.dev to make me a 2-arm bandit RL task

2 prompts and 3 minutes later
🤯🤯🤯🤯

Play it:
preview--cosmic-bandit-quest.lovable.app

Think of the hours and grad student tears saved 😅

Yes lovable I accept your sponsorship terms
April 9, 2025 at 7:41 PM
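For readers wondering what a 2-arm bandit RL task actually involves, here is a minimal sketch of the standard paradigm: two arms with fixed reward probabilities and an ε-greedy learner that tracks arm values incrementally. The reward probabilities and parameters below are illustrative assumptions, not those of the linked app.

```python
import random

def run_bandit(p_reward=(0.3, 0.7), n_trials=1000, epsilon=0.1, alpha=0.1, seed=0):
    """Simulate a two-armed bandit with an epsilon-greedy Q-learner."""
    rng = random.Random(seed)
    q = [0.0, 0.0]      # estimated value of each arm
    total_reward = 0
    for _ in range(n_trials):
        # explore with probability epsilon, otherwise exploit the better arm
        if rng.random() < epsilon:
            arm = rng.randrange(2)
        else:
            arm = 0 if q[0] > q[1] else 1
        reward = 1 if rng.random() < p_reward[arm] else 0
        q[arm] += alpha * (reward - q[arm])  # incremental value update
        total_reward += reward
    return q, total_reward

q, total = run_bandit()
```

After enough trials the learner's value estimate for the higher-payoff arm exceeds that for the lower one, which is the behavioural signature such tasks are designed to measure.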