Dan Wang
@danwang7.bsky.social
PhD candidate, Utrecht University | AttentionLab UU | CAP-Lab | Visual working memory | Attention
Reposted by Dan Wang
Many thanks to my co-authors!! @suryagayet.bsky.social @jthee.bsky.social @arora-borealis.bsky.social @Stefan Van der Stigchel @Samson Chota
August 27, 2025 at 9:35 PM
In conclusion, we show that the dynamic interplay between top-down control and bottom-up saliency directly impacts early visual responses, thereby illuminating a complete timeline of attentional competition in visual cortex.
August 27, 2025 at 9:33 PM
Finally, the greater the RIFT response to the target compared to the distractor, the faster participants responded to the target, demonstrating that RIFT responses capture behaviorally relevant processes.
August 27, 2025 at 9:32 PM
2) The presence of a distractor attenuated the initial RIFT response to the target, reflecting competition during the earliest stages of visual processing.
August 27, 2025 at 9:31 PM
Comparing RIFT responses across conditions, we found that 1) both the target and the distractor evoked stronger initial RIFT responses than nontargets, reflecting top-down and bottom-up attentional effects on early visual processing, and that RIFT responses to the distractor were eventually suppressed.
August 27, 2025 at 9:29 PM
For the tagging manipulation, we tagged the target and the distractor in the distractor-present condition, and the target and one of the nontargets in the distractor-absent condition. The frequency-tagging manipulation successfully elicited the corresponding frequency-specific neural responses.
August 27, 2025 at 9:25 PM
We found that the salient distractor captured attention at the behavioral level.
August 27, 2025 at 9:23 PM
In this study, to determine how top-down and bottom-up processes unfold over time in early visual cortex, we employed Rapid Invisible Frequency Tagging (RIFT) while participants performed the additional singleton task.
August 27, 2025 at 9:18 PM
🧠 Excited to share that our new preprint is out!🧠
In this work, we investigate the dynamic competition between bottom-up saliency and top-down goals in the early visual cortex using rapid invisible frequency tagging (RIFT).

📄 Check it out on bioRxiv: www.biorxiv.org/cgi/content/...
Dynamic competition between bottom-up saliency and top-down goals in early visual cortex
Task-irrelevant yet salient stimuli can elicit automatic, bottom-up attentional capture and compete with top-down, goal-directed processes for neural representation. However, the temporal dynamics und...
August 27, 2025 at 9:16 PM
Reposted by Dan Wang
Looking forward to joining #ECVP2025 tomorrow. CAP-Lab is well represented, with 3 talks (@lassedietz.bsky.social on Monday, and @arora-borealis.bsky.social and I on Tuesday), and 2 posters (by @danwang7.bsky.social on Tuesday, and @yichen-yuan.bsky.social on Wednesday). Please come by for a chat! 💜
August 24, 2025 at 8:34 PM
Reposted by Dan Wang
🚨 New preprint: Invisible neural frequency tagging (RIFT) for the underfunded researcher:
👉 www.biorxiv.org/cgi/content/...

RIFT uses high-frequency flicker to probe attention in M/EEG with minimal stimulus visibility and little distraction. Until now, it required a costly high-speed projector.
Rapid Invisible Frequency Tagging (RIFT) with a consumer monitor: A proof-of-concept
Rapid Invisible Frequency Tagging (RIFT) enables neural frequency tagging at rates above the flicker fusion threshold, eliciting steady-state responses to flicker that is almost imperceptible. While R...
August 22, 2025 at 11:52 AM
Reposted by Dan Wang
Excited to present at #ECVP2025 - Monday afternoon, Learning & Memory - about how anticipating relevant visual events prepares visual processing for efficient memory-guided visual selection! 🧠🥳
@attentionlab.bsky.social @ecvp.bsky.social

Preprint for more details: www.biorxiv.org/content/10.1...
Anticipated Relevance Prepares Visual Processing for Efficient Memory-Guided Selection
Finding an object typically involves the use of working memory to prioritize relevant visual information at the right time. For example, successfully detecting a highway exit sign is useless when your...
August 24, 2025 at 1:47 PM
Reposted by Dan Wang
Excited to give a talk at #ECVP2025 (Tuesday morning, Attention II) on how spatially biased attention during VWM does not boost excitability the same way it does when attending the external world, using Rapid Invisible Frequency Tagging (RIFT). @attentionlab.bsky.social @ecvp.bsky.social
August 24, 2025 at 1:14 PM
Excited to share that I’ll be presenting my poster at #ECVP2025 on August 26th (afternoon session)!

🧠✨ Our work focuses on the dynamic competition between bottom-up saliency and top-down goals in early visual cortex, using Rapid Invisible Frequency Tagging.

@attentionlab.bsky.social @ecvp.bsky.social
August 24, 2025 at 1:28 PM
Reposted by Dan Wang
I had loads of fun today, sharing thoughts and projects during a joint lab-meeting with @nadinedijkstra.bsky.social's Imagine Reality Lab. Two hours were way too short to discuss all the cool projects!

Thanks everyone for your contributions 💜
July 24, 2025 at 9:00 PM
Best hunter trainer ever🫡
July 15, 2025 at 12:39 PM
Reposted by Dan Wang
In this Article, Paluch et al. show that unattended working memory items, as well as attended ones, are encoded in persistent activity in the medial temporal lobe. @jankaminski.bsky.social
www.nature.com/articles/s41...
Unattended working memory items are coded by persistent activity in human medial temporal lobe neurons - Nature Human Behaviour
Paluch et al. show that unattended working memory items, as well as attended ones, are encoded in persistent activity in the medial temporal lobe.
July 9, 2025 at 11:33 AM
Reposted by Dan Wang
Thrilled to share our new opinion piece—hot off the press—on attentional sampling, co-authored with the magnificent Flor Kusnir and Daniele Re. It captures where our thinking has landed on this topic after years of work.

www.cell.com/trends/cogni...
Attentional sampling resolves competition along the visual hierarchy
Navigating the environment involves engaging with multiple objects, each activating specific neuronal populations. When objects appear together, these populations compete. Classical attention theories...
July 9, 2025 at 9:20 AM
Reposted by Dan Wang
Last week's symposium titled "Advances in the Encephalographic Study of Attention" was a great success! Held in the KNAW building in Amsterdam and sponsored by the NWO, many of (Europe's) leading attention researchers assembled to discuss the latest advances in attention research using M/EEG.
June 30, 2025 at 7:13 AM
Reposted by Dan Wang
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...
June 12, 2025 at 7:21 AM
Reposted by Dan Wang
In our new MEG/RIFT study from @thechbh.bsky.social led by @katduecker.bsky.social, we show that feature guidance in visual search alters neuronal excitability in early visual cortex, supporting a priority-map-based attentional mechanism.
rdcu.be/eqFX7
Guided visual search is associated with target boosting and distractor suppression in early visual cortex
Communications Biology - Magnetoencephalography in human participants paired with Rapid Invisible Frequency Tagging reveals that excitability in early visual cortex is modulated to boost targets...
June 12, 2025 at 10:17 AM
Reposted by Dan Wang
Thanks to the support of the Dutch Research Council (NWO) and @knaw-nl.bsky.social , we're thrilled to announce the international symposium "Advances in the Encephalographic study of Attention"! 🧠🔍

📅 Date: June 25th & 26th
📍 Location: Trippenhuis, Amsterdam
June 4, 2025 at 8:14 PM
Reposted by Dan Wang
Through experience, humans can learn to suppress locations that frequently contain distracting stimuli. Using SSVEPs and ERPs, this study shows that such learned suppression modulates early neural responses, indicating it occurs during initial visual processing.
www.jneurosci.org/content/jneu...
May 26, 2025 at 10:13 AM
Reposted by Dan Wang
Good morning #VSS2025, if you care for a chat about the role of attention in binding object features (during perceptual encoding and memory maintenance), drop by my poster now (8:30-12:30) in the pavilion (422). Hope to see you there!
May 19, 2025 at 12:36 PM