Dan Wang
@danwang7.bsky.social
PhD candidate, Utrecht University | AttentionLab UU | CAP-Lab | Visual working memory | Attention
Reposted by Dan Wang
Sensory reformatting for a working visual memory www.sciencedirect.com/science/arti...
Sensory reformatting for a working visual memory
A core function of visual working memory (WM) is to sustain mental representations of recent visual inputs, thereby bridging moments of experience. Th…
www.sciencedirect.com
October 10, 2025 at 3:18 AM
🧠 Excited to share that our new preprint is out! 🧠
In this work, we investigate the dynamic competition between bottom-up saliency and top-down goals in the early visual cortex using rapid invisible frequency tagging (RIFT).
📄 Check it out on bioRxiv: www.biorxiv.org/cgi/content/...
Dynamic competition between bottom-up saliency and top-down goals in early visual cortex
Task-irrelevant yet salient stimuli can elicit automatic, bottom-up attentional capture and compete with top-down, goal-directed processes for neural representation. However, the temporal dynamics und...
www.biorxiv.org
August 27, 2025 at 9:16 PM
Reposted by Dan Wang
Looking forward to joining #ECVP2025 tomorrow. CAP-Lab is well represented, with 3 talks (@lassedietz.bsky.social on Monday, and @arora-borealis.bsky.social and I on Tuesday), and 2 posters (by @danwang7.bsky.social on Tuesday, and @yichen-yuan.bsky.social on Wednesday). Please come by for a chat! 💜
August 24, 2025 at 8:34 PM
Reposted by Dan Wang
🚨 New preprint: Invisible neural frequency tagging (RIFT) for the underfunded researcher:
👉 www.biorxiv.org/cgi/content/...
RIFT uses high-frequency flicker to probe attention in M/EEG with minimal stimulus visibility and little distraction. Until now, it required a costly high-speed projector.
Rapid Invisible Frequency Tagging (RIFT) with a consumer monitor: A proof-of-concept
Rapid Invisible Frequency Tagging (RIFT) enables neural frequency tagging at rates above the flicker fusion threshold, eliciting steady-state responses to flicker that is almost imperceptible. While R...
www.biorxiv.org
August 22, 2025 at 11:52 AM
Reposted by Dan Wang
Excited to present at #ECVP2025 - Monday afternoon, Learning & Memory - about how anticipating relevant visual events prepares visual processing for efficient memory-guided visual selection! 🧠🥳
@attentionlab.bsky.social @ecvp.bsky.social
Preprint for more details: www.biorxiv.org/content/10.1...
Anticipated Relevance Prepares Visual Processing for Efficient Memory-Guided Selection
Finding an object typically involves the use of working memory to prioritize relevant visual information at the right time. For example, successfully detecting a highway exit sign is useless when your...
www.biorxiv.org
August 24, 2025 at 1:47 PM
Reposted by Dan Wang
Excited to give a talk at #ECVP2025 (Tuesday morning, Attention II) on how spatially biased attention during VWM does not boost excitability the same way it does when attending to the external world, using Rapid Invisible Frequency Tagging (RIFT). @attentionlab.bsky.social @ecvp.bsky.social
August 24, 2025 at 1:14 PM
Excited to share that I’ll be presenting my poster at #ECVP2025 on August 26th (afternoon session)!
🧠✨ Our work focuses on the dynamic competition between bottom-up saliency and top-down goals in early visual cortex, using Rapid Invisible Frequency Tagging.
@attentionlab.bsky.social @ecvp.bsky.social
August 24, 2025 at 1:28 PM
Reposted by Dan Wang
I had loads of fun today, sharing thoughts and projects during a joint lab-meeting with @nadinedijkstra.bsky.social's Imagine Reality Lab. Two hours were way too short to discuss all the cool projects!
Thanks everyone for your contributions 💜
July 24, 2025 at 9:00 PM
Reposted by Dan Wang
In this Article, Paluch et al. show that unattended working memory items, as well as attended ones, are encoded in persistent activity in the medial temporal lobe. @jankaminski.bsky.social
www.nature.com/articles/s41...
Unattended working memory items are coded by persistent activity in human medial temporal lobe neurons - Nature Human Behaviour
Paluch et al. show that unattended working memory items, as well as attended ones, are encoded in persistent activity in the medial temporal lobe.
www.nature.com
July 9, 2025 at 11:33 AM
Reposted by Dan Wang
Thrilled to share our new opinion piece—hot off the press—on attentional sampling, co-authored with the magnificent Flor Kusnir and Daniele Re. It captures where our thinking has landed on this topic after years of work.
www.cell.com/trends/cogni...
Attentional sampling resolves competition along the visual hierarchy
Navigating the environment involves engaging with multiple objects, each activating specific neuronal populations. When objects appear together, these populations compete. Classical attention theories...
www.cell.com
July 9, 2025 at 9:20 AM
Reposted by Dan Wang
Last week's symposium, "Advances in the Encephalographic Study of Attention", was a great success! Held in the KNAW building in Amsterdam and sponsored by the NWO, it brought together many of Europe's leading attention researchers to discuss the latest advances in attention research using M/EEG.
June 30, 2025 at 7:13 AM
Reposted by Dan Wang
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social
Open Access link: doi.org/10.3758/s134...
June 12, 2025 at 7:21 AM
Reposted by Dan Wang
In our new MEG/RIFT study from @thechbh.bsky.social by @katduecker.bsky.social, we show that feature guidance in visual search alters neuronal excitability in early visual cortex, supporting a priority-map-based attentional mechanism.
rdcu.be/eqFX7
Guided visual search is associated with target boosting and distractor suppression in early visual cortex
Communications Biology - Magnetoencephalography in human participants paired with Rapid Invisible Frequency Tagging reveals that excitability in early visual cortex is modulated to boost targets...
rdcu.be
June 12, 2025 at 10:17 AM
Reposted by Dan Wang
Thanks to the support of the Dutch Research Council (NWO) and @knaw-nl.bsky.social, we're thrilled to announce the international symposium "Advances in the Encephalographic Study of Attention"! 🧠🔍
📅 Date: June 25th & 26th
📍 Location: Trippenhuis, Amsterdam
June 4, 2025 at 8:14 PM
Reposted by Dan Wang
Through experience, humans can learn to suppress locations that frequently contain distracting stimuli. Using SSVEPs and ERPs, this study shows that such learned suppression modulates early neural responses, indicating it occurs during initial visual processing.
www.jneurosci.org/content/jneu...
www.jneurosci.org
May 26, 2025 at 10:13 AM
Reposted by Dan Wang
Good morning #VSS2025, if you care for a chat about the role of attention in binding object features (during perceptual encoding and memory maintenance), drop by my poster now (8:30-12:30) in the pavilion (422). Hope to see you there!
May 19, 2025 at 12:36 PM
🚨 My first paper, with Samson Chota, Luzi Xu, Stefan Van der Stigchel and @suryagayet.bsky.social, accepted in Consciousness and Cognition (www.sciencedirect.com/science/arti...)
We asked whether the impact of VWM content on early visual processing depends on the priority state of the memory items.
The priority state of items in visual working memory determines their influence on early visual processing
Items held in visual working memory (VWM) influence early visual processing by enhancing memory-matching visual input. Depending on current task deman…
www.sciencedirect.com
December 20, 2024 at 12:01 AM