Kabir Arora
@arora-borealis.bsky.social
PhD Candidate studying the 🧠
@AttentionLab, Utrecht University

Attention | Visual Working Memory | EEG | Rapid Invisible Frequency Tagging
Together with recent work by @olaf.dimigen.de on using RIFT with a monitor setup, and our own new preprint on tracking attention with a monitor setup (www.biorxiv.org/content/10.1...), RIFT is now far more accessible, both in terms of available recommendations and materials.
Tracking attention using RIFT with a consumer-monitor setup
Rapid Invisible Frequency Tagging (RIFT) is a recent technique that extends the traditional frequency tagging approach by stimulating at frequencies beyond the threshold of perception (≥60 Hz). By doin...
www.biorxiv.org
October 29, 2025 at 10:52 AM
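The link card above captures the core idea: drive a stimulus sinusoidally at a rate too fast to perceive, then read the tagged response back out of the recorded spectrum. Below is a minimal, self-contained sketch of that readout logic, not the preprint's actual pipeline; the 64 Hz tag, 1 kHz sampling rate, and signal-to-noise values are illustrative assumptions.

```python
# Minimal frequency-tagging sketch (illustrative assumptions throughout:
# the 64 Hz tag, sampling rate, and amplitudes are not from the preprint).
import numpy as np
from scipy import signal

fs = 1000.0          # sampling rate of the recorded signal, Hz (assumed)
tag_freq = 64.0      # tagging frequency above the flicker-fusion threshold (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)   # 10 s of simulated recording

# Model the neural response as a small sinusoid at the tag frequency
# buried in broadband noise.
rng = np.random.default_rng(0)
neural = 0.1 * np.sin(2 * np.pi * tag_freq * t) + rng.standard_normal(t.size)

# Welch power spectrum: the tagged response appears as a narrow peak at 64 Hz.
freqs, psd = signal.welch(neural, fs=fs, nperseg=int(4 * fs))
band = (freqs >= 55) & (freqs <= 75)
peak = freqs[band][np.argmax(psd[band])]
print(f"Peak in the 55-75 Hz band: {peak:.2f} Hz")  # ~64 Hz
```

Because the tag sits above the perceptual threshold, the flicker itself is invisible to participants while remaining trivially detectable in the spectrum, which is what makes the technique attractive for attention studies.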
We offer advice and data-driven recommendations on hardware, experimental design, and analysis considerations.

If you're a cognition researcher considering using RIFT, it just got a lot easier!
October 29, 2025 at 10:52 AM
Planning on running a RIFT study? In a new manuscript, we put together the RIFT know-how accumulated over the years by multiple labs (@lindadrijvers.bsky.social, @schota.bsky.social, @eelkespaak.bsky.social, with Cecília Hustá and others).

Preprint: osf.io/preprints/ps...
October 29, 2025 at 10:52 AM
Reposted by Kabir Arora
Spatial attention and working memory are widely thought to be tightly coupled. Yet distinct neural activity tracks attentional breadth and WM load.

In a new paper @jocn.bsky.social, we show that pupil size independently tracks breadth and load.

doi.org/10.1162/JOCN...
October 14, 2025 at 2:04 PM
Reposted by Kabir Arora
If you were as unfortunate as me and missed King Kabir (@arora-borealis.bsky.social)'s talk at #ECVP2025 on the differences (in early visual processing) between internal and external attentional selection... no worries, you can find the paper here: share.google/TDIZCDK9puB6...
August 28, 2025 at 11:44 AM
Reposted by Kabir Arora
🧠 Excited to share that our new preprint is out!🧠
In this work, we investigate the dynamic competition between bottom-up saliency and top-down goals in the early visual cortex using rapid invisible frequency tagging (RIFT).

📄 Check it out on bioRxiv: www.biorxiv.org/cgi/content/...
Dynamic competition between bottom-up saliency and top-down goals in early visual cortex
Task-irrelevant yet salient stimuli can elicit automatic, bottom-up attentional capture and compete with top-down, goal-directed processes for neural representation. However, the temporal dynamics und...
www.biorxiv.org
August 27, 2025 at 9:16 PM
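The design described in the preprint above rests on tagging competing stimuli at distinct frequencies, so that each stimulus's cortical response can be tracked separately in the same recording. A hedged sketch of that two-stream logic follows; the 60/66 Hz tags and the attention gain are made-up illustration values, not the study's parameters.

```python
# Two-stream tagging sketch: separate tags per stimulus let us index which
# stream dominates the response (all parameters here are assumptions).
import numpy as np

fs = 1000.0
f_target, f_distractor = 60.0, 66.0   # distinct tag per stimulus (assumed)
t = np.arange(0.0, 8.0, 1.0 / fs)
rng = np.random.default_rng(1)

attn_gain = 1.5   # hypothetical top-down boost for the attended stream
x = (attn_gain * 0.1 * np.sin(2 * np.pi * f_target * t)
     + 0.1 * np.sin(2 * np.pi * f_distractor * t)
     + rng.standard_normal(t.size))

# FFT amplitude at each tag frequency; their ratio indexes the relative
# strength of the two tagged responses.
amps = np.abs(np.fft.rfft(x)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

def amp_at(f):
    return amps[np.argmin(np.abs(freqs - f))]

ratio = amp_at(f_target) / amp_at(f_distractor)
print(f"Target/distractor tagged amplitude: {ratio:.2f}")  # ~1.5 with this gain
```

Comparing the two tagged amplitudes over time is what allows the temporal dynamics of the saliency-versus-goals competition to be traced in early visual cortex.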
Reposted by Kabir Arora
And now without Bluesky making the background black...
August 24, 2025 at 8:57 PM
Reposted by Kabir Arora
Excited to share that I’ll be presenting my poster at #ECVP2025 on August 26th (afternoon session)!

🧠✨ Our work focuses on the dynamic competition between bottom-up saliency and top-down goals in early visual cortex, using Rapid Invisible Frequency Tagging

@attentionlab.bsky.social @ecvp.bsky.social
August 24, 2025 at 1:28 PM
Excited to give a talk at #ECVP2025 (Tuesday morning, Attention II) on how spatially biased attention during VWM does not boost excitability in the same way it does when attending to the external world, using Rapid Invisible Frequency Tagging (RIFT). @attentionlab.bsky.social @ecvp.bsky.social
August 24, 2025 at 1:14 PM
Reposted by Kabir Arora
Looking forward to @compcogneuro.bsky.social's #CCN2025, which takes place in my backyard this year.

If you are there as well, hit me up for a chat, and go and visit
@lassedietz.bsky.social, @liangyouzhang.bsky.social, and
@arora-borealis.bsky.social's posters on Tue/Wed/Fri.

[1/2]
August 11, 2025 at 1:13 PM