Will Harrison
@willjharrison.bsky.social
Vision scientist and horror film enthusiast.

I also lecture at the University of the Sunshine Coast when I'm not at the beach.
Reposted by Will Harrison
Happy to share this new paper from the lab led by Angus Chapman, now out in PLoS Biology! It presents an integrated spatiotemporal normalization model for continuous vision. @afchapman.bsky.social

journals.plos.org/plosbiology/...
A dynamic spatiotemporal normalization model captures perceptual and neural effects of spatial and temporal context
The effects of spatial and temporal context on sensory systems have mostly been studied independently of each other. This study shows that the modulation of visual perception and neural activity by th...
journals.plos.org
December 18, 2025 at 6:30 PM
The idea to use a bot (probably) to write about the idea that we could be running out of new ideas is itself a new idea, thereby proving that we are not, in fact, running out of new ideas.
This column on Vox is the biggest pile of steaming AI crap I've seen this year. Humans aren't running out of ideas. Instead, those with power and money don't want to listen to ideas that threaten their power and money.

Ideas are plentiful. Putting those ideas into action is the hard part.
We’re running out of good ideas. AI might be how we find new ones.
What if the best use of AI is restarting the world’s idea machine?
www.vox.com
December 14, 2025 at 2:11 AM
Reposted by Will Harrison
There is money just sitting on the table for a university brave enough to grab it.
The first school to market itself as AI free is going to corner the market on people interested in actually learning. And I would not be surprised if rich families and the children of people creating this tech were the first movers.
My employer, Dartmouth College, today boasts it's the 1st Ivy "to launch AI at an institutional scale." It is doing this by partnering--"more than a collaboration"--with Anthropic, a company that stole the books of many faculty, me included, which many of us are suing.
December 5, 2025 at 4:20 AM
Reposted by Will Harrison
Do you have an open working memory dataset and want it to be findable and reused? You can now add it to the Open WM Data Hub: williamngiam.github.io/OpenWMData! The collection of datasets tagged with useful metadata is steadily growing thanks to a small team of volunteers!
OpenWMData
A collection of publicly available working memory datasets
williamngiam.github.io
December 1, 2025 at 11:28 PM
Reposted by Will Harrison
Thank you to the 2025 IJE cohort who helped improve manuscripts this year via our pre-external review process!

Matthew O'Donohue: scholar.google.com.au/citations?us...
Marcell Székely: scholar.google.com/citations?hl...
Zekun Sun: zekun-sun.github.io
December 3, 2025 at 11:48 PM
Reposted by Will Harrison
New pre-print with @drbarner.bsky.social! We ask how children come to understand age. We find that young children use numerical age and facial morphology to identify who’s older, not just size, and point to acquiring a number system as key to developing an understanding of age.
osf.io/gvb46
OSF
osf.io
December 1, 2025 at 5:04 PM
Reposted by Will Harrison
My Visual Attention in Crisis paper has finally appeared, along with 30 commentaries and a response. I argue that it’s time to rethink attention from the ground up, and suggest key phenomena and possible directions. Requires access, alas. doi.org/10.1017/S014...
Visual Attention in Crisis | Behavioral and Brain Sciences | Cambridge Core
Visual Attention in Crisis - Volume 48
doi.org
November 26, 2025 at 1:23 PM
Delicious.
November 26, 2025 at 1:26 AM
Reposted by Will Harrison
despite the questionable choice of keynote speaker, i think this is likely going to be a good conference
It is our great honour to announce the APCV Keynote at #epc-apcv-2026: Prof. Hakwan Lau from the Institute for Basic Science in Korea!
The call for member-initiated symposia & abstracts is now open: visualneuroscience.auckland.ac.nz/epc-apcv-2026

#psychscisky #visionscience #neuroskyence
November 25, 2025 at 1:04 AM
One of the few papers of my own that I enjoy reading is an analysis of a wonderful image database. But I'm a nerd.

Harrison, W. J. (2022). Luminance and Contrast of Images in the THINGS Database: Perception, 51(4), 244–262. doi.org/10.1177/0301...
November 21, 2025 at 9:14 AM
Reposted by Will Harrison
Please repost! I am looking for a PhD candidate in the area of Computational Cognitive Neuroscience to start in early 2026.

The position is funded as part of the Excellence Cluster "The Adaptive Mind" at @jlugiessen.bsky.social.

Please apply here until Nov 25:
www.uni-giessen.de/de/ueber-uns...
November 4, 2025 at 1:57 PM
Reposted by Will Harrison
This is wild!
Introducing CorText: a framework that fuses brain data directly into a large language model, allowing for interactive neural readout using natural language.

tl;dr: you can now chat with a brain scan 🧠💬

1/n
November 3, 2025 at 6:12 PM
Reposted by Will Harrison
A reminder that the original Hallowe'en jack o' lanterns date back to 18th century Ireland if not earlier, were based on a shady boozy blacksmith called Stingy Jack who cheated the devil & was trapped between 2 worlds, were carved from turnips and looked like this:
October 31, 2025 at 7:59 AM
Great thread and I wanna throw in my two cents too: increasing N does not increase the chance of a false positive per se (theoretically it'll always be alpha).

However, increasing N, particularly with lots of DVs, makes very small confounds more likely to produce a statistically significant result.
There still seems to be a lot of confusion about significance testing in psych. No, p-values *don’t* become useless at large N. This flawed point also used to be framed as "too much power". But power isn't the problem – it's 1) unbalanced error rates and 2) the (lack of a) SESOI. 1/ >
November 1, 2025 at 12:56 AM
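The point about large N and tiny confounds can be made concrete with a back-of-envelope power calculation. The sketch below uses the standard normal-approximation sample-size formula for a two-sample comparison; the effect sizes are hypothetical, chosen only to illustrate the argument:

```python
def n_per_group(d, z_alpha=1.959964, z_beta=0.841621):
    """Approximate per-group N needed for a two-sample comparison to detect
    a standardized mean difference d at two-sided alpha = .05 with 80% power,
    using the normal-approximation formula N = 2 * ((z_alpha + z_beta) / d)^2."""
    return 2 * ((z_alpha + z_beta) / d) ** 2

# A plausible "real" effect vs. a tiny confound (hypothetical sizes):
print(round(n_per_group(0.50)))  # a medium effect needs only dozens per group
print(round(n_per_group(0.02)))  # a d = 0.02 confound is detected reliably by ~40,000 per group
```

So at big-data sample sizes, even a d = 0.02 confound crosses the significance threshold routinely; multiply that across many DVs and significant-but-trivial results become the expectation, not the exception.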
Reposted by Will Harrison
*I* would never say this of course, but SOME people might say it's almost like university managements are in on the grift.
Maybe the thing that gives me a pulse-pounding throbbing headache is when I think of how techbros sold ChatGPT to universities as ‘revolutionizing’ higher education and our institutions just forked over money without even asking for actual evidence
When people learn with ChatGPT instead of following their own searches, they end up knowing less, caring less, and producing worse advice, even when the facts are the same.

Friction is an essential ingredient for learning! Convenience makes us shallow.

academic.oup.com/pnasnexus/ar...
October 28, 2025 at 9:09 PM
Huge career highlight for me: I got to chat about the psychological science of horror movies with legendary actor John Jarratt, and about how he created his iconic role in Wolf Creek.
#RecreationalFear

Article written by Thomas Fowles:
www.usc.edu.au/about/unisc-...
Holy s**t, something bad is about to happen! The art vs the science of scary cinema
Dr Will Harrison and actor John Jarratt explore the psychological tricks used in Australian horror films Picnic at Hanging Rock and Wolf Creek, and why audiences want to be thrilled.
www.usc.edu.au
October 27, 2025 at 1:02 AM
Reposted by Will Harrison
As this is my final PhD paper to be published, I want to give a special shout-out to my supervisors, @willjharrison.bsky.social and Jason. There’s no way to do them justice in 300 characters, so I’ve attached an excerpt from my thesis acknowledgements that still rings true today.

11/11
October 8, 2025 at 7:13 AM
The real pain won't be felt for another decade, not that I'm convinced the US will still exist then.
October 14, 2025 at 1:07 AM
New paper looking at how manipulating a movie's image statistics can influence subsequent perceptual inference.
Thanks for reading if you've made it this far! I've had to skip a lot of the details, so if you're interested in learning more, feel free to have a read of the paper - here's the link again:

www.nature.com/articles/s41...

10/10
Investigating orientation adaptation following naturalistic film viewing - Scientific Reports
Scientific Reports - Investigating orientation adaptation following naturalistic film viewing
www.nature.com
September 29, 2025 at 11:19 PM
We got people to watch Casablanca after we manipulated the movie's image statistics in one of four ways. Emily came up with a cool way to probe people's orientation estimation during the movie. Paper now out - details in the thread 💪
The movie itself was filtered frame by frame to have contrast at a specified adaptor orientation at low spatial frequencies. We completed this process four times, so we could test different adaptor orientations (0°, 45°, 90°, and 135°).

2/10
September 29, 2025 at 10:03 PM
Punny footnote:

"It is not entirely clear what prompted the cow connection. Informally we referred to this project as the multi-site overview of oscillations (MOO) which got the ball rolling, after which we decided to milk it for all it was worth."

royalsocietypublishing.org/doi/10.1098/...
How strong is the rhythm of perception? A registered replication of Hickok et al. (2015) | Royal Society Open Science
Our ability to predict upcoming events is a fundamental component of human cognition. One way in which we do so is by exploiting temporal regularities in sensory signals: the ticking of a clock, falli...
royalsocietypublishing.org
September 18, 2025 at 1:17 AM
I'm only just learning about splitapply in MATLAB:
au.mathworks.com/help/matlab/...
au.mathworks.com
September 16, 2025 at 11:05 PM
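For anyone outside MATLAB, the split-apply pattern behind splitapply can be sketched in a few lines of Python. This is an illustrative analogue, not MATLAB's implementation; the reaction-time data are made up:

```python
from statistics import mean

def splitapply(func, data, groups):
    """Apply func to the values of data sharing each group label; return
    one result per group, in sorted-label order (mirroring how MATLAB's
    splitapply orders results by group number)."""
    buckets = {}
    for g, x in zip(groups, data):
        buckets.setdefault(g, []).append(x)
    return [func(buckets[g]) for g in sorted(buckets)]

# Hypothetical reaction times split by condition label:
rts = [0.41, 0.39, 0.52, 0.48]
cond = [1, 1, 2, 2]
out = splitapply(mean, rts, cond)
print(out)  # per-condition mean RTs
```

The same grouping step is what MATLAB's findgroups does before splitapply runs the function over each subset.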
Reposted by Will Harrison
Finally! 🤩 Our position piece: Against the Uncritical Adoption of 'AI' Technologies in Academia:
doi.org/10.5281/zeno...

We unpick the tech industry’s marketing, hype, & harm; and we argue for safeguarding higher education, critical thinking, expertise, academic freedom, & scientific integrity.
1/n
September 6, 2025 at 8:13 AM
Why everyone needs to understand signal detection theory (or at least sensitivity versus specificity).
September 3, 2025 at 12:03 AM
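A minimal illustration of that distinction, with hypothetical counts for a screening test. Sensitivity and specificity are just the two conditional accuracy rates, and signal detection theory's d' combines them into a single bias-free sensitivity measure:

```python
from statistics import NormalDist

def sdt_summary(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (hit rate), specificity (correct-rejection rate), and
    the d' implied by those rates: z(hit rate) - z(false-alarm rate)."""
    sens = hits / (hits + misses)
    spec = correct_rejections / (correct_rejections + false_alarms)
    z = NormalDist().inv_cdf
    d_prime = z(sens) - z(1 - spec)
    return sens, spec, d_prime

# Hypothetical screening test: 90 hits, 10 misses, 50 false alarms, 950 correct rejections.
sens, spec, dprime = sdt_summary(90, 10, 50, 950)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, d'={dprime:.2f}")  # d' ≈ 2.93
```

Note that sensitivity alone can always be pushed to 1.0 by calling everything positive; it only means something alongside specificity, which is the whole point of the d' framing.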