William Ngiam | 严祥全
@williamngiam.github.io
Cognitive Neuroscientist at Adelaide University | Perception, Attention, Learning and Memory Lab (https://palm-lab.github.io) | Open Practices Editor at Attention, Perception, & Psychophysics | ReproducibiliTea | http://williamngiam.github.io
Hi Eiko! Will you be attending AIMOS? If so, I'd love to catch you during a break and perhaps chat all things measurement schmeasurement / jingle-jangle and the role of formal models in building theories in psychology. I have a few favourite cafes in Sydney too! 🙏
November 12, 2025 at 4:07 AM
For the Feyerabend anarchists:
November 9, 2025 at 4:39 AM
me: finding a recent and relevant Chaz phil vis paper that I had missed through zohran-posting
November 9, 2025 at 3:37 AM
I'll be presenting this work at the Australasian Society of Philosophy and Psychology conference (and alluding to it at the Australasian Cognitive Neuroscience Society conference) #ACNS. Would love to chat with anyone interested in this work! Please feel free to reach out! /fin
November 3, 2025 at 7:34 AM
So, what is a working memory representation? More from me to come on that, including further development of these models, a couple of #cogneuro projects in prep, and a follow up on my "theory map" piece. But for now, please check out our preprint: osf.io/fm9vz /10
November 3, 2025 at 7:34 AM
We think we have provided a principled modeling approach that can clarify the interactions of stimulus-specific effects and more general mechanisms in VWM. Our approach can help #cogsci better define cognitive models of representations and mechanisms as explanations of task performance. /9
November 3, 2025 at 7:34 AM
We also find that the recovered representation is improved by refining the cognitive mechanisms (here, swap errors in memory reproduction). This emphasizes a key point: the representation should be incorporated into cognitive models, not simply offloaded to similarity judgments. /8
November 3, 2025 at 7:34 AM
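The swap errors mentioned in the post above are commonly formalised as a mixture: with some probability, a reproduction response is centred on a non-target item rather than the target. A minimal sketch of that likelihood in Python (the von Mises form and the parameter names `kappa` and `p_swap` are illustrative assumptions, not taken from the preprint):

```python
import numpy as np
from scipy.special import i0

def vonmises_pdf(x, kappa):
    # von Mises density centred at 0 with concentration kappa
    return np.exp(kappa * np.cos(x)) / (2 * np.pi * i0(kappa))

def swap_mixture_loglik(responses, targets, nontargets, kappa, p_swap):
    """Log-likelihood of reproduction responses under a target-plus-swap
    mixture: with probability p_swap the response is centred on one of
    the non-target items instead of the target (illustrative sketch)."""
    target_like = vonmises_pdf(responses - targets, kappa)
    # average the likelihood over the non-target items on each trial
    swap_like = vonmises_pdf(responses[:, None] - nontargets, kappa).mean(axis=1)
    return np.log((1 - p_swap) * target_like + p_swap * swap_like).sum()
```

Maximising this over `kappa` and `p_swap`, alongside the parameters of the representation, is one way to refine the mechanism side of such a model.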
As per Schurgin et al. (2020), we find that the cognitive representation underlying all tasks does not match the physical stimulus space. But we also find that the representation differs between the similarity comparison task and the two reproduction tasks; similarity is not the basis for working memory. /7
November 3, 2025 at 7:34 AM
Our models fit the cognitive representation jointly with the theorised mechanisms using the task data itself, rather than treating the representation as independent of the model (or assuming that the psychological representation is similarity-based). The model code can be found here: /6
GitHub - mdlee/orientationModeling
github.com
November 3, 2025 at 7:34 AM
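To illustrate the joint-fitting idea in the post above (a sketch under assumptions, not the preprint's actual model, which is in the linked repository): a representation parameter, here a similarity scale `tau`, can be estimated from the reproduction data itself together with a mechanism parameter `beta`, rather than being fixed in advance from separate similarity judgments.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

# Illustrative sketch: responses on a discrete orientation grid follow a
# softmax over psychological similarity to the target, exp(-d / tau).
# The representation parameter tau and the mechanism parameter beta are
# estimated jointly from the task data.
n_resp = 180
grid = np.linspace(0.0, np.pi, n_resp, endpoint=False)

def circ_dist(a, b):
    d = np.abs(a - b)
    return np.minimum(d, np.pi - d)

def negloglik(params, responses, targets):
    tau, beta = np.exp(params)  # log-parameterisation keeps both positive
    logits = beta * np.exp(-circ_dist(grid[None, :], targets[:, None]) / tau)
    logp = logits - logsumexp(logits, axis=1, keepdims=True)
    idx = np.round(responses / np.pi * n_resp).astype(int) % n_resp
    return -logp[np.arange(len(responses)), idx].sum()

# Simulate from the model, then recover both parameters at once.
rng = np.random.default_rng(0)
targets = rng.uniform(0.0, np.pi, 300)
true_tau, true_beta = 0.3, 6.0
logits = true_beta * np.exp(-circ_dist(grid[None, :], targets[:, None]) / true_tau)
p = np.exp(logits - logsumexp(logits, axis=1, keepdims=True))
responses = np.array([rng.choice(grid, p=row) for row in p])

fit = minimize(negloglik, x0=np.log([0.5, 1.0]), args=(responses, targets))
tau_hat, beta_hat = np.exp(fit.x)
```

The point of the sketch is only that both parameters enter one likelihood, so the data constrain the representation and the mechanism simultaneously.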
We fit models for three separate tasks: a similarity comparison task, a perceptual reproduction task, and a working memory reproduction task. The dataset comes from Tomic and Bays (2024): doi.org/10.1037/xlm0.... Thanks to @ivntmc.bsky.social and @bayslab.org for providing the open dataset. /5
November 3, 2025 at 7:34 AM
Recent work has found that adjusting for psychological similarity, whether via an approximation or empirical measures, does not lead to better model fits of VWM performance. Oberauer (2023): psycnet.apa.org/doi/10.1037/...

But the question remains: what is the cognitive representation underlying VWM? /4
November 3, 2025 at 7:34 AM
Like many other formal models of cognition, the TCC model collects similarity judgments and applies multidimensional scaling (MDS) to derive the psychological representation. The cognitive mechanisms are then built upon that MDS representation to predict performance on the cognitive task. /3
Target Confusability Competition Model
bradylab.ucsd.edu
November 3, 2025 at 7:34 AM
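The MDS step described in the post above can be sketched in a few lines (the similarity matrix here is synthetic, standing in for actual judgment data):

```python
import numpy as np
from sklearn.manifold import MDS

# Synthetic similarity judgments for 8 orientations (illustrative only):
# similarity decays exponentially with circular distance.
n = 8
angles = np.arange(n) * (np.pi / n)
d = np.abs(angles[:, None] - angles[None, :])
d = np.minimum(d, np.pi - d)                 # circular distance
dissimilarity = 1.0 - np.exp(-3.0 * d)

# Derive a low-dimensional psychological representation from the judged
# dissimilarities; models like TCC then build mechanisms on top of it.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)
```

The resulting coordinates, not the physical stimulus values, are what the downstream cognitive mechanisms operate on in this style of model.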
The issue of defining the appropriate representation is core to the field of visual #workingmemory. Schurgin et al. (2020) introduced the TCC model and influentially argued that models of VWM should be in terms of psychological similarity, rather than in terms of the physical stimulus space. /2
Psychophysical scaling reveals a unified theory of visual memory strength - Nature Human Behaviour
Schurgin et al. propose a model of visual memory, arguing against a distinction between how many items are represented and how precisely they are represented, and in favour of a view based on continuo...
doi.org
November 3, 2025 at 7:34 AM
A bit of a stretch, but how about SoundCheck? Mainly for the pun; it could imply that research with open materials is more sound...
October 14, 2025 at 11:23 PM