jcrwhittington.bsky.social
The job is still available despite what it says on the link! #neurojobs
November 1, 2025 at 7:48 PM
Great weather, a beautiful city, and a magical conference venue!
February 25, 2025 at 5:46 PM
Keynote Speakers:
Prof Jay McClelland
Prof Eve Marder
Prof Wolfgang Maass
Prof Christine Constantinople
Prof @kanakarajanphd.bsky.social (@kempnerinstitute.bsky.social)
Dr Ida Momennejad @neuroai.bsky.social
Dr Kim Stachenfeld @neurokim.bsky.social
Prof Jakob Foerster
February 25, 2025 at 5:46 PM
probs relevant to the story: arxiv.org/pdf/2112.04035
January 16, 2025 at 9:26 AM
Sent via email :)
December 9, 2024 at 9:49 AM
Two separate models with different architectures, but their learned mechanisms - which also look different - can be seen in the same light. Don't model interactions
December 8, 2024 at 6:57 PM
Thank you!
December 8, 2024 at 4:18 PM
December 8, 2024 at 3:28 PM
Overall, a new unifying mechanistic theory of PFC / RNNs / SSMs, as well as a unifying understanding of cognitive maps in neural networks. (9/9)
December 8, 2024 at 3:27 PM
Now how’s this all related to cognitive maps and hippocampus? We show that this slot algorithm is exactly the same maths as TEM, just with some equations reshaped and rearranged. Frontal and temporal cognitive maps get unified! (8/9)
December 8, 2024 at 3:27 PM
Slots don’t just have to be flat, they can be hierarchical. We show that in a hierarchical task, RNNs learn hierarchical slots! A prediction for experimentalists to test 😀. (7/9)
December 8, 2024 at 3:27 PM
Slots work for structured behavioural tasks too (Basu et al & El-Gaby et al). A continuous manifold of slots accounts for PFC neurons, including ‘progress’ and ‘structured memory buffer’ neurons. These ‘behavioural’ tasks are really just working memory tasks in disguise! (6/9)
December 8, 2024 at 3:27 PM
Armed with this theory, we can reinterpret PFC representations as slots! Not just in simple sequence repetition tasks (Xie et al), but also cue-dependent recall tasks (Panichello et al). Here the cue acts like a velocity signal to move memories between slots 😀. (5/9)
December 8, 2024 at 3:27 PM
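The cue-as-velocity idea above can be illustrated with a minimal NumPy sketch. This is not the paper's model, just an assumed toy setup: memories sit in orthogonal slot subspaces, and the cue selects which slot-to-slot shift to apply, bringing the cued memory into the readout slot.

```python
import numpy as np

rng = np.random.default_rng(1)
n_slots, slot_dim = 3, 4
n = n_slots * slot_dim

# Random orthonormal basis; consecutive column blocks are the slot subspaces.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
slots = [Q[:, i * slot_dim:(i + 1) * slot_dim] for i in range(n_slots)]

def shift(k):
    # Transition moving slot i's contents into slot (i - k) mod n_slots:
    # cue k brings the memory stored k slots away into readout slot 0.
    return sum(slots[(i - k) % n_slots] @ slots[i].T for i in range(n_slots))

# Store one item per slot, superposed in a single population state.
items = [rng.standard_normal(slot_dim) for _ in range(n_slots)]
state = sum(slots[i] @ items[i] for i in range(n_slots))

cue = 2                                    # "recall item 2"
recalled = slots[0].T @ (shift(cue) @ state)
```

Because the subspaces are orthogonal, the shift leaves the other stored items untouched; only which slot each item occupies changes.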
And slots are exactly what randomly initialised RNNs/SSMs actually learn! This holds for many, many task structures and sizes (not just the one shown in the figure here). (4/9)
December 8, 2024 at 3:27 PM
Our new slot algorithm works for any recurrent network (PFC / RNN / SSM) on any structured sequence memory task. This is a big generalisation of prior mechanisms (e.g. Botvinick or Ganguli) that could only repeat the exact sequence they were presented with. (3/9)
December 8, 2024 at 3:27 PM
What’s the new mechanism? Different memories get stored in different neural subspaces, or ‘slots’. Importantly, slots are connected to each other (mirroring task connectivity!) so that memories get passed between slots, allowing the right memory to be read out at the right time! (2/9)
December 8, 2024 at 3:27 PM
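The slot mechanism described above can be sketched in a few lines of NumPy. This is an assumed toy construction, not the trained networks from the paper: slots are orthogonal subspaces of a neural population, and a fixed recurrent weight matrix cycles memories between slots so the readout slot holds the right memory at the right time.

```python
import numpy as np

rng = np.random.default_rng(0)
n_slots, slot_dim = 3, 4
n_neurons = n_slots * slot_dim

# Random orthonormal basis; consecutive column blocks form the slot subspaces.
Q, _ = np.linalg.qr(rng.standard_normal((n_neurons, n_neurons)))
slots = [Q[:, i * slot_dim:(i + 1) * slot_dim] for i in range(n_slots)]

# Recurrent weights that pass the contents of slot i into slot i+1,
# mirroring a cyclic task structure.
W = sum(slots[(i + 1) % n_slots] @ slots[i].T for i in range(n_slots))

# Store a memory in slot 0, then run the recurrent dynamics.
memory = rng.standard_normal(slot_dim)
state = slots[0] @ memory

for t in range(n_slots):
    # The readout slot (slot 0) recovers the memory once per cycle.
    decoded = slots[0].T @ state
    print(t, np.allclose(decoded, memory))
    state = W @ state
```

Because the slot subspaces are orthonormal and `W` is a block permutation of them, the memory is passed around losslessly and returns to the readout slot after `n_slots` steps.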