@hchateaulaurent.bsky.social
Thanks to Frédéric Alexandre for his wonderful supervision, Dolton Fernandes for the UHN pointer, Thierry Vieville for helping with the math proofs, and Inria for the funding! Can't wait to see you at
@neuripsconf.bsky.social (Poster #2210 13/12 4.30pm @ East Exhibit Hall A-C)
December 12, 2024 at 6:45 PM
These results suggest that better choices of similarity and separation functions could be used to improve the performance of Neural Episodic Control in RL tasks. You're very welcome to experiment with these choices and let us know what you find!
...where a trial counts as correct when the retrieval output is closer to the correct memory than to any other memory. Under this criterion, the Max separation function yields the best results. We also show that softmax almost always outperforms k-Max.
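As an illustrative sketch (not code from the paper), the three separation functions compared above, Max, k-Max, and softmax, can be written as functions that turn similarity scores into retrieval weights; the `beta` sharpness parameter and the uniform weighting inside k-Max are assumptions for this sketch:

```python
import numpy as np

def max_sep(sim):
    # Max: put all weight on the single most similar memory
    w = np.zeros_like(sim)
    w[np.argmax(sim)] = 1.0
    return w

def kmax_sep(sim, k=2):
    # k-Max: spread weight uniformly over the k most similar memories
    w = np.zeros_like(sim)
    w[np.argsort(sim)[-k:]] = 1.0 / k
    return w

def softmax_sep(sim, beta=1.0):
    # Softmax: soft weighting over all memories, sharper as beta grows
    e = np.exp(beta * (sim - np.max(sim)))  # shift for numerical stability
    return e / e.sum()
```

All three return weights summing to 1; Max is the hardest separation, softmax the softest.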
We also show that, contrary to a conjecture by Millidge et al. (2022), the Max function (selecting the most similar memory) does not always yield optimal performance when an error threshold is used to assess whether retrieval is correct. We introduce a new criterion...
We empirically evaluate the associative memory performance of the DND on MNIST, CIFAR-10 and Tiny ImageNet. We show that performance can be improved by replacing the Euclidean similarity with the Manhattan similarity.
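As a minimal sketch of the comparison above (not code from the paper), the two similarity functions differ only in the distance norm; both are written so that higher means more similar:

```python
import numpy as np

def euclidean_similarity(query, keys):
    # Negative L2 distance from the query to each stored key
    return -np.linalg.norm(keys - query, axis=1)

def manhattan_similarity(query, keys):
    # Negative L1 distance; the thread reports this improves DND retrieval
    return -np.sum(np.abs(keys - query), axis=1)
```

For a noiseless cue the two agree on the best match; they rank memories differently once the cue is corrupted, which is where the choice matters.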
...evaluate the Q-value from a sensory observation. We thus derive two energy functions for the DND. The implications are twofold: DND can store any vector-based information (e.g. ImageNet images) and other (better?) UHN instances could be used to improve NEC in RL tasks.
The UHN framework (Millidge et al. 2022) states that Hopfield nets and many other models successively apply similarity, separation and projection operations to retrieve a memory from a cue. We show that the DND is an instance of this framework, performing similar operations to...
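The similarity → separation → projection pipeline described above can be sketched in a few lines of NumPy. This is an illustrative toy (not the paper's implementation), with negative squared Euclidean distance as the similarity, softmax as the separation, and a `beta` sharpness parameter, all assumptions for the sketch:

```python
import numpy as np

def uhn_retrieve(query, keys, values, beta=1.0):
    # Similarity: score each stored key against the cue
    sim = -np.sum((keys - query) ** 2, axis=1)
    # Separation: sharpen the scores into retrieval weights (softmax)
    w = np.exp(beta * (sim - np.max(sim)))
    w /= w.sum()
    # Projection: weighted combination of the stored values
    return w @ values
```

Swapping the distance (e.g. Manhattan) or the separation (e.g. Max, k-Max) gives the other UHN instances discussed in the thread.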