Bao Pham
@baopham.bsky.social
PhD Student at RPI. Interested in Hopfield/Associative Memory models and energy-based models.
Pinned
Bao Pham
@baopham.bsky.social
· Dec 5
Diffusion models create beautiful novel images, but they can also memorize samples from the training set. How does this blending of features enable the creation of novel patterns? Our new work in the Sci4DL workshop at #neurips2024 shows that diffusion models behave like Dense Associative Memory networks.
Reposted by Bao Pham
I am excited to announce the call for papers for the New Frontiers in Associative Memories workshop at ICLR 2025. New architectures and algorithms, memory-augmented LLMs, energy-based models, Hopfield nets, AM and diffusion, and many other topics.
Website: nfam.vizhub.ai
@iclr-conf.bsky.social
January 14, 2025 at 4:56 PM
Reposted by Bao Pham
Most of the work on Dense Associative Memory (DenseAM) thus far has focused on the regime when the amount of data (number of memories) is below the critical memory storage capacity. We are beginning to explore the opposite limit, when the data is large.
Diffusion models create beautiful novel images, but they can also memorize samples from the training set. How does this blending of features enable the creation of novel patterns? Our new work in the Sci4DL workshop at #neurips2024 shows that diffusion models behave like Dense Associative Memory networks.
December 5, 2024 at 6:19 PM
Reposted by Bao Pham
Diffusion models create beautiful novel images, but they can also memorize samples from the training set. How does this blending of features enable the creation of novel patterns? Our new work in the Sci4DL workshop at #neurips2024 shows that diffusion models behave like Dense Associative Memory networks.
December 5, 2024 at 5:29 PM