Bao Pham
@baopham.bsky.social
PhD student at RPI. Interested in Hopfield/Associative Memory models and energy-based models.
In the low training-data regime (a small number of stored memories), diffusion models memorize. As the dataset grows, spurious states emerge, signaling that stored features are blending into new combinations, and this blending is what enables generalization. It is how such models create novel outputs in the high-data regime.
December 5, 2024 at 5:29 PM
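For intuition, here is a minimal numerical sketch of that regime change in a Dense Associative Memory (my own toy illustration, not the code or model from the paper). It iterates the standard modern-Hopfield retrieval update xi <- X softmax(beta X^T xi): at large inverse temperature beta the fixed point sits on a single stored memory (memorization), while at small beta the fixed point is a mixture of memories, a spurious state loosely analogous to the feature blending described above. The memories, query, and beta values are all assumed for the demo.

```python
# Minimal Dense Associative Memory retrieval sketch (illustrative, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def retrieve(X, query, beta, steps=50):
    """Iterate the modern Hopfield update from an initial query state.

    X    : (d, K) matrix whose K columns are the stored memories.
    beta : inverse temperature; large beta gives sharp, single-memory recall.
    """
    xi = query.copy()
    for _ in range(steps):
        logits = beta * (X.T @ xi)         # similarity of the state to each memory
        w = np.exp(logits - logits.max())  # numerically stable softmax over memories
        w /= w.sum()
        xi = X @ w                         # move to a convex blend of memories
    return xi, w

d, K = 64, 5                               # few memories: the "low data" regime
X = rng.standard_normal((d, K))
X /= np.linalg.norm(X, axis=0)             # unit-norm memories so betas are comparable
query = X[:, 0] + 0.05 * rng.standard_normal(d)  # noisy copy of memory 0

# Large beta: the fixed point collapses onto one stored memory (memorization).
_, w_sharp = retrieve(X, query, beta=20.0)
# Small beta: the fixed point mixes several memories -- a spurious, blended state.
_, w_blend = retrieve(X, query, beta=1.0)

print("beta=20 weights:", np.round(w_sharp, 3))  # ~one-hot on memory 0
print("beta=1  weights:", np.round(w_blend, 3))  # spread across memories
```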
Diffusion models create beautiful novel images, but they can also memorize samples from the training set. How does the blending of stored features allow these models to create novel patterns? Our new work at the Sci4DL workshop at #neurips2024 shows that diffusion models behave like Dense Associative Memory networks.
December 5, 2024 at 5:29 PM
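One way to see the claimed correspondence (a hedged sketch under a textbook assumption, not the paper's derivation): if the data density is modeled as a Gaussian mixture centered on the training memories, then the score grad log p(xi) that drives diffusion sampling pulls the state toward a softmax-weighted blend of the memories, which is exactly the Dense Associative Memory update direction with inverse temperature beta = 1/sigma^2. The dimensions, memories, and sigma values below are illustrative.

```python
# Hedged sketch: score of a Gaussian mixture over memories = a Hopfield-style pull.
import numpy as np

def score(xi, X, sigma):
    """Score of p(xi) = (1/K) * sum_mu N(xi; x_mu, sigma^2 I).

    X: (d, K) stored memories as columns; sigma: diffusion noise scale.
    Returns grad log p(xi) = (softmax-blend of memories - xi) / sigma^2.
    """
    sq = ((X - xi[:, None]) ** 2).sum(axis=0)  # squared distance to each memory
    logits = -sq / (2 * sigma ** 2)
    w = np.exp(logits - logits.max())          # stable softmax over memories
    w /= w.sum()
    return (X @ w - xi) / sigma ** 2           # pull toward the weighted blend

rng = np.random.default_rng(1)
d, K = 32, 4
X = rng.standard_normal((d, K))
xi = rng.standard_normal(d)

# Small sigma (large beta = 1/sigma^2): the pull targets the single nearest
# memory -> memorization. Large sigma: the pull targets a blend of memories
# -> spurious, novel combinations.
for sigma in (0.3, 3.0):
    step = xi + sigma ** 2 * score(xi, X, sigma)  # one denoising-style step: step = X @ w
    sims = X.T @ step / (np.linalg.norm(X, axis=0) * np.linalg.norm(step))
    print(f"sigma={sigma}: cosine sims to memories -> {np.round(sims, 2)}")
```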