Aditi Krishnapriyan
@ask1729.bsky.social
Assistant Professor at UC Berkeley
7/ This was a very fun project with Ishan Amin and Sanjeev Raja, and will appear at #ICLR2025! Paper and code below:

Paper: openreview.net/forum?id=1du...

Code: github.com/ASK-Berkeley...
Towards Fast, Specialized Machine Learning Force Fields: Distilling... (openreview.net)
March 13, 2025 at 3:06 PM
6/ The distilled MLFFs are much faster to run than the original large-scale MLFF: not everyone has the GPU resources to use big models, and many scientists only care about studying specific systems (w/ the correct physics!). This is a way to get the best of all worlds!
March 13, 2025 at 3:06 PM
5/ We can also balance efficient training at scale (often w/ minimal constraints) with distilling the correct physics into the small MLFF at test time: e.g., taking energy gradients to get conservative forces and ensuring energy conservation for molecular dynamics.
March 13, 2025 at 3:06 PM
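To illustrate the conservative-forces point in 5/: a minimal PyTorch sketch of getting forces as the negative gradient of a predicted scalar energy, so the force field is conservative by construction. The `energy_model` callable and the toy quadratic energy are illustrative placeholders, not the MLFF from the paper.

```python
import torch

def conservative_forces(energy_model, positions):
    """Forces as the negative gradient of a predicted scalar energy.

    Differentiating the energy w.r.t. atomic positions yields a
    conservative (curl-free) force field, which helps keep energy
    conserved during molecular dynamics.
    """
    pos = positions.clone().requires_grad_(True)
    energy = energy_model(pos)  # scalar energy prediction
    (grad,) = torch.autograd.grad(energy, pos, create_graph=True)
    return energy, -grad  # F = -dE/dx

# Hypothetical usage with a toy quadratic energy standing in for a small MLFF:
toy_energy = lambda pos: (pos ** 2).sum()
pos = torch.randn(8, 3)  # 8 atoms in 3D
energy, forces = conservative_forces(toy_energy, pos)
```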
4/ Smaller, specialized MLFFs distilled from the large-scale model are more accurate than models trained from scratch on the same subset of data: the representations from the large-scale model help boost performance, while the smaller models are much faster to run.
March 13, 2025 at 3:06 PM
3/ We formulate our distillation procedure as follows: the smaller MLFF is trained to match the Hessians of the energy predictions of the large-scale model (using subsampling methods to improve efficiency). This works better than distillation methods that try to match features.
March 13, 2025 at 3:06 PM
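A minimal sketch of the Hessian-matching idea in 3/, assuming PyTorch teacher and student models that map atomic positions to a scalar energy. Rows of the energy Hessian are taken as gradients of individual components of dE/dx, and only a random subsample of rows is matched per step; the function names and the uniform row subsampling here are assumptions for illustration, not the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def hessian_rows(model, positions, row_idx):
    """Selected rows of the energy Hessian d2E/dx2 via double backprop.

    Each row is the gradient of one component of dE/dx, so the full
    (3N x 3N) Hessian is never materialized.
    """
    pos = positions.clone().requires_grad_(True)
    energy = model(pos)
    (grad,) = torch.autograd.grad(energy, pos, create_graph=True)
    grad = grad.reshape(-1)
    rows = [
        torch.autograd.grad(grad[i], pos, retain_graph=True,
                            create_graph=True)[0].reshape(-1)
        for i in row_idx
    ]
    return torch.stack(rows)

def hessian_distill_loss(teacher, student, positions, n_rows=8):
    """Match a random subsample of the teacher's Hessian rows."""
    row_idx = torch.randperm(positions.numel())[:n_rows]
    teacher_rows = hessian_rows(teacher, positions, row_idx).detach()
    student_rows = hessian_rows(student, positions, row_idx)
    return F.mse_loss(student_rows, teacher_rows)
```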
2/ Model distillation involves transferring the general-purpose representations learned by a large-scale model into smaller, faster models: in our case, specialized to specific regions of chemical space. We can use these faster MLFFs for a variety of downstream tasks.
March 13, 2025 at 3:06 PM
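For context on the general pattern in 2/, a bare-bones sketch of distilling a frozen large-scale teacher into a small student on a specialized data subset. The distillation term here simply matches teacher energy predictions as a stand-in; the paper's actual objective matches energy Hessians (see 3/ above). `teacher`, `student`, and `loader` are hypothetical placeholders.

```python
import torch
import torch.nn.functional as F

def distill_student(teacher, student, loader, epochs=10, lr=1e-3):
    """Distill a frozen large-scale teacher MLFF into a small student
    on a specialized subset of chemical space (illustrative loop only)."""
    teacher.eval()
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        for positions, energy_label in loader:  # specialized data subset
            with torch.no_grad():
                teacher_energy = teacher(positions)  # frozen teacher supervision
            student_energy = student(positions)
            # Ground-truth term plus a simple output-matching distillation term;
            # the paper's distillation term matches energy Hessians instead.
            loss = F.mse_loss(student_energy, energy_label) + \
                   F.mse_loss(student_energy, teacher_energy)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student
```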
😃
November 16, 2024 at 4:47 PM
Would also appreciate being added, thanks!
November 16, 2024 at 4:46 PM