Aaron Havens
@aaronjhavens.bsky.social
PhD student at UIUC working on control theory and generative modeling. Previously an intern at FAIR NY and Preferred Networks Tokyo.
This work was done during a PhD internship at FAIR NYC, thanks to my amazing supervisors @bdamos.bsky.social, Ricky T.Q. Chen and Brian Karrer.

Special thanks to our core contributors:
@bkmi.bsky.social, Bing Yan, Xiang Fu, Guang-Horng Liu (and of course Carles Domingo-Enrich).
May 1, 2025 at 1:34 AM
Our evaluation offers a new, challenging, *amortized* sampling benchmark for molecular conformer generation.

The benchmark features real, drug-like molecules from the SPICE dataset, and we hope it drives direct and tangible progress in sampling for computational chemistry (benchmark release coming soon).
May 1, 2025 at 1:34 AM
This lets us train conditional diffusion samplers directly from expensive energy functions, namely state-of-the-art molecular foundation models, amortizing sampling across thousands of molecules. Traditional samplers, by contrast, require heavy energy-function access for every new molecular structure and every sample.
May 1, 2025 at 1:34 AM
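To make the amortization concrete, here is a minimal PyTorch-style sketch (illustrative only, not the paper's training code): one control network u_theta(x, t, cond) is shared across all conditioning inputs, and each gradient step draws a fresh batch of conditions, so energy-model queries are spread over the whole family of molecules rather than paid from scratch per molecule at sampling time. The toy quadratic energy, the ConditionalControl class, and the backprop-through-rollout objective are all stand-ins; the actual method trains with an adjoint-matching loss rather than differentiating through the simulation.

```python
# Minimal sketch (not the paper's code): amortized training of a conditional
# diffusion sampler against an energy function. The energy below is a toy
# stand-in for an expensive molecular foundation-model energy.
import torch
import torch.nn as nn

def energy(x, cond):
    # Toy energy: a conditioned quadratic well. In practice this would be an
    # expensive learned energy / force field evaluated per sample.
    return 0.5 * ((x - cond) ** 2).sum(dim=-1)

class ConditionalControl(nn.Module):
    # u_theta(x_t, t, cond): one network shared across all conditioning inputs
    # (e.g. molecular graphs), which is what makes the sampler amortized.
    def __init__(self, dim, cond_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + cond_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t, cond):
        return self.net(torch.cat([x, cond, t], dim=-1))

dim, cond_dim, sigma, n_steps = 3, 3, 1.0, 50
model = ConditionalControl(dim, cond_dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):
    cond = torch.randn(64, cond_dim)   # a fresh batch of "molecules" per step
    x = torch.zeros(64, dim)           # controlled SDE from a fixed start
    dt = 1.0 / n_steps
    for i in range(n_steps):           # Euler-Maruyama rollout
        t = torch.full((64, 1), i * dt)
        x = x + model(x, t, cond) * dt + sigma * dt ** 0.5 * torch.randn_like(x)
    # Illustrative objective only: push terminal samples toward low energy.
    # The actual method uses an adjoint-matching loss, not backprop through E.
    loss = energy(x, cond).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```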
We specialize Adjoint Matching—originally designed for reward fine-tuning—to the sampling setting.

By exploiting a factorization of the optimal transition density (a Schrödinger bridge), our new loss enables heavy reuse of simulations and energy evaluations.
May 1, 2025 at 1:34 AM
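To see why that factorization buys so much reuse, here is a rough sketch continuing the hypothetical setup above: the expensive steps (simulating the controlled SDE to an endpoint X_1 and evaluating the energy gradient there) happen once per buffer refresh, while each training step only resamples an intermediate state from an analytic Brownian bridge and regresses the control against cached quantities. The regression target below is a placeholder, not the actual Adjoint Matching target; the point is only that no new simulation or energy evaluation is needed per gradient step.

```python
# Minimal sketch (not the paper's algorithm): buffer-based reuse of simulated
# endpoints and cached energy gradients, with intermediate states resampled
# from an analytic Brownian bridge.
import torch

sigma = 1.0
buffer = []  # cached (x1, grad_E_x1, cond) tuples from occasional expensive rounds

def expensive_round(model, energy, cond, n_steps=50):
    # Simulate the controlled SDE once and evaluate the energy gradient once.
    x = torch.zeros_like(cond)
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = torch.full((x.shape[0], 1), i * dt)
        with torch.no_grad():
            x = x + model(x, t, cond) * dt + sigma * dt ** 0.5 * torch.randn_like(x)
    x1 = x.requires_grad_(True)
    grad_E = torch.autograd.grad(energy(x1, cond).sum(), x1)[0]
    buffer.append((x1.detach(), grad_E.detach(), cond))

def cheap_regression_step(model, opt):
    # Reuse cached endpoints: sample t and a bridge state X_t | X_0 = 0, X_1 = x1,
    # then regress the control toward a target built only from cached quantities.
    x1, grad_E, cond = buffer[torch.randint(len(buffer), (1,)).item()]
    t = torch.rand(x1.shape[0], 1)
    mean, std = t * x1, sigma * (t * (1 - t)).clamp_min(1e-6).sqrt()
    xt = mean + std * torch.randn_like(x1)
    # Placeholder target (NOT the Adjoint Matching target): it only shows that
    # the target depends on cached x1 and grad_E, not on new energy calls.
    target = (x1 - xt) / (1 - t).clamp_min(1e-3) - sigma ** 2 * grad_E
    loss = ((model(xt, t, cond) - target) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

In this picture the buffer is refreshed occasionally with expensive_round while many cheap_regression_step calls run in between, which is where the claimed reuse of simulations and energy evaluations comes from.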