lenz3000.bsky.social
@lenz3000.bsky.social
Whoop whoop, this got accepted at NeurIPS 2025 🎉🎉🎉
I'll post an update once the revised version is out
September 18, 2025 at 3:46 PM
If you care about improving the performance of Flow Matching, especially in molecular modeling, this might be for you.

No redesign. No retraining. Just better gradients.

arxiv.org/abs/2505.10139
By @leonklein.bsky.social & me.
[Link preview: Path Gradients after Flow Matching, arxiv.org]
May 19, 2025 at 9:25 AM
The result?
Replicating the experiments from Equivariant Flow Matching (arXiv:2306.15030) with the same limited resources, we get:
✓ up to 3× higher ESS on LJ55
✓ 2× higher ESS on alanine dipeptide (XTB)
✓ improved performance on most tasks
May 19, 2025 at 9:23 AM
Path Gradients were the topic of my dissertation, and they turn out to be a great fine-tuning step for Boltzmann Generators.
Just apply them after Flow Matching (rough sketch below).
Fine-tuning reliably improves performance without big changes to the overall learned distribution.
For example, on alanine dipeptide:
May 19, 2025 at 9:17 AM
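To make "apply them after Flow Matching" concrete, here is a minimal sketch of one path-gradient fine-tuning step, using the reverse KL as the objective (one standard choice for Boltzmann Generators; whether it matches the paper's exact setup is an assumption). The `flow.sample` / `flow.log_prob` interface and the `log_p` target are placeholders for your flow library and energy model, and the continuous flows trained by flow matching in the paper would evaluate the log-density via ODE integration, which this sketch glosses over.

```python
import copy

def path_gradient_step(flow, log_p, optimizer, batch_size=256):
    """One path-gradient fine-tuning step on the reverse KL
    E_q[log q_theta(x) - log p(x)].

    Assumes `flow.sample(n)` returns reparameterised samples x = f_theta(z)
    (so gradients flow through x) and `flow.log_prob(x)` returns
    log q_theta(x); both names stand in for your flow library's interface.
    `log_p` is the (unnormalised) target log-density, e.g. -beta * energy(x).
    """
    # Detached copy of the flow: evaluating log q with frozen parameters
    # drops the score term, so only the gradient through the sample path
    # x = f_theta(z) remains -- the path-gradient estimator.
    frozen = copy.deepcopy(flow)
    for p in frozen.parameters():
        p.requires_grad_(False)

    x = flow.sample(batch_size)                      # depends on theta via z -> x
    loss = (frozen.log_prob(x) - log_p(x)).mean()    # reverse-KL estimate

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

With an optimizer over `flow.parameters()`, calling this in a loop is the whole fine-tuning stage: the architecture and the flow-matching pre-training stay untouched, only the gradients used for the extra updates change.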