Mathurin Massias
mathurinmassias.bsky.social
Tenured Researcher @INRIA, Ockham team. Teacher @Polytechnique
and @ENSdeLyon

Machine Learning, Python and Optimization
Reposted by Mathurin Massias
The JMLR story and operating model should be widely known in academia as a clear success story for full open access. I have friends in the humanities and pure sciences who have no clue this is even possible
November 5, 2025 at 1:16 AM
To understand these phenomena, we study the spatial regularity of the velocity/denoiser over time: we observe a gap between the closed-form and trained model.

Applying Jacobian regularization, we recover effects seen previously on perturbed denoisers (drift vs noise)
November 5, 2025 at 9:05 AM
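A minimal sketch of what a Jacobian penalty on a denoiser can look like: a squared Frobenius norm of the Jacobian, estimated by central finite differences. The toy linear `denoiser`, the estimator, and the penalty form are illustrative assumptions, not the exact regularizer used in the work above.

```python
import numpy as np

def denoiser(x):
    # Toy stand-in for a trained denoiser: a fixed linear map.
    A = np.array([[0.9, 0.1],
                  [0.0, 0.8]])
    return A @ x

def jacobian_fro_penalty(f, x, eps=1e-5):
    """Squared Frobenius norm of the Jacobian of f at x,
    estimated with central finite differences."""
    d = x.shape[0]
    J = np.zeros((d, d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        J[:, i] = (f(x + e) - f(x - e)) / (2 * eps)
    return float(np.sum(J ** 2))

x = np.array([0.5, -1.0])
penalty = jacobian_fro_penalty(denoiser, x)
# For a linear map the estimate is exact: ||A||_F^2 = 0.81 + 0.01 + 0.64 = 1.46
```

In practice such a penalty would be added to the training loss with a small weight, encouraging the learned denoiser to be spatially smooth.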
Different loss weightings favor different times: which temporal regime drives generation quality? Controlled perturbations reveal drift-type effects at early times (and good FID) and noise-type effects at late times (and bad FID)
November 5, 2025 at 9:05 AM
In practice, training a denoiser involves design choices: the parametrization (velocity as in FM, residual noise ε as in diffusion, or a standard denoiser?) and the loss weighting, each influencing generation quality
November 5, 2025 at 9:04 AM
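For concreteness, these parametrizations are equivalent in expectation under the linear interpolant; a sketch, with the convention that t = 0 is noise and t = 1 is data (signs flip under the opposite convention):

```latex
% Linear interpolant: noise \varepsilon at t = 0, data x_1 at t = 1
x_t = (1-t)\,\varepsilon + t\,x_1, \qquad
\varepsilon \sim \mathcal{N}(0, I),\ x_1 \sim p_{\mathrm{data}}.

% Denoiser, velocity, and noise-prediction parametrizations are linked by
D(x_t, t) = \mathbb{E}[x_1 \mid x_t], \qquad
v(x_t, t) = \mathbb{E}[x_1 - \varepsilon \mid x_t]
          = \frac{D(x_t, t) - x_t}{1 - t}, \qquad
\hat\varepsilon(x_t, t) = \frac{x_t - t\, D(x_t, t)}{1 - t}.
```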
Strong afternoon session: Ségolène Martin on how to go from flow matching to denoisers (and hopefully come back?) and Claire Boyer on how learning rate and working in latent spaces affect diffusion models
October 24, 2025 at 3:04 PM
Followed by Scott Pesme on how to use diffusion/flow matching based MMSE to compute a MAP (and nice examples!), and Thibaut Issenhuth on new ways to learn consistency models
@skate-the-apple.bsky.social
October 24, 2025 at 1:24 PM
Next is @annegnx.bsky.social presenting our neurips paper on why flow matching generalizes, while it shouldn't!

arxiv.org/abs/2506.03719
October 24, 2025 at 9:05 AM
Thanks David!
September 19, 2025 at 4:34 PM
Thanks!
September 19, 2025 at 4:33 PM
Congratulations Anna!!
September 9, 2025 at 11:36 AM
Yes, everything will be in English!
September 4, 2025 at 12:12 PM
Yes!
September 4, 2025 at 12:11 PM
Yes... it's a trade-off with having enough slots and enough discussion time at the posters. You can arrive a little after the start if needed, and otherwise it should be accessible remotely 🤞
September 4, 2025 at 8:22 AM
On second thought, I'm not sure I understood. In the classical FM loss you do have to learn this derivative, no? The loss is:
June 27, 2025 at 5:53 AM
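For reference, the classical (conditional) flow matching objective referred to above is usually written as follows; a sketch assuming the linear interpolant, with noise at t = 0 and data at t = 1:

```latex
\mathcal{L}_{\mathrm{FM}}(\theta)
= \mathbb{E}_{t \sim \mathcal{U}[0,1],\ \varepsilon \sim \mathcal{N}(0,I),\ x_1 \sim p_{\mathrm{data}}}
\big\| v_\theta\big((1-t)\,\varepsilon + t\,x_1,\ t\big) - (x_1 - \varepsilon) \big\|^2,
```

where the regression target $x_1 - \varepsilon$ is the time derivative $\tfrac{\mathrm{d}}{\mathrm{d}t} x_t$ of the interpolant.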
I was thinking of this:
June 27, 2025 at 5:42 AM
I was thinking of the linear interpolant, yes; I haven't seen papers where others are used
June 26, 2025 at 3:57 PM
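The interpolants in question are usually of the form below; the linear one is the special case $\alpha_t = 1 - t$, $\beta_t = t$, though other choices (e.g. trigonometric $\alpha_t = \cos(\pi t / 2)$, $\beta_t = \sin(\pi t / 2)$) appear in the stochastic-interpolants literature:

```latex
x_t = \alpha_t\, x_0 + \beta_t\, x_1, \qquad
\alpha_0 = \beta_1 = 1,\quad \alpha_1 = \beta_0 = 0.
```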