James Thornton
@jamesthornton.bsky.social
Research scientist at Google DeepMind. Creating noise from data.
I know it’s only CIFAR-10, but SOTA FID without any mini-batch OT was already < 2 with 35 NFE back in 2022.

Are you sure this makes any difference in the competitive setting? It seems like choosing hyperparameters makes more of a difference.

arxiv.org/abs/2206.00364
June 16, 2025 at 11:18 AM
It’s used within Sinkhorn

see e.g. lse_mode here ott-jax.readthedocs.io/en/stable/_m...
ott.solvers.linear.sinkhorn — ott 0.5.0 documentation
ott-jax.readthedocs.io
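For context, a minimal sketch of how that looks in ott-jax, assuming the standard PointCloud / LinearProblem / Sinkhorn entry points (the toy data and epsilon value are purely illustrative); lse_mode=True runs the Sinkhorn updates in log space via log-sum-exp:

import jax
import jax.numpy as jnp
from ott.geometry import pointcloud
from ott.problems.linear import linear_problem
from ott.solvers.linear import sinkhorn

# Two toy point clouds.
key_x, key_y = jax.random.split(jax.random.PRNGKey(0))
x = jax.random.normal(key_x, (128, 2))
y = jax.random.normal(key_y, (256, 2)) + 1.0

# Entropic OT problem; epsilon is the regularisation strength.
geom = pointcloud.PointCloud(x, y, epsilon=0.05)
prob = linear_problem.LinearProblem(geom)

# lse_mode=True performs the updates with the log-sum-exp trick,
# which is the numerically stable choice for small epsilon.
out = sinkhorn.Sinkhorn(lse_mode=True)(prob)
print(out.reg_ot_cost)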
May 25, 2025 at 6:24 AM
Thanks!
April 6, 2025 at 6:37 PM
I guess my point more broadly is that it is hard in general to draw the line or understand the future impact of some work.

And reviewing can suck sometimes
April 6, 2025 at 10:49 AM
The ODE is from arxiv.org/abs/2106.01357 (in the appendix); there was an error in the first version but it should hopefully be fixed now.

I did not try it with the alpha version.
Diffusion Schrödinger Bridge with Applications to Score-Based Generative Modeling
Progressively applying Gaussian noise transforms complex data distributions to approximately Gaussian. Reversing this dynamic defines a generative model. When the forward noising process is given by a...
arxiv.org
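For reference, this is the kind of probability-flow ODE being referred to, written in the generic score-based notation (a sketch of the standard form, not a transcription of that appendix): for a forward noising SDE dX_t = f(X_t, t) \, dt + g(t) \, dB_t with marginals p_t, the associated ODE is

dX_t = \big[ f(X_t, t) - \tfrac{1}{2} g(t)^2 \nabla_x \log p_t(X_t) \big] \, dt,

which shares the same marginals p_t and, integrated backwards in time with a learned score, gives a deterministic sampler.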
February 10, 2025 at 10:38 AM
I have tried it and it works well in practice; it’s a bit similar to initialising ReFlow from a bridge or diffusion, and is similar to the annealed RF of arxiv.org/abs/2407.12718

You can also use the flow of the SB; we wrote the details here but didn’t investigate it much (this was 2020/2021)
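To make the reflow-from-a-bridge idea concrete, here is a rough self-contained JAX sketch (all names, e.g. pretrained_sampler and velocity_fn, are illustrative stand-ins rather than code from the papers): noise endpoints are pushed through a pretrained bridge/diffusion sampler to form pairs, and a velocity field is regressed onto straight-line interpolations between them.

import jax
import jax.numpy as jnp

BATCH, DIM = 256, 2

def velocity_fn(params, xt, t):
    # Toy linear velocity model; stands in for a time-conditioned network.
    w, b = params
    inp = jnp.concatenate([xt, t[:, None]], axis=-1)
    return inp @ w + b

def pretrained_sampler(x0):
    # Hypothetical placeholder for the pretrained bridge / diffusion sampler
    # mapping noise x0 to (approximate) data samples x1.
    return 2.0 * x0 + 1.0

def reflow_loss(params, x0, x1, t):
    # Rectified-flow regression on straight-line interpolations.
    xt = (1.0 - t[:, None]) * x0 + t[:, None] * x1
    target = x1 - x0  # constant straight-line velocity
    return jnp.mean((velocity_fn(params, xt, t) - target) ** 2)

def reflow_step(params, key, lr=1e-3):
    k0, kt = jax.random.split(key)
    x0 = jax.random.normal(k0, (BATCH, DIM))   # noise endpoints
    x1 = pretrained_sampler(x0)                # endpoints from the pretrained model
    t = jax.random.uniform(kt, (BATCH,))
    loss, grads = jax.value_and_grad(reflow_loss)(params, x0, x1, t)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params, loss

params = (jnp.zeros((DIM + 1, DIM)), jnp.zeros(DIM))
params, loss = reflow_step(params, jax.random.PRNGKey(0))

In practice the placeholder sampler would be the ODE/SDE sampler of the pretrained bridge or diffusion model, and the linear model would be the usual time-conditioned network.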
February 8, 2025 at 6:07 PM