Meet the new Lattice Random Walk (LRW) discretisation for SDEs. It’s radically different from traditional methods like Euler-Maruyama (EM) in that each iteration can only move by one of the discrete steps {−δₓ, 0, +δₓ}.
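As a rough illustration of how such a step can be set up, here is a minimal sketch using the standard moment-matching construction (my construction for the sketch, not necessarily the exact scheme in the paper): at state x, jump up or down by δₓ, or stay put, with probabilities chosen so that the increment's mean and variance match the drift and diffusion over one step.

```python
# Illustrative LRW step with moment-matching probabilities (an assumption for
# this sketch; the paper's exact scheme may differ).
# SDE: dX = mu(X) dt + sigma(X) dW, time step h, lattice spacing dx.
import numpy as np

rng = np.random.default_rng(0)

def lrw_step(x, mu, sigma, h, dx):
    """Move by +dx, -dx, or 0 so the increment matches mean mu(x)*h and variance ~ sigma(x)^2*h."""
    p_up = sigma(x) ** 2 * h / (2 * dx**2) + mu(x) * h / (2 * dx)
    p_dn = sigma(x) ** 2 * h / (2 * dx**2) - mu(x) * h / (2 * dx)
    # Valid probabilities require sigma(x)*sqrt(h) <= dx <= sigma(x)^2 / |mu(x)|.
    return x + rng.choice([dx, -dx, 0.0], p=[p_up, p_dn, 1.0 - p_up - p_dn])

def em_step(x, mu, sigma, h):
    """Standard Euler-Maruyama step, for comparison."""
    return x + mu(x) * h + sigma(x) * np.sqrt(h) * rng.standard_normal()

# Example: OU process dX = -X dt + 0.5 dW.
mu, sigma, h = (lambda x: -x), (lambda x: 0.5), 1e-3
dx = 1.2 * 0.5 * np.sqrt(h)          # a little above sigma*sqrt(h), so staying put has positive probability
x_lrw = x_em = 1.0
for _ in range(5000):
    x_lrw = lrw_step(x_lrw, mu, sigma, h, dx)
    x_em = em_step(x_em, mu, sigma, h)
print(x_lrw, x_em)                   # two samples from (roughly) the same law at t = 5
```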
Singapore taxi driver: Malaysia
I’ll be in Singapore next week, let’s chat all things scalable Bayesian learning! 🇸🇬👋
Normally we order our minibatches like
a, b, c, ...., [shuffle], new_a, new_b, new_c, ....
but instead, if we do
a, b, c, ...., [reverse], ...., c, b, a, [shuffle], new_a, new_b, ....
then the RMSE of stochastic gradient descent reduces from O(h) to O(h²)
arxiv.org/abs/2504.04274
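In case the ordering is easier to read as code, here is a small sketch of the two schemes (my reading of the post; the function names are illustrative, not the paper's API):

```python
# Generate minibatch index orderings for each epoch.
import numpy as np

rng = np.random.default_rng(0)

def standard_order(n_batches, n_epochs):
    """a, b, c, ..., [shuffle], new_a, new_b, ...  -- reshuffle every epoch."""
    return [rng.permutation(n_batches) for _ in range(n_epochs)]

def reverse_then_shuffle_order(n_batches, n_epochs):
    """a, b, c, ..., [reverse], ..., c, b, a, [shuffle], new_a, new_b, ..."""
    order = []
    while len(order) < n_epochs:
        perm = rng.permutation(n_batches)
        order.append(perm)            # forward pass through this shuffle
        order.append(perm[::-1])      # then replay the same minibatches backwards
    return order[:n_epochs]

print(standard_order(4, 4))
print(reverse_then_shuffle_order(4, 4))
```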
"why send one explorer when you can send a whole army of clueless one"
"why send one explorer when you can send a whole army of clueless one"
But I found this super confusing; it’s not an A=B+A statement
I highlight that I've added:
i) links to several slide decks for talks about my research, and
ii) materials related to the (few) short courses which I've given in the past couple of years.
Enjoy!
One of the best parts about VSCode is the ecosystem of extensions (and that it is open source).
Cursor is already out of sync and having issues with extension compatibility
On a thermodynamic computer, the matrix exponential occurs very naturally through the temporal covariance driven by the noise - a polynomial speedup over digital computers!
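For anyone wanting to poke at the maths digitally: the fact being exploited is that for noise-driven linear (OU) dynamics dx = −A x dt + dW, the stationary lagged covariance equals exp(−Aτ) times the equal-time covariance. Here is a minimal NumPy sketch of that identity (the simulation parameters and estimator are my own illustrative choices, not taken from the paper, and of course this runs digitally rather than on the hardware):

```python
# For dx = -A x dt + dW at stationarity, E[x(t+tau) x(t)^T] = expm(-A*tau) @ Sigma,
# so expm(-A*tau) can be read off as C(tau) @ inv(C(0)) from measured covariances.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
d, dt, tau_steps, n_steps, burn_in = 3, 1e-2, 20, 400_000, 5_000
tau = tau_steps * dt

# A random symmetric positive-definite drift matrix (stable dynamics).
B = rng.standard_normal((d, d))
A = B @ B.T / d + np.eye(d)

# Euler-Maruyama simulation of the noise-driven dynamics.
x = np.zeros(d)
traj = np.empty((n_steps, d))
for t in range(n_steps):
    x = x - dt * (A @ x) + np.sqrt(dt) * rng.standard_normal(d)
    traj[t] = x
traj = traj[burn_in:]

# Empirical equal-time and lagged covariances.
C0 = traj.T @ traj / len(traj)
Ctau = traj[tau_steps:].T @ traj[:-tau_steps] / (len(traj) - tau_steps)

print(np.round(Ctau @ np.linalg.inv(C0), 2))   # estimate of expm(-A * tau)
print(np.round(expm(-A * tau), 2))             # reference, computed digitally
```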
And I, for one, am delighted to see work from Normal Computing published in Unconventional Computing 😝
Our answer: They’re two sides of the same coin. We wrote a blog post to show how diffusion models and Gaussian flow matching are equivalent. That’s great: It means you can use them interchangeably.
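For concreteness, here is one standard way to write the equivalence, in the usual $\alpha_t, \sigma_t$ notation for a Gaussian path (my notation; the blog post's conventions may differ). For $x_t = \alpha_t x_0 + \sigma_t \epsilon$ with $\epsilon \sim \mathcal{N}(0, I)$, the flow-matching velocity is
\[
v_t(x) = \mathbb{E}\big[\dot{\alpha}_t x_0 + \dot{\sigma}_t \epsilon \mid x_t = x\big]
       = \dot{\alpha}_t \hat{x}_0(x, t) + \dot{\sigma}_t \hat{\epsilon}(x, t),
\]
while a diffusion model's noise predictor and score are related by $\hat{\epsilon}(x, t) = -\sigma_t \nabla_x \log p_t(x)$. Substituting $\hat{x}_0 = (x - \sigma_t \hat{\epsilon}) / \alpha_t$ gives
\[
v_t(x) = \frac{\dot{\alpha}_t}{\alpha_t}\, x
       + \sigma_t \Big( \frac{\dot{\alpha}_t}{\alpha_t} \sigma_t - \dot{\sigma}_t \Big) \nabla_x \log p_t(x),
\]
so a trained denoiser/score model and a trained velocity field carry the same information, up to a time-dependent rescaling.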