Interested in (deep) learning theory and related topics.
Overparameterized NNs can reach near-zero training error on essentially any data, even randomly labeled data, while still incurring massive test error.
So, how can we tell when such a model truly generalizes?
arxiv.org/abs/2510.06028
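For intuition, here is a minimal sketch (my own toy example, not taken from the paper) of the memorization phenomenon: a wide MLP driven to near-perfect accuracy on random labels, yet chance-level on fresh data. It assumes PyTorch; the dataset, width, and training budget are arbitrary choices.

```python
# Toy sketch: an over-parameterized MLP memorizing random labels.
import torch
import torch.nn as nn

torch.manual_seed(0)
n, d, width = 200, 32, 2048            # few samples, wide network => heavy over-parameterization
X = torch.randn(n, d)
y = torch.randint(0, 2, (n,))          # labels are pure noise: there is no signal to generalize from

model = nn.Sequential(nn.Linear(d, width), nn.ReLU(), nn.Linear(width, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    opt.zero_grad()
    loss_fn(model(X), y).backward()
    opt.step()

with torch.no_grad():
    train_acc = (model(X).argmax(1) == y).float().mean()
    # Fresh inputs with fresh random labels: accuracy hovers around chance (~0.5).
    X_test, y_test = torch.randn(n, d), torch.randint(0, 2, (n,))
    test_acc = (model(X_test).argmax(1) == y_test).float().mean()
print(f"train acc: {train_acc:.2f}, test acc: {test_acc:.2f}")
```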
We’re excited to host Arthur Bizzi from EPFL for a research talk next week!
Title: Towards Neural Kolmogorov Equations: Parallelizable SDE Learning with Neural PDEs
🗓 Date: November 19
⏰ Time: 16:00 CET
📍 Galileo Sala, CHT @iitalk.bsky.social
📍Poster Session 1 #125
We present a new empirical Bernstein inequality for Hilbert space-valued random processes—relevant for dependent, even non-stationary data.
w/ Andreas Maurer, @vladimir-slk.bsky.social & M. Pontil
📄 Paper: openreview.net/forum?id=a0E...
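For context (the classical baseline, not the new result): the scalar empirical Bernstein inequality of Maurer & Pontil (2009) for i.i.d. $X_1,\dots,X_n \in [0,1]$, with sample mean $\bar X_n$ and sample variance $V_n$, states that with probability at least $1-\delta$,

$$\mathbb{E}[X_1] \;\le\; \bar X_n + \sqrt{\frac{2 V_n \ln(2/\delta)}{n}} + \frac{7\ln(2/\delta)}{3(n-1)}.$$

The poster extends this variance-adaptive flavor of bound to Hilbert space-valued, dependent, and even non-stationary processes.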
pietronvll.github.io/the-operator...
"Unlocking State-Tracking in Linear RNNs Through Negative Eigenvalues"
at the M3L workshop at #NeurIPS
https://buff.ly/3BlcD4y
If you're interested, you can attend the presentation on the 14th at 15:00, stop by the afternoon poster session, or DM me to discuss :)
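As a toy illustration of the core idea (my own sketch, not the paper's architecture): a diagonal linear recurrence h_t = a_t * h_{t-1} can track the parity of a bit stream only if the input-dependent "eigenvalue" a_t is allowed to be negative; restricting a_t to [0, 1] means the state can never flip sign, so parity is out of reach.

```python
# Toy sketch: parity via a 1-d linear recurrence with a negative eigenvalue.
# Each 1-bit multiplies the state by -1; the sign of the final state encodes parity.
import random

def linear_rnn_parity(bits, allow_negative=True):
    h = 1.0
    for b in bits:
        a = -1.0 if (b == 1 and allow_negative) else 1.0  # input-dependent transition "eigenvalue"
        h = a * h                                          # with a in [0, 1] only, h never changes sign
    return 0 if h > 0 else 1

random.seed(0)
bits = [random.randint(0, 1) for _ in range(20)]
print(linear_rnn_parity(bits), sum(bits) % 2)                         # the two values agree
print(linear_rnn_parity(bits, allow_negative=False), sum(bits) % 2)   # positive-only recurrence always outputs 0
```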
"Unlocking State-Tracking in Linear RNNs Through Negative Eigenvalues"
at the M3L workshop at #NeurIPS
https://buff.ly/3BlcD4y
If interested, you can attend the presentation the 14th at 15:00, pass at the afternoon poster session, or DM me to discuss :)
“When solving a given problem, try to avoid a more general problem as an intermediate step”
#ELLISforEurope #AI #ML #CrossBorderCollab #PhD