[P8] NeurIPS 2024: We proposed Neural Conditional Probability (NCP) to learn conditional distributions efficiently. It simplifies uncertainty quantification and comes with accuracy guarantees for nonlinear, high-dimensional data.
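NCP itself is a neural architecture; as a hedged stand-in for the generic task it addresses — estimating conditional expectations from joint samples — here is a minimal sketch using ordinary least squares on a fixed feature map (all names and choices below are illustrative, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from a joint distribution: Y = sin(X) + noise.
n = 5000
X = rng.uniform(-3, 3, size=n)
Y = np.sin(X) + 0.1 * rng.standard_normal(n)

# Fixed polynomial feature map (stand-in for a learned neural embedding).
def phi(x):
    return np.vander(x, N=10, increasing=True)   # n x 10 features

# Least squares on the features estimates the conditional mean E[Y | X = x].
w, *_ = np.linalg.lstsq(phi(X), Y, rcond=None)

x_test = np.array([0.0, 1.0, -2.0])
cond_mean = phi(x_test) @ w       # close to sin(x_test)
```

A learned embedding (as in NCP) replaces the hand-picked polynomial features and additionally recovers the full conditional distribution, not just its mean.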
[P4] NeurIPS 2023: We introduced a Nyström sketching-based method that reduces computational cost from cubic to nearly linear in the number of samples without sacrificing accuracy, validated on massive datasets such as molecular dynamics simulations (see figure).
Our Deflate-Learn-Inflate (DLI) paradigm ensures uniform error bounds, even over infinite time horizons, and stabilized predictions in real-world tasks (see figure).
[P1] NeurIPS 2022
We introduced the first ML formulation for learning transfer operators (TO), which led to the development of the open-source Kooplearn library. This step laid the groundwork for exploring the theoretical limits of operator learning from finite data.
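Kooplearn's own API is not reproduced here; as a generic illustration of what "learning a transfer operator from data" means, here is a minimal EDMD-style sketch (identity features, toy linear dynamics — all choices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dynamical system: x_{t+1} = A x_t + noise, with stable A.
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
T = 500
X = np.zeros((T, 2))
X[0] = rng.standard_normal(2)
for t in range(T - 1):
    X[t + 1] = A_true @ X[t] + 0.01 * rng.standard_normal(2)

# Regress next state on current state: the least-squares solution is a
# finite-dimensional estimate of the transfer operator (EDMD with
# identity features).
X0, X1 = X[:-1], X[1:]
B, *_ = np.linalg.lstsq(X0, X1, rcond=None)   # X1 ~ X0 @ B
A_hat = B.T                                   # so x_{t+1} ~ A_hat @ x_t

# The estimated operator's eigenvalues govern the system's time scales.
eigs = np.linalg.eigvals(A_hat)
```

Replacing the identity features with a richer (possibly learned) dictionary extends the same regression to nonlinear dynamics, which is the regime the library targets.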