N. Thuerey's research group at TUM
@thuereygroup.bsky.social
Professor @ TUM | Making numerical methods and deep learning play nicely together | Fluids | Computer Graphics
We're very excited to report that our P3D Transformer was accepted at ICLR openreview.net/forum?id=8Ud...

We introduce a scalable hybrid CNN–Transformer architecture that pushes neural surrogate modeling into the regime of truly high-resolution 3D simulations.
February 3, 2026 at 2:58 PM
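For readers new to the idea, a minimal, self-contained sketch of a hybrid CNN-Transformer surrogate for 3D fields: a strided convolution compresses the volume into tokens, a Transformer processes them, and a transposed convolution restores the full resolution. This is illustrative only and not the P3D architecture; all layer sizes are placeholder assumptions.

# Minimal sketch of a hybrid CNN-Transformer for 3D fields (illustrative only,
# not the actual P3D implementation; layer sizes are placeholder assumptions).
import torch
import torch.nn as nn

class HybridCNNTransformer3D(nn.Module):
    def __init__(self, in_ch=3, embed_dim=128, patch=8, depth=4, heads=8):
        super().__init__()
        # CNN stem: strided 3D convolution turns the volume into a coarse token grid.
        self.patchify = nn.Conv3d(in_ch, embed_dim, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(embed_dim, heads, dim_feedforward=4 * embed_dim,
                                           batch_first=True, norm_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=depth)
        # CNN head: transposed convolution maps tokens back to the full-resolution field.
        self.unpatchify = nn.ConvTranspose3d(embed_dim, in_ch, kernel_size=patch, stride=patch)

    def forward(self, x):                       # x: (B, C, D, H, W)
        tok = self.patchify(x)                  # (B, E, d, h, w)
        B, E, d, h, w = tok.shape
        seq = tok.flatten(2).transpose(1, 2)    # (B, d*h*w, E) token sequence
        seq = self.transformer(seq)
        tok = seq.transpose(1, 2).reshape(B, E, d, h, w)
        return self.unpatchify(tok)             # predicted field, same shape as the input

x = torch.randn(1, 3, 64, 64, 64)
print(HybridCNNTransformer3D()(x).shape)        # torch.Size([1, 3, 64, 64, 64])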
I'm very happy to report that our paper on autoregressive predictions with generative diffusion models is _finally_ accepted 😁 Congratulations Georg!

It's been a long journey: this paper was first submitted to NeurIPS'23, and now, almost 3 years later, it has finally been accepted www.sciencedirect.com/science/arti...
Benchmarking Autoregressive Conditional Diffusion Models for Turbulent Flow Simulation
Simulating turbulent flows is crucial for a wide range of applications, and machine learning-based solvers are gaining increasing relevance. However, …
www.sciencedirect.com
January 30, 2026 at 8:46 AM
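The core autoregressive idea, sketched with a textbook DDPM-style sampler rather than the paper's exact ACDM setup: each new frame is drawn from a denoiser conditioned on the previous frame, and the sample is fed back in for the next step. `denoiser` and all schedule parameters below are placeholders.

# Sketch of an autoregressive rollout with a conditional diffusion sampler.
# `denoiser` is a hypothetical network eps_theta(x_t, t, cond); the DDPM-style
# schedule is a standard textbook version, not the paper's exact setup.
import torch

def sample_step(denoiser, cond, steps=50, shape=(1, 3, 64, 64)):
    betas = torch.linspace(1e-4, 0.02, steps)
    alphas = 1.0 - betas
    abar = torch.cumprod(alphas, dim=0)
    x = torch.randn(shape)                                   # start from noise
    for t in reversed(range(steps)):
        eps = denoiser(x, torch.tensor([t]), cond)           # predict noise, conditioned on the previous state
        x = (x - betas[t] / (1 - abar[t]).sqrt() * eps) / alphas[t].sqrt()
        if t > 0:
            x = x + betas[t].sqrt() * torch.randn_like(x)    # stochastic part of the reverse step
    return x

def rollout(denoiser, x0, n_frames=10):
    frames, x = [x0], x0
    for _ in range(n_frames):
        x = sample_step(denoiser, cond=x, shape=x.shape)     # next frame conditioned on the current one
        frames.append(x)
    return frames                                            # autoregressive trajectory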
Great to see our paper on physics-constrained reconstruction / super-res with generative models posted online now at doi.org/10.1063/5.03... 😁

- PDE Transformer as backbone architecture
- differentiable physics constraints to guide the diffusion model
- and ConFIG as the optimizer to resolve conflicts between the gradients (a simplified sketch follows after this post)
Guiding diffusion models to reconstruct flow fields from sparse data
The reconstruction of unsteady flow fields from limited measurements is a challenging and crucial task for many engineering applications. Machine learning model
doi.org
January 9, 2026 at 11:25 AM
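On the gradient-conflict point: the sketch below uses a simple PCGrad-style projection, not the actual ConFIG update, just to illustrate what resolving a conflict between a data gradient and a physics gradient means.

# Not the actual ConFIG update: a simpler PCGrad-style projection that illustrates
# the underlying problem, i.e. two loss gradients (data term vs. physics constraint)
# pointing in conflicting directions. All tensors are flattened parameter gradients.
import torch

def resolve_conflict(g_data, g_phys):
    """Project each gradient off the other when their dot product is negative."""
    g1, g2 = g_data.clone(), g_phys.clone()
    dot = torch.dot(g1, g2)
    if dot < 0:                                    # the two objectives conflict
        g1 = g1 - dot / g2.norm().pow(2) * g2      # remove the conflicting component
        g2 = g2 - torch.dot(g2, g_data) / g_data.norm().pow(2) * g_data
    return g1 + g2                                 # combined, conflict-reduced update

# toy example: two 3-parameter "gradients" at an obtuse angle
g_update = resolve_conflict(torch.tensor([1.0, 0.0, 0.0]),
                            torch.tensor([-0.5, 1.0, 0.0]))
print(g_update)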
The SuperWing dataset is a large-scale, open dataset of transonic swept-wing aerodynamics, combining thousands of richly parameterized 3D wing geometries with high-fidelity RANS simulations across the operational flight envelope: arxiv.org/abs/2512.14397
December 17, 2025 at 6:48 PM
Please join our mini-symposium MS279, "AI for Computational Fluid Dynamics - Opportunities and Challenges", wccm-eccomas2026.org/event/area/8... at WCCM ECCOMAS in Munich in July 2026 (wccm-eccomas2026.org). Inspiring discussions, and a proper "Mass" at the beergarden 🍻😁
December 10, 2025 at 3:41 PM
Our full course "advanced deep learning for physics" (ADL4P) is online now at tum-pbs.github.io/ADL4P/ 😁 The course covers AI and neural network techniques for physics simulations & combinations with numerical methods. All recordings, slides and exercises are freely available!
December 9, 2025 at 12:03 PM
Can AI surrogates outperform their training data? Turns out the answer is yes - with a few caveats 😉 tum-pbs.github.io/emulator-sup... #neurips This surprising behavior leads to interesting and fundamental questions about the role of training data, and about how NN surrogates should be evaluated.
November 26, 2025 at 7:58 PM
Reposted by N. Thuerey's research group at TUM
Can your AI surpass the simulator that taught it? What if the key to more accurate PDE modeling lies in questioning your training data's origins? 🤔

Excited to share my #NeurIPS 2025 paper with @thuereygroup.bsky.social: "Neural Emulator Superiority"!
November 14, 2025 at 8:45 AM
Congratulations to Hao, Aleksandra and Bjoern for their NeurIPS paper tum-pbs.github.io/inc-paper/ 👍 It analyzes how hybrid PDE solvers fundamentally and provably benefit from "indirect" (force-based) corrections rather than direct ones. Baking in the corrections via INC reduces error growth!
November 25, 2025 at 12:11 PM
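To make the direct vs. indirect distinction concrete, a toy sketch with `coarse_solver` and `net` as placeholders (not the paper's INC implementation): a direct correction edits the state after the solver step, while an indirect one feeds a learned forcing term into the solver's right-hand side.

# Illustrative contrast between a "direct" state correction and an "indirect",
# force-based one in a hybrid solver step. `coarse_solver` and `net` are
# placeholders; this is not the INC implementation from the paper.
def hybrid_step_direct(coarse_solver, net, u, dt):
    u_new = coarse_solver(u, dt)
    return u_new + net(u_new)                      # network adjusts the state directly

def hybrid_step_indirect(coarse_solver, net, u, dt):
    force = net(u)                                 # network predicts a forcing term instead
    return coarse_solver(u, dt, forcing=force)     # correction enters through the PDE's right-hand side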
I wanted to highlight that source code and data for our physics-based flow matching (PBFM) algorithm are online now at: github.com/tum-pbs/PBFM/ Feel free to give it a try, and we'd be curious to hear how it works for you!
November 6, 2025 at 9:49 AM
I'm happy to report that our collaborative project on 3D sparse reconstruction and super-resolution with diffusion models, physics constraints and PDE Transformers is online now as a preprint at arxiv.org/abs/2510.19971, with source code at github.com/tum-pbs/spar.... Great work Marc, Luis, Qiang and Luca 👍
October 29, 2025 at 2:37 PM
I'm very excited to introduce P3D: our PDE-Transformer architecture in 3 dimensions, demonstrated at unprecedented 512^3 resolutions! That means the Transformer produces over 400 million degrees of freedom in one go 😀 a regime that was previously out of reach: arxiv.org/abs/2509.10186
September 16, 2025 at 7:40 AM
Congratulations to Bjoern for his accepted PoF paper on equivariant GraphNets 👍 doi.org/10.1063/5.02...
The core idea is a very generic and powerful one: we compute a local eigenbasis from flow features to obtain equivariance. Mathematically it's identical to previous approaches, but faster and simpler 😅
September 9, 2025 at 7:15 AM
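A rough illustration of the local-eigenbasis idea, assuming the symmetric part of a per-node velocity gradient as the flow feature; sign and ordering conventions of the eigenvectors are glossed over, and this is not the paper's code.

# Sketch of the "local eigenbasis" idea: build a rotation-equivariant frame per node
# from a symmetric flow-feature tensor (here, the symmetrized velocity gradient /
# strain-rate tensor), then express vectors in that frame. Purely illustrative.
import torch

def local_frame(grad_u):                    # grad_u: (N, 3, 3) velocity gradients per node
    S = 0.5 * (grad_u + grad_u.transpose(1, 2))       # symmetric part -> real eigenbasis
    eigvals, eigvecs = torch.linalg.eigh(S)           # columns of eigvecs form the local frame
    return eigvecs                                    # (N, 3, 3), one rotation per node

def to_local(vec, frame):                   # vec: (N, 3), e.g. velocity at each node
    return torch.einsum('nij,nj->ni', frame.transpose(1, 2), vec)  # rotate into the local frame

grad_u = torch.randn(10, 3, 3)
frame = local_frame(grad_u)
v_local = to_local(torch.randn(10, 3), frame)         # features a GNN can process equivariantly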
I also wanted to mention that our paper detailing the differentiable SPH solver by Rene is online now on arXiv: arxiv.org/abs/2507.21684 If you're interested in fast and efficient neighborhood search, differentiable SPH operators and first optimization and learning tasks, please take a look!
August 1, 2025 at 7:37 AM
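For a flavor of what "differentiable SPH operators" means, a brute-force summation-density sketch with a cubic spline kernel, written so gradients flow back to the particle positions (not taken from the paper's solver):

# A tiny differentiable SPH building block (not the paper's solver): summation
# density with a cubic spline kernel, differentiable w.r.t. particle positions.
import torch

def cubic_spline_kernel(r, h):
    q = r / h
    sigma = 8.0 / (torch.pi * h**3)                    # 3D normalization constant
    return sigma * torch.where(q < 0.5,
                               6 * (q**3 - q**2) + 1,
                               2 * (1 - q).clamp(min=0)**3)   # zero beyond q = 1

def summation_density(x, mass, h):
    # x: (N, 3) particle positions; gradients flow back to x through the kernel
    diff = x[:, None, :] - x[None, :, :]               # (N, N, 3) pairwise offsets (brute force)
    r = (diff.pow(2).sum(-1) + 1e-12).sqrt()           # pairwise distances, eps keeps gradients finite
    W = cubic_spline_kernel(r, h)                      # kernel already vanishes beyond r = h
    return (mass[None, :] * W).sum(dim=1)              # rho_i = sum_j m_j W(|x_i - x_j|, h)

x = torch.randn(100, 3, requires_grad=True)
rho = summation_density(x, mass=torch.ones(100) * 0.01, h=0.3)
rho.sum().backward()                                   # gradients w.r.t. particle positions
print(x.grad.shape)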
Get ready for the PDE-Transformer: our new NN architecture tailored to scientific tasks 😁 It combines hierarchical processing (UDiT), scalability (Swin), and flexible conditioning mechanisms. Code and paper available at tum-pbs.github.io/pde-transfor...
June 30, 2025 at 7:05 PM
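One common way to realize flexible conditioning in Transformer blocks is DiT-style adaptive layer-norm modulation; the sketch below shows that generic mechanism and is not necessarily the exact scheme used in PDE-Transformer.

# Sketch of conditioning via adaptive layer-norm modulation (DiT-style), one common
# way to inject scalar conditionings such as time step or PDE parameters.
import torch
import torch.nn as nn

class AdaLNBlock(nn.Module):
    def __init__(self, dim, cond_dim):
        super().__init__()
        self.norm = nn.LayerNorm(dim, elementwise_affine=False)
        self.to_mod = nn.Linear(cond_dim, 2 * dim)     # scale and shift from the conditioning
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, x, cond):                        # x: (B, N, dim), cond: (B, cond_dim)
        scale, shift = self.to_mod(cond).chunk(2, dim=-1)
        h = self.norm(x) * (1 + scale[:, None]) + shift[:, None]
        return x + self.mlp(h)                         # residual update, modulated by the conditioning

block = AdaLNBlock(dim=64, cond_dim=16)
out = block(torch.randn(2, 100, 64), torch.randn(2, 16))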
I'm really excited to share our latest work combining physics priors with probabilistic models: "Flow Matching Meets PDEs - A Unified Framework for Physics-Constrained Generation", arxiv.org/abs/2506.08604. Great work by Giacomo and Qiang!
June 17, 2025 at 2:36 PM
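A schematic of what a physics-constrained flow-matching objective can look like: a standard conditional flow-matching loss plus a residual penalty on the reconstructed end point. `pde_residual`, `v_theta` and the weighting are placeholders, not the exact loss from the paper.

# Schematic training objective in the spirit of physics-constrained flow matching.
import torch

def loss_fn(v_theta, x1, pde_residual, lam=0.1):
    x0 = torch.randn_like(x1)                          # noise sample
    t = torch.rand(x1.shape[0], *([1] * (x1.dim() - 1)))
    xt = (1 - t) * x0 + t * x1                         # linear interpolation path
    target_v = x1 - x0                                 # flow-matching target velocity
    v = v_theta(xt, t)
    fm = (v - target_v).pow(2).mean()                  # standard flow-matching term
    x1_hat = xt + (1 - t) * v                          # one-step estimate of the clean sample
    phys = pde_residual(x1_hat).pow(2).mean()          # differentiable physics penalty
    return fm + lam * phys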
Have you faced challenges like SPH-based inverse problems, or learning Lagrangian closure models?
For these, we're excited to announce the first public release of DiffSPH, our differentiable Smoothed Particle Hydrodynamics solver.
Code: diffsph.fluids.dev
Short demo: lnkd.in/dYABSeKG
June 13, 2025 at 2:08 PM
Congratulations to Bernhard for his first #SIGGRAPH paper! Great work 👍 His two-phase Navier-Stokes solver is even more impressive given that it all runs on a regular workstation, without a GPU. Enjoy the sims in full screen & hi-quality here: youtu.be/nt9BohngvoE
June 4, 2025 at 7:14 PM
I also just recorded a quick overview video for our new PICT solver: youtu.be/GGLidL0oT3s , enjoy! In case you missed it: PICT provides a new fully-differentiable multi-block Navier-Stokes solver for AI and learning tasks in PyTorch, e.g. learning turbulence closure in 3D
Introducing PICT: the differentiable Fluid Solver for AI & machine learning in PyTorch
YouTube video by Nils Thuerey
youtu.be
June 2, 2025 at 12:38 PM
I'd like to highlight PICT, our new differentiable Fluid Solver built for AI & learning: github.com/tum-pbs/PICT

Simulating fluids is hard, and learning 3D closure models even harder: This is where PICT comes in — a GPU-accelerated, fully differentiable fluid solver for PyTorch 🥳
May 28, 2025 at 3:41 PM
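The typical training pattern a differentiable solver like PICT enables, sketched with placeholder names (`solver_step`, `closure_net`): unroll a few solver steps with a learned closure term and backpropagate the trajectory error through all of them.

# Sketch of training a closure model through a differentiable solver by unrolling
# several steps and backpropagating through them. `solver_step` stands in for a
# differentiable solver such as PICT; names and the loss are assumptions.
def unrolled_loss(solver_step, closure_net, u0, u_ref, n_steps, dt):
    u, loss = u0, 0.0
    for k in range(n_steps):
        tau = closure_net(u)                        # learned closure / sub-grid term
        u = solver_step(u, tau, dt)                 # differentiable solver advances the state
        loss = loss + (u - u_ref[k]).pow(2).mean()  # compare against a reference trajectory
    return loss / n_steps                           # gradients flow through all solver steps

# typical usage: optimizer.zero_grad(); unrolled_loss(...).backward(); optimizer.step()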
I can highly recommend checking out Mario's talk about our Diffusion Graph Net paper from ICLR'25: www.youtube.com/watch?v=4Vx_..., enjoy!
Learning Distributions of Complex Fluid Simulations with Diffusion Graph Networks
YouTube video by Mario Lino
www.youtube.com
May 21, 2025 at 12:58 PM
I wanted to highlight PBDL's brand-new sections on diffusion models with code and derivations! Great work by Benjamin Holzschuh, with neat Jupyter notebooks 👍 All the way from normalizing-flow basics via score matching to denoising & flow matching. E.g., colab.research.google.com/github/tum-p...
May 13, 2025 at 7:19 AM
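As a taste of the material, a compact denoising score-matching loss in its standard textbook form (not copied from the book's notebooks):

# Denoising score matching: perturb the data with Gaussian noise and train the
# network toward the score of the perturbation kernel, which is -noise / sigma.
import torch

def dsm_loss(score_net, x, sigma=0.1):
    noise = torch.randn_like(x)
    x_noisy = x + sigma * noise
    target = -noise / sigma                           # score of N(x_noisy; x, sigma^2 I)
    return (score_net(x_noisy, sigma) - target).pow(2).mean()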
If you're at #ICLR 2025 in Singapore, please check out our posters 🤗 I'm sure it's going to be a great conference! Have fun everyone...
April 23, 2025 at 8:53 AM
I wanted to highlight that our project website (with code!) for our progressively-refined training with physics simulations is up now at kanishkbh.github.io/prdp-paper/ #ICLR25. The main ideas: match network approximation and physics accuracy, and refine the physics over the course of training.
April 11, 2025 at 2:21 PM
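One way to picture "refining the physics over the course of training" is a schedule that increases the accuracy of the physics operator, e.g. the number of inner solver iterations, as training progresses; the schedule and names below are assumptions, not the PRDP implementation.

# Illustrative refinement schedule: start with a cheap, low-accuracy physics operator
# and double its inner iteration count every few epochs, up to a cap.
def solver_iterations(epoch, start=2, max_iters=32, every=10):
    return min(max_iters, start * 2 ** (epoch // every))

for epoch in range(50):
    n_iters = solver_iterations(epoch)
    # loss = physics_loss(model, batch, solver_iters=n_iters)   # hypothetical training step
    if epoch % 10 == 0:
        print(f"epoch {epoch}: physics solved with {n_iters} iterations")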
The full PBDL book is now available as a single PDF at arxiv.org/pdf/2109.05237, and has grown to 451 pages 😳 Enjoy all the new highlights on generative models, simulation-based constraints and long-term stability with diffusion models 😁
March 28, 2025 at 8:23 AM