Felix Koehler
@felix-m-koehler.bsky.social
🤖 Machine Learning & 🌊 Simulation | 📺 YouTuber | 🧑‍🎓 PhD student @ Thuerey Group
Can your AI surpass the simulator that taught it? What if the key to more accurate PDE modeling lies in questioning your training data's origins? 🤔

Excited to share my #NeurIPS 2025 paper with @thuereygroup.bsky.social: "Neural Emulator Superiority"!
November 14, 2025 at 8:45 AM
📢 Calling for (4-page) workshop papers on #differentiable programming and #SciML @euripsconf.bsky.social.
Submit your work to the @euripsconf.bsky.social workshop on Differentiable Systems and Scientific Machine Learning!

differentiable-systems.github.io/workshop-eur...

Submission: 10 October
Notification: 31 October
Workshop: 6 or 7 December in Copenhagen
Differentiable Systems and Scientific Machine Learning Workshop - EurIPS 2025
differentiable-systems.github.io
September 19, 2025 at 12:37 PM
Reposted by Felix Koehler
It may only be a band-aid, but we have just announced our new "Salon des Refusés" sessions for papers rejected due to space constraints: bsky.app/profile/euri...
Congratulations to everyone who got their @neuripsconf.bsky.social papers accepted 🎉🎉🎉

At #EurIPS we are looking forward to welcoming presentations of all accepted NeurIPS papers, including a new “Salon des Refusés” track for papers which were rejected due to space constraints!
September 19, 2025 at 9:35 AM
Check out my latest video on implementing an attention-based neural operator/emulator (i.e., a Transformer) in JAX:
youtu.be/GVVWpyvXq_s
Transformer Neural Operator in JAX
YouTube video by Machine Learning & Simulation
youtu.be
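For a flavor of what such an implementation involves, here is a minimal single-head self-attention layer in JAX. This is a generic sketch of the core mechanism, not the operator architecture from the video; all shapes and names are illustrative.

```python
import jax
import jax.numpy as jnp

def self_attention(x, w_q, w_k, w_v):
    # x: (n_tokens, d_model) sequence of spatial patches ("tokens")
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # scaled dot-product attention
    scores = (q @ k.T) / jnp.sqrt(k.shape[-1])
    weights = jax.nn.softmax(scores, axis=-1)  # each row sums to 1
    return weights @ v

key = jax.random.PRNGKey(0)
k_x, k_q, k_k, k_v = jax.random.split(key, 4)
n_tokens, d_model = 32, 16
x = jax.random.normal(k_x, (n_tokens, d_model))  # e.g. a discretized 1D PDE state
w_q, w_k, w_v = (0.1 * jax.random.normal(k, (d_model, d_model))
                 for k in (k_q, k_k, k_v))
y = self_attention(x, w_q, w_k, w_v)  # shape (32, 16)
```

A full neural operator would stack several such layers with residual connections, layer norm, and a patch embedding/decoding step around them.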
June 9, 2025 at 7:23 AM
Travelling to Singapore next week for #ICLR2025 to present this paper (Sat, 3 pm, nr. 538): arxiv.org/abs/2502.19611

DM me (Whova, email, or bsky) if you want to chat about (autoregressive) neural emulators/operators for PDEs, autodiff, differentiable physics, numerical solvers, etc. 😊
PRDP: Progressively Refined Differentiable Physics
The physics solvers employed for neural network training are primarily iterative, and hence, differentiating through them introduces a severe computational burden as iterations grow large. Inspired by...
arxiv.org
April 18, 2025 at 9:25 AM
Check out my latest video on approximating the full Lyapunov spectrum for the Lorenz system: youtu.be/Enves8MDwms

Nice showcase of #JAX's features:
- `jax.lax.scan`: autoregressive rollout
- `jax.linearize`: repeated JVP evaluation
- `jax.vmap`: automatic vectorization
Full Lyapunov Spectrum of Chaotic Lorenz System using JAX
YouTube video by Machine Learning & Simulation
youtu.be
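As a rough sketch of how those three pieces fit together (not the exact code from the video; integrator, step size, and step counts are my own choices), the classic Benettin/QR algorithm for the full Lorenz spectrum might look like:

```python
import jax
import jax.numpy as jnp

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
DT = 0.01

def lorenz_rhs(u):
    x, y, z = u
    return jnp.array([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z])

def rk4_step(u):
    # one explicit RK4 step of the Lorenz ODE
    k1 = lorenz_rhs(u)
    k2 = lorenz_rhs(u + 0.5 * DT * k1)
    k3 = lorenz_rhs(u + 0.5 * DT * k2)
    k4 = lorenz_rhs(u + DT * k3)
    return u + DT / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def benettin_step(carry, _):
    u, Q = carry
    # jax.linearize gives the primal step plus a cheap repeated-JVP function
    u_next, step_jvp = jax.linearize(rk4_step, u)
    # jax.vmap pushes all three orthonormal tangent vectors through at once
    V = jax.vmap(step_jvp)(Q.T).T
    # re-orthonormalize; the log-diagonal of R accumulates the growth rates
    Q_next, R = jnp.linalg.qr(V)
    return (u_next, Q_next), jnp.log(jnp.abs(jnp.diag(R)))

u0 = jax.lax.fori_loop(0, 1_000, lambda i, u: rk4_step(u),
                       jnp.array([1.0, 1.0, 1.0]))  # transient onto the attractor
n_steps = 20_000
# jax.lax.scan drives the autoregressive rollout
(_, _), log_growth = jax.lax.scan(benettin_step, (u0, jnp.eye(3)), None,
                                  length=n_steps)
lyap_spectrum = log_growth.sum(axis=0) / (n_steps * DT)
```

The three exponents should come out near (0.9, 0, -14.6); their sum is pinned to the flow's divergence, -(σ + 1 + β) ≈ -13.67, which is a handy sanity check.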
April 4, 2025 at 2:36 PM
Art.
March 28, 2025 at 1:24 PM
Today, I had the chance to present my #NeurIPS paper "APEBench" @SimAI4Science. You can find the recording on YouTube: youtu.be/wie-SzD6AJE
APEBench Talk @ Pasteur Labs Journal Club
YouTube video by Machine Learning & Simulation
youtu.be
February 18, 2025 at 6:47 PM
Thanks @munichcenterml.bsky.social for highlighting my recent #NeurIPS paper: APEBench,
a new benchmark suite for autoregressive emulators of PDEs to understand how we might solve the models of nature more efficiently. More details 🧵

Visual summary on project page: tum-pbs.github.io/apebench-pap...
February 12, 2025 at 4:08 PM
Reposted by Felix Koehler
Our online book on systems principles of LLM scaling is live at jax-ml.github.io/scaling-book/

We hope that it helps you make the most of your computing resources. Enjoy!
February 4, 2025 at 6:59 PM
Reposted by Felix Koehler
I’d like to thank everyone contributing to our five accepted ICLR papers for the hard work! Great job everyone 👍 Here’s a quick list, stay tuned for details & code in the upcoming weeks…
January 23, 2025 at 3:14 AM
I created a video to help you get started with the APEBench suite (my recent #NeurIPS paper) for benchmarking autoregressive neural emulators for PDEs, using a simple ConvNet emulation of 1D advection: youtu.be/q8fjQ4ZFynw
APEBench Quickstart
YouTube video by Machine Learning & Simulation
youtu.be
January 7, 2025 at 6:42 PM
Happy new year! 🎉 Two days ago we entered 2025 and just in time the channel surpassed 25k subscribers. Wow! Thanks to everyone for their kind words and support along the way: www.youtube.com/channel/UCh0...
January 2, 2025 at 12:55 PM
Check out my latest video on approximating the largest Lyapunov exponent of a dynamical system by integrating a tangent linear perturbation dynamic via autodiff in JAX: youtu.be/zRMBIkpcuu0

Very neat use-case of forward-mode AD for efficient Lyap approximation.
Largest Lyapunov Exponent using Autodiff in JAX/Python
YouTube video by Machine Learning & Simulation
youtu.be
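The same idea in a few lines, using the chaotic logistic map as a stand-in for a continuous-time system (a toy sketch of the technique, not the code from the video): `jax.jvp` propagates a tangent perturbation alongside the state, and the average log-growth under renormalization is the largest Lyapunov exponent.

```python
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)

def logistic(x):
    # chaotic logistic map; its largest Lyapunov exponent is ln(2)
    return 4.0 * x * (1.0 - x)

def step(carry, _):
    x, v = carry
    # forward-mode AD: push the tangent perturbation v through one step
    x_next, v_next = jax.jvp(logistic, (x,), (v,))
    growth = jnp.abs(v_next)
    # renormalize so the perturbation never over-/underflows;
    # the accumulated log-growth yields the exponent
    return (x_next, v_next / growth), jnp.log(growth)

n_steps = 50_000
init = (jnp.array(0.3), jnp.array(1.0))
_, log_growth = jax.lax.scan(step, init, None, length=n_steps)
lyap_max = log_growth.mean()  # ≈ ln 2 ≈ 0.693
```

For an ODE like Lorenz, `logistic` would be replaced by one integrator step and the mean log-growth divided by the step size.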
December 20, 2024 at 4:33 PM
Reposted by Felix Koehler
Automatic differentiation in forward mode computes derivatives by breaking functions down into elementary operations and propagating derivatives alongside values. It is efficient for functions with fewer inputs than outputs and for Jacobian-vector products, using for instance dual numbers.
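The dual-number mechanics can be shown in a few lines of plain Python (a toy sketch, not a real AD library): each value carries a derivative part, and every elementary operation updates both.

```python
import math
from dataclasses import dataclass

@dataclass
class Dual:
    val: float  # primal value
    eps: float  # derivative part, propagated alongside the value

    def __add__(self, other):
        return Dual(self.val + other.val, self.eps + other.eps)

    def __mul__(self, other):
        # product rule applied per elementary operation
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)

def sin(d):
    # chain rule for an elementary function
    return Dual(math.sin(d.val), math.cos(d.val) * d.eps)

# derivative of f(x) = x * sin(x) at x = 2: seed the tangent with eps = 1
x = Dual(2.0, 1.0)
y = x * sin(x)
# y.val == 2*sin(2), y.eps == sin(2) + 2*cos(2), i.e. f'(2)
```

Seeding `eps = 1` on one input recovers exactly the Jacobian-vector product the post describes.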
December 13, 2024 at 6:00 AM
Now presenting APEBench at #NeurIPS in West #5407.
December 12, 2024 at 6:46 PM
Reposted by Felix Koehler
Excited to be at NeurIPS this week! 🎉 I'm part of four exciting projects being presented:

- The Well & Multimodal Universe: massive, curated scientific datasets
- LaSR: LLM concept evolution for symbolic regression
- MPP: 0th gen @polymathicai.bsky.social

All posters Wed/Thu - stop by! 👋
December 11, 2024 at 6:44 PM
Reposted by Felix Koehler
I am at #NeurIPS and will present it today, Wed 11, at 11:00 am, West Ballroom A-D #6001
📢 #NeurIPS2024 poster ready! "Derivatives of Stochastic Gradient Descent in parametric optimization" w/
@franck-iutzeler.bsky.social & E. Pauwels

We show that differentiating the iterates of SGD leads to convergent sequences (in different regimes) for strongly convex functions.

arxiv.org/abs/2405.15894
December 11, 2024 at 3:05 PM
I will be presenting my poster on APEBench on Thursday from 11:00 to 14:00 PST at West Ballroom A-D #5407.

This was done as part of my PhD with @thuereygroup.bsky.social in collaboration with my talented co-author, Simon Niedermayr, who is supervised by Rüdiger Westermann.
December 11, 2024 at 1:18 AM