Johann Brehmer
@johannbrehmer.bsky.social
Machine learner & physicist. At CuspAI, I teach machines to discover materials for carbon capture. Previously Qualcomm AI Research, NYU, Heidelberg U.
Reposted by Johann Brehmer
It was great to work on the ODAC25 paper with the Meta FAIR Chemistry team and Georgia Tech. A leap forward in modelling direct air carbon capture with metal-organic frameworks, with much better data and larger models.

Paper: arxiv.org/abs/2508.03162
Data and models: huggingface.co/facebook/ODA...
The Open DAC 2025 Dataset for Sorbent Discovery in Direct Air Capture
Identifying useful sorbent materials for direct air capture (DAC) from humid air remains a challenge. We present the Open DAC 2025 (ODAC25) dataset, a significant expansion and improvement upon ODAC23...
arxiv.org
August 7, 2025 at 11:25 PM
Reposted by Johann Brehmer
Are you tired of context-switching between coding models in @pytorch.org and paper writing on @overleaf.com?

Well, I’ve got the fix for you: Neuralatex! An ML library written in pure LaTeX!

neuralatex.com

To appear in SIGBOVIK (subject to a rigorous review process)
Neuralatex: A machine learning library written in pure LATEX
neuralatex.com
April 1, 2025 at 11:23 AM
Reposted by Johann Brehmer
A pleasant surprise. The 2025 Breakthrough Prize in Fundamental Physics honors over 13,000 researchers whose labors have led to the precise description of the Higgs mechanism, … breakthroughprize.org/News/91 @CERN
April 5, 2025 at 11:24 PM
Reposted by Johann Brehmer
On March 7th, we’re Standing Up for Science—
and against political censorship, autocracy, and fascism.

Science stands at a crossroads. This is a wider fight for truth, for democracy, and for the future.

We hope you join us.

www.standupforscience2025.org
STAND UP FOR SCIENCE
March 7, 2025. Washington DC and nationwide. Because science is for everyone.
www.standupforscience2025.org
March 2, 2025 at 1:07 PM
Reposted by Johann Brehmer
📣 Hiring! I am looking for PhD/postdoc candidates to work on foundation models for science at @ULiege, with a special focus on weather and climate systems. 🌏 Three positions are open, covering deep learning, physics-informed FMs, and inverse problems with FMs.
December 30, 2024 at 12:21 PM
Reposted by Johann Brehmer
Excellent talk by @johannbrehmer.bsky.social on “Does equivariance matter at scale?” at the NeurReps workshop
arxiv.org/abs/2410.23179
www.neurreps.org
December 14, 2024 at 7:14 PM
Just arrived in Vancouver for #NeurIPS.

I'm looking forward to meeting old and new friends, learning a thing or two, and presenting some recent work:

1/6
December 11, 2024 at 5:15 AM
Reposted by Johann Brehmer
A common question nowadays: Which is better, diffusion or flow matching? 🤔

Our answer: They’re two sides of the same coin. We wrote a blog post to show how diffusion models and Gaussian flow matching are equivalent. That’s great: It means you can use them interchangeably.
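
For the gist of the equivalence, here is a minimal sketch (the notation is mine, not taken from the post): for a Gaussian path built from a data sample x and noise ε, the flow-matching velocity target is a linear reparameterization of the diffusion noise target.

```latex
% Sketch under assumed notation: data x, noise \epsilon \sim \mathcal{N}(0, I),
% and a schedule (\alpha_t, \sigma_t) defining the Gaussian path.
\begin{align}
  x_t &= \alpha_t\, x + \sigma_t\, \epsilon, \\
  u_t(x_t \mid x) &= \dot{\alpha}_t\, x + \dot{\sigma}_t\, \epsilon
    = \frac{\dot{\alpha}_t}{\alpha_t}\, x_t
      + \left(\dot{\sigma}_t - \frac{\dot{\alpha}_t\, \sigma_t}{\alpha_t}\right) \epsilon .
\end{align}
```

So a network trained to predict ε (diffusion) and one trained to predict u_t (flow matching) differ only by a time-dependent linear transformation and loss weighting, which is why the two can be used interchangeably.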
December 2, 2024 at 6:45 PM
Reposted by Johann Brehmer
The sbi package is growing into a community project 🌍 To reflect this and the many algorithms, neural nets, and diagnostics that have been added since its initial release, we have written a new software paper 📝 Check it out, and reach out if you want to get involved: arxiv.org/abs/2411.17337
sbi reloaded: a toolkit for simulation-based inference workflows
Scientists and engineers use simulators to model empirically observed phenomena. However, tuning the parameters of a simulator to ensure its outputs match observed data presents a significant challeng...
arxiv.org
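
As a taste of what the toolkit does, here is a minimal sketch of a typical sbi workflow. The toy simulator, prior bounds, and dimensions below are made up for illustration, and API details (e.g. the SNPE class name) may differ across package versions.

```python
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Toy simulator and prior, purely for illustration.
def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

# Simulate a training set, train a neural posterior estimator,
# and sample from the posterior given an observation.
theta = prior.sample((1000,))
x = simulator(theta)

inference = SNPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()

x_obs = torch.zeros(3)
samples = posterior.sample((500,), x=x_obs)
```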
November 27, 2024 at 11:17 AM
Reposted by Johann Brehmer
Thrilled to announce that L-GATr is going to NeurIPS 2024! Plus, there is a new preprint with extended experiments and a more detailed explanation.

Code: github.com/heidelberg-h...
Physics paper: arxiv.org/abs/2411.00446
CS paper: arxiv.org/abs/2405.14806

1/7
GitHub - heidelberg-hepml/lorentz-gatr: Repository for <Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics> (J. Spinner et al 2024)
github.com
November 25, 2024 at 3:27 PM
Reposted by Johann Brehmer
Milestone: our review paper “The Frontier of Simulation-Based Inference” coauthored with @glouppe.bsky.social & @johannbrehmer.bsky.social hit 1000 citations. I’m very excited about the potential for these methods to transform science!
www.pnas.org/doi/10.1073/...

simulation-based-inference.org
November 22, 2024 at 1:16 PM
Reposted by Johann Brehmer
The snow is gently falling outside the window, the models are training, what could be better? Two cool articles to read:

Does Equivariance matter at scale? (@johannbrehmer.bsky.social et al.) arxiv.org/abs/2410.23179
Denoising Diffusion Bridge Models (Linqi Zhou et al.) arxiv.org/pdf/2309.16948
November 21, 2024 at 12:48 PM
Looking for an internship? Interested in geometric or causal priors for robotics, inductive biases at scale, or one of many other ML topics?

Come work with us in Amsterdam!

careers.qualcomm.com/careers?pid=...
Qualcomm Careers | Engineering Jobs and More | Qualcomm
Search open positions at Qualcomm. Learn more about how our culture of collaboration and robust benefits program allow our employees to live well and exceed their potential.
careers.qualcomm.com
November 6, 2023 at 9:32 AM