Jonas Spinner
jonasspinner.bsky.social
PhD student working on machine learning for high-energy physics. Interested in equivariant architectures and generative modelling.
Pinned
Lorentz Local Canonicalization (LLoCa) is a drop-in replacement that makes any network Lorentz-equivariant. Check out how we apply it to high-energy physics tasks in arxiv.org/abs/2505.20280.

w/ Luigi Favaro, Peter Lippmann, Sebastian Pitz, Gerrit Gerhartz, Tilman Plehn, and Fred A. Hamprecht
1/6
Lorentz Local Canonicalization: How to Make Any Network Lorentz-Equivariant
Lorentz-equivariant neural networks are becoming the leading architectures for high-energy physics. Current implementations rely on specialized layers, limiting architectural choices. We introduce Lor...
arxiv.org
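The idea behind local canonicalization can be illustrated with a toy analogue: build a local reference frame from the input, express features in that frame, and any backbone becomes invariant to the global symmetry. The sketch below uses 2D rotations instead of Lorentz transformations, and the frame construction and backbone are hypothetical stand-ins, not the LLoCa implementation.

```python
import numpy as np

def rotation(theta):
    """2D rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def local_frame(x):
    """Build the rotation that maps x onto the positive x-axis.
    This is the canonicalization step: inputs expressed in this
    frame no longer depend on the global orientation."""
    theta = np.arctan2(x[1], x[0])
    return rotation(-theta)

def canonicalized_net(points, backbone):
    """Wrap an arbitrary (non-equivariant) backbone so its per-point
    scalar outputs are invariant under global rotations."""
    out = []
    for x in points:
        F = local_frame(x)
        out.append(backbone(F @ x))  # backbone only sees canonicalized input
    return np.array(out)

# Arbitrary non-equivariant map standing in for a neural network.
backbone = lambda v: np.sum(v**2) + v[0]

pts = np.random.default_rng(0).normal(size=(5, 2))
R = rotation(0.7)
y1 = canonicalized_net(pts, backbone)
y2 = canonicalized_net(pts @ R.T, backbone)  # globally rotated input
assert np.allclose(y1, y2)  # same outputs: the wrapper is rotation-invariant
```

For tensor-valued outputs one would additionally rotate the backbone's output back out of the local frame, which yields equivariance rather than invariance; the paper handles the analogous construction for the Lorentz group.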
June 2, 2025 at 9:43 AM
Can transformers learn the universal patterns of jet radiation and extrapolate beyond the training data?

Find out in our preprint
'Extrapolating Jet Radiation with Autoregressive Transformers'
arxiv.org/abs/2412.12074

w/ Javi Marino, Ayo Ore, Francois Charton, Anja Butter, and Tilman Plehn
1/7
Extrapolating Jet Radiation with Autoregressive Transformers
Generative networks are an exciting tool for fast LHC event generation. Usually, they are used to generate configurations with a fixed number of particles. Autoregressive transformers allow us to gene...
arxiv.org
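The key feature the abstract highlights is generating configurations with a *variable* number of particles: an autoregressive model emits one particle at a time until it produces a stop token. A minimal sketch of that sampling loop, with a hypothetical toy model standing in for the transformer:

```python
import random

STOP = "STOP"

def toy_model(prefix):
    """Stand-in for a transformer: returns next-token probabilities
    given the sequence so far (hypothetical, not the paper's model).
    Here, longer events become increasingly likely to stop."""
    p_stop = min(1.0, 0.2 * len(prefix))
    return {"particle": 1.0 - p_stop, STOP: p_stop}

def generate(model, max_len=10, seed=0):
    """Sample tokens autoregressively until STOP or max_len."""
    rng = random.Random(seed)
    seq = []
    while len(seq) < max_len:
        probs = model(seq)
        token = rng.choices(list(probs), weights=list(probs.values()))[0]
        if token == STOP:
            break
        seq.append(token)
    return seq

# Each sampled event can contain a different number of particles.
events = [generate(toy_model, seed=s) for s in range(5)]
```

Because the stop decision is part of the learned distribution, the same model covers events of all multiplicities, which is what makes extrapolating to higher jet counts a meaningful question.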
December 19, 2024 at 12:45 PM
Reposted by Jonas Spinner
On Thursday from 11:00 to 14:00, I'll be cheering on @jonasspinner.bsky.social and Victor Bresó at poster 3911.

They built L-GATr 🐊: a transformer that's equivariant to the Lorentz symmetry of special relativity. It performs remarkably well across different tasks in high-energy physics.

2/6
December 11, 2024 at 5:15 AM
Thrilled to announce that L-GATr is going to NeurIPS 2024! Plus, there is a new preprint with extended experiments and a more detailed explanation.

Code: github.com/heidelberg-h...
Physics paper: arxiv.org/abs/2411.00446
CS paper: arxiv.org/abs/2405.14806

1/7
GitHub - heidelberg-hepml/lorentz-gatr: Repository for <Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics> (J. Spinner et al 2024)
Repository for <Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics> (J. Spinner et al 2024) - heidelberg-hepml/lorentz-gatr
github.com
November 25, 2024 at 3:27 PM