Machine Learning in Science
@mackelab.bsky.social
We build probabilistic #MachineLearning and #AI tools for scientific discovery, especially in neuroscience. Probably not posted by @jakhmack.bsky.social.

📍 @ml4science.bsky.social, Tübingen, Germany
Congrats to Dr Michael Deistler @deismic.bsky.social, who defended his PhD!

Michael worked on "Machine Learning for Inference in Biophysical Neuroscience Simulations", focusing on simulation-based inference and differentiable simulation.

We wish him all the best for the next chapter! 👏🎓
October 2, 2025 at 11:28 AM
The Macke lab is well-represented at the @bernsteinneuro.bsky.social conference in Frankfurt this year! We have lots of exciting new work to present across 7 posters (details 👇) 1/9
September 30, 2025 at 2:06 PM
Reposted by Machine Learning in Science
From hackathon to release: sbi v0.25 is here! 🎉

What happens when dozens of SBI researchers and practitioners collaborate for a week? New inference methods, new documentation, lots of new embedding networks, a bridge to Pyro, and a bridge between flow matching and score-based methods 🤯

1/7 🧵
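For readers new to the toolbox, here is a minimal sketch of the basic sbi workflow that the release builds on: define a prior and a simulator, train a neural posterior estimator on simulated pairs, and sample the posterior for an observation. The Gaussian toy simulator and the observation x_o are purely illustrative, and class names or defaults may differ slightly between sbi versions.

```python
# Minimal sketch of the core sbi workflow (toy Gaussian simulator for illustration;
# exact class names and defaults may vary across sbi versions).
import torch
from sbi.inference import NPE
from sbi.utils import BoxUniform

# Prior over 3 parameters and a toy simulator standing in for a real scientific model.
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

# Simulate a training set, train a neural posterior estimator, and build the posterior.
theta = prior.sample((2000,))
x = simulator(theta)
inference = NPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()

# Sample the posterior for a hypothetical observation x_o.
x_o = torch.zeros(3)
samples = posterior.sample((1000,), x=x_o)
```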
September 9, 2025 at 3:00 PM
New preprint: SBI with foundation models!
Tired of training or tuning your inference network, or waiting for your simulations to finish? Our method NPE-PF can help: it provides training-free simulation-based inference, achieving competitive performance with orders of magnitude fewer simulations! ⚡️
July 23, 2025 at 2:28 PM
Reposted by Machine Learning in Science
The neurons that encode sequential information into working memory do not fire in that same order during recall, a finding that is at odds with a long-standing theory. Read more in this month’s Null and Noteworthy.

By @ldattaro.bsky.social

#neuroskyence

www.thetransmitter.org/null-and-not...
Null and Noteworthy: Neurons tracking sequences don’t fire in order
Instead, neurons encode the position of sequential items in working memory based on when they fire during ongoing brain wave oscillations—a finding that challenges a long-standing theory.
www.thetransmitter.org
June 30, 2025 at 4:08 PM
Many people in our lab use Scholar Inbox regularly -- highly recommended!
July 2, 2025 at 7:05 AM
Thrilled to share that our paper on using simulation-based inference to infer ice accumulation and melting rates for Antarctic ice shelves is now published in the Journal of Glaciology!

www.cambridge.org/core/journal...
Simulation-based inference of surface accumulation and basal melt rates of an Antarctic ice shelf from isochronal layers | Journal of Glaciology | Cambridge Core
Simulation-based inference of surface accumulation and basal melt rates of an Antarctic ice shelf from isochronal layers - Volume 71
www.cambridge.org
June 11, 2025 at 11:47 AM
Reposted by Machine Learning in Science
Great news! Our March SBI hackathon in Tübingen was a huge success, with 40+ participants (30 onsite!). Expect significant updates soon: awesome new features & revamped documentation you'll love! Huge thanks to our amazing SBI community! Release details coming soon. 🥁 🎉
May 12, 2025 at 2:29 PM
🎓Hiring now! 🧠 Join us at the exciting intersection of ML and Neuroscience! #AI4science
We’re looking for PhD students, postdocs, and scientific programmers who want to use deep learning to build, optimize, and study mechanistic models of neural computation. Full details: www.mackelab.org/jobs/ 1/5
Jobs - mackelab
The MackeLab is a research group at the Excellence Cluster Machine Learning at Tübingen University!
www.mackelab.org
April 30, 2025 at 1:43 PM
Excited to present our work on compositional SBI for time series at #ICLR2025 tomorrow!

If you're interested in simulation-based inference for time series, come chat with Manuel Gloeckler or Shoji Toyota at Poster #420, Saturday 10:00–12:00 in Hall 3.

📰: arxiv.org/abs/2411.02728
Compositional simulation-based inference for time series
Amortized simulation-based inference (SBI) methods train neural networks on simulated data to perform Bayesian inference. While this strategy avoids the need for tractable likelihoods, it often requir...
arxiv.org
April 25, 2025 at 8:53 AM
Reposted by Machine Learning in Science
Thanks so much for the shout-out, and congrats on your exciting work!! 🎉 🙂

Also, a good reminder to share that our work is now out in Cell Reports 🙏🎊

⬇️

www.cell.com/cell-reports...
April 17, 2025 at 8:50 PM
The @mackelab.bsky.social is represented at @cosynemeeting.bsky.social #cosyne2025 in Montreal

with 3 posters, 2 workshop talks, and a main conference contributed talk (for the very first time in Mackelab history 🎉)!
March 27, 2025 at 2:03 PM
Reposted by Machine Learning in Science
Exciting new paper out of a Tübingen-Bonn collaboration, with three researchers from our cluster involved: first author @stefanieliebe.bsky.social, @matthijspals.bsky.social & @mackelab.bsky.social. Congrats to the team!
March 24, 2025 at 1:53 PM
Reposted by Machine Learning in Science
Science Alert 🚨: Our paper is now out in @natureneuro.bsky.social - We show that the firing phase of neurons in human MTL doesn’t reflect the order of events, challenging a long-standing theory of human memory.
nature.com/articles/s41593-025-01893-7
Phase of firing does not reflect temporal order in sequence memory of humans and recurrent neural networks - Nature Neuroscience
The temporal order of events in working memory is thought to be reflected by ordered neuronal firing at different phases. Here the authors show that this is not the case and that phase order is linked...
www.nature.com
March 24, 2025 at 12:55 PM
Reposted by Machine Learning in Science
Together with @dendritesgr.bsky.social, we’ll be hosting a tutorial on constructing and optimizing biophysical models (via Jaxley & DendroTweaks) 🚀

Join us in Florence if you like dendrites, biophysics, or optimization!
We are thrilled to host a tutorial on biophysical modeling with Jaxley & DendroTweaks at #CNS2025! 📍 Florence, July 5 — looking forward to seeing you there! www.cnsorg.org/cns-2025 @cnsorg.bsky.social
February 28, 2025 at 8:08 AM
Reposted by Machine Learning in Science
1) Some exciting science in turbulent times:

How do mice distinguish self-generated vs. object-generated looming stimuli? Our new study combines VR and neural recordings from superior colliculus (SC) 🧠🐭 to explore this question.

Check out our preprint doi.org/10.1101/2024... 🧵
February 3, 2025 at 7:19 PM
Talk to @vetterj.bsky.social and @gmoss13.bsky.social about Sourcerer at #NeurIPS2024 today!
📍Poster #4006 (East; 11 am PT)
December 13, 2024 at 4:49 PM
Reposted by Machine Learning in Science
1) With our @neuripsconf.bsky.social poster happening tomorrow, it's about time to introduce our Spotlight paper 🔦, co-led with @jkapoor.bsky.social:

Latent Diffusion for Neural Spiking Data (LDNS), a latent variable model (LVM) that addresses three goals simultaneously:
December 11, 2024 at 7:43 AM
Reposted by Machine Learning in Science
How to find all fixed points in piecewise-linear recurrent neural networks (RNNs)?
A short thread 🧵

In RNNs with N units with ReLU(x-b) activations, the phase space is partitioned into 2^N regions by hyperplanes at x=b 1/7
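To make the idea concrete, here is a hedged sketch of the brute-force version of this argument. It assumes dynamics of the form x_{t+1} = W · ReLU(x_t − b) (the thread does not spell out the full update rule, so this form is an assumption), enumerates all 2^N activation patterns, solves the resulting linear system in each region, and keeps only solutions that are consistent with their region. Exhaustive enumeration is only feasible for small N; the thread presumably goes on to discuss more scalable strategies.

```python
# Sketch: brute-force fixed-point search in a small ReLU RNN with assumed dynamics
# x_{t+1} = W @ relu(x_t - b). Within each of the 2^N linear regions, defined by which
# units are above threshold, the map is affine, so its fixed point solves a linear
# system; we keep it only if it actually lies in the region that produced it.
from itertools import product
import numpy as np

def fixed_points(W, b, tol=1e-9):
    N = len(b)
    points = []
    for d in product([0.0, 1.0], repeat=N):
        D = np.diag(d)                      # which units are active in this region
        A = np.eye(N) - W @ D               # x* = W D (x* - b)  =>  (I - W D) x* = -W D b
        if abs(np.linalg.det(A)) < tol:
            continue                        # degenerate region: no isolated fixed point
        x_star = np.linalg.solve(A, -W @ D @ b)
        active = x_star - b > 0
        if np.all(active == np.array(d, dtype=bool)):   # consistency check
            points.append(x_star)
    return points

# Toy example with N = 3 units.
rng = np.random.default_rng(0)
W = 0.8 * rng.standard_normal((3, 3))
b = rng.standard_normal(3)
print(fixed_points(W, b))
```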
December 11, 2024 at 1:32 AM
Thrilled to announce we have three #NeurIPS2024 papers! Interested in simulating realistic neural data with diffusion models or recurrent neural networks, or in source distribution sorcery? Have a look 👇 1/4
December 9, 2024 at 7:28 PM
Reposted by Machine Learning in Science
Watching the sbi toolbox grow up, seeing its use across a wide range of applications, and experiencing the growth, momentum, and team spirit of the sbi community has been amazing. We now have a short software paper with many new contributions and contributors! So many thanks, and get involved!
The sbi package is growing into a community project 🌍 To reflect this and the many algorithms, neural nets, and diagnostics that have been added since its initial release, we have written a new software paper 📝 Check it out, and reach out if you want to get involved: arxiv.org/abs/2411.17337
sbi reloaded: a toolkit for simulation-based inference workflows
Scientists and engineers use simulators to model empirically observed phenomena. However, tuning the parameters of a simulator to ensure its outputs match observed data presents a significant challeng...
arxiv.org
November 27, 2024 at 11:44 AM
To introduce our science, let’s start with: we are hiring! Inspired to do a PhD or Postdoc in #AI4Science? Work with us on ML tools for scientific discovery. Full details: www.mackelab.org/jobs/
PhD students: Apply by Nov 15 (tomorrow!), directly to IMPRS-IS or ELLIS
Jobs - mackelab
The MackeLab is a research group at the Excellence Cluster Machine Learning at Tübingen University!
www.mackelab.org
November 14, 2024 at 2:11 PM
Hi world! This is the brand-new Bluesky account of the Machine Learning in Science (@jakhmack.bsky.social) lab. We create probabilistic #MachineLearning and #AI tools for scientific discovery — but more on that soon!
For now, let's introduce ourselves with some pictures of our recent group retreat.
November 14, 2024 at 1:39 PM