Bálint Mucsányi
@bmucsanyi.bsky.social
ELLIS & IMPRS-IS PhD Student at the University of Tübingen.

Excited about uncertainty quantification, weight spaces, and deep learning theory.
Excited to present our spotlight paper on uncertainty disentanglement at #NeurIPS! Drop by today between 11 am and 2 pm PST at West Ballroom A-D #5509 and let's chat!
December 12, 2024 at 6:00 PM
For more details, check out our paper: arxiv.org/abs/2402.19460! Our GitHub repo (github.com/bmucsanyi/un...) contains performant implementations of the 19 benchmarked uncertainty methods, out-of-the-box OOD perturbation support, handling of label uncertainty, and support for over 50 metrics. 7/7
December 3, 2024 at 1:38 PM
Predictive uncertainty encompasses all the aforementioned sources of uncertainty. Almost all methods perform well on predictive uncertainty metrics, but which method performs best depends on the exact metric (see the podiums below, one per metric; one such metric is sketched after this post). 4/7
December 3, 2024 at 1:38 PM
Instead, we found that specialized estimators perform best at capturing these sources of uncertainty. For epistemic uncertainty, a specialized OOD detector works best. For aleatoric uncertainty, evidential methods (sketched after this post) perform well, but more research is needed to develop dedicated aleatoric estimators. 3/7
December 3, 2024 at 1:38 PM
Decomposition formulas (a common example is sketched after this post) are popular approaches for breaking the total uncertainty up into different parts. However, we show that these parts are severely internally correlated (rank corr. 0.8 to 0.999), i.e., they "measure the same thing" in practice. 2/7
December 3, 2024 at 1:38 PM
Thrilled to share our NeurIPS spotlight on uncertainty disentanglement! ✨ We study how well existing methods disentangle different sources of uncertainty, like epistemic and aleatoric. While all tested methods fail at this task, there are promising avenues ahead. 🧵 👇 1/7

📖: arxiv.org/abs/2402.19460
December 3, 2024 at 1:38 PM