Ron Richman
@ronrich.bsky.social
Avid actuary
Reposted by Ron Richman
Want all NeurIPS/ICML/ICLR papers in one single .bib file? Here you go!

🗞️ short blog post: fabian-sp.github.io/posts/2024/1...

📇 bib files: github.com/fabian-sp/ml-bib
A Bibliography Database for Machine Learning
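Not part of the repo, but a minimal sketch of how one might look up a paper's BibTeX entry in the downloaded files; the file name "neurips.bib" and the query string are assumptions of mine, so check the repo for the actual file layout.

```python
# Minimal sketch (not from the repo): find BibTeX entries by title substring
# in a downloaded .bib file. The file name "neurips.bib" is an assumption.
import re
from pathlib import Path

def find_entries(bib_path: str, title_query: str) -> list[str]:
    """Return raw BibTeX entries whose text contains `title_query`."""
    text = Path(bib_path).read_text(encoding="utf-8")
    entries = re.split(r"\n(?=@)", text)  # each entry starts with "@" at line start
    query = title_query.lower()
    return [e for e in entries if query in e.lower()]

for entry in find_entries("neurips.bib", "attention is all you need"):
    print(entry)
```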
December 17, 2024 at 10:42 AM
Reposted by Ron Richman
Survey of different Large Language Model Architectures: Trends, Benchmarks, and Challenges

Presents a survey on LLM architectures that systematically categorizes auto-encoding, auto-regressive and encoder-decoder models.

📝 arxiv.org/abs/2412.03220
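As a quick illustration of the three families the survey categorizes, here is a hedged sketch using well-known Hugging Face checkpoints; the model choices are mine, not taken from the survey.

```python
# Illustrative only: one representative checkpoint per architecture family.
from transformers import AutoModel, AutoModelForCausalLM, AutoModelForSeq2SeqLM

encoder_only = AutoModel.from_pretrained("bert-base-uncased")        # auto-encoding
decoder_only = AutoModelForCausalLM.from_pretrained("gpt2")          # auto-regressive
encoder_decoder = AutoModelForSeq2SeqLM.from_pretrained("t5-small")  # encoder-decoder
```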
December 5, 2024 at 4:15 AM
Reposted by Ron Richman
🚨 New preprint out!

We build **scalar** time series embeddings of temporal networks!

The key enabling insight: the relevant feature of each network snapshot... is just its distance to every other snapshot!

Work w/ FJ Marín, N. Masuda, L. Arola-Fernández

arxiv.org/abs/2412.02715
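A rough sketch of that insight as I read it (not the authors' code): describe each snapshot only through its distances to every other snapshot, then collapse those distances into one scalar per snapshot. The Frobenius distance and classical MDS below are my stand-in choices.

```python
# Sketch: pairwise snapshot distances -> one scalar per snapshot (a time series).
import numpy as np

def scalar_embedding(snapshots: list[np.ndarray]) -> np.ndarray:
    """snapshots: list of T adjacency matrices of the same shape."""
    T = len(snapshots)
    D = np.zeros((T, T))  # pairwise distance matrix between snapshots
    for i in range(T):
        for j in range(i + 1, T):
            d = np.linalg.norm(snapshots[i] - snapshots[j], ord="fro")
            D[i, j] = D[j, i] = d
    # Classical MDS on squared distances, keeping only the leading coordinate.
    J = np.eye(T) - np.ones((T, T)) / T
    B = -0.5 * J @ (D ** 2) @ J
    eigvals, eigvecs = np.linalg.eigh(B)
    k = np.argmax(eigvals)  # leading eigenpair
    return eigvecs[:, k] * np.sqrt(max(eigvals[k], 0.0))
```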
December 5, 2024 at 8:30 AM
Reposted by Ron Richman
Proud to announce our NeurIPS spotlight, which was in the works for over a year now :) We dig into why decomposing aleatoric and epistemic uncertainty is hard, and what this means for the future of uncertainty quantification.

📖 arxiv.org/abs/2402.19460 🧵1/10
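For context, a sketch of the standard entropy-based decomposition the paper scrutinizes, written for an ensemble of classifiers; this is my implementation of the textbook formulas, not code from the paper.

```python
# Total predictive uncertainty = aleatoric + epistemic, via entropies of
# ensemble member predictions.
import numpy as np

def entropy(p, axis=-1, eps=1e-12):
    return -np.sum(p * np.log(p + eps), axis=axis)

def decompose(probs: np.ndarray):
    """probs: (n_members, n_samples, n_classes) predictive distributions."""
    mean_p = probs.mean(axis=0)
    total = entropy(mean_p)                  # H[ E_theta p(y|x,theta) ]
    aleatoric = entropy(probs).mean(axis=0)  # E_theta H[ p(y|x,theta) ]
    epistemic = total - aleatoric            # mutual information I(y; theta | x)
    return total, aleatoric, epistemic
```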
December 3, 2024 at 9:45 AM
Reposted by Ron Richman
If you use SHAP, LIME or Data Shapley, you might be interested in our new #neurips2024 paper. We introduce stochastic amortization to speed up feature + data attribution by 10x-100x 🚀 #XML

Surprisingly, we can "learn to attribute" cheaply from noisy explanations! arxiv.org/abs/2401.15866
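A hedged sketch of the amortization idea as the post describes it (not the authors' code): train a network to regress noisy, cheap attribution estimates, then attribute new examples in a single forward pass. Unbiased noise in the targets tends to average out under the squared loss, which is presumably why learning from noisy explanations can work.

```python
# Sketch only: amortize an expensive attribution method (e.g. Shapley values)
# by regressing a network on noisy Monte Carlo estimates of the attributions.
import torch
import torch.nn as nn

def train_amortized_explainer(X, noisy_attr, n_epochs=200, lr=1e-3):
    """X: (n, d) inputs; noisy_attr: (n, d) cheap, unbiased attribution estimates."""
    X = torch.as_tensor(X, dtype=torch.float32)
    noisy_attr = torch.as_tensor(noisy_attr, dtype=torch.float32)
    d = X.shape[1]
    model = nn.Sequential(nn.Linear(d, 256), nn.ReLU(), nn.Linear(256, d))
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(n_epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), noisy_attr)
        loss.backward()
        opt.step()
    return model  # model(x) now approximates the expensive attribution in one pass
```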
December 2, 2024 at 5:35 PM
Reposted by Ron Richman
I am very excited to share our new NeurIPS 2024 paper + package, Treeffuser! 🌳 We combine gradient-boosted trees with diffusion models for fast, flexible probabilistic predictions and well-calibrated uncertainty.

paper: arxiv.org/abs/2406.07658
repo: github.com/blei-lab/tre...

🧵(1/8)
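A minimal usage sketch based only on the post's description; the import path and the sample() signature are assumptions on my part, so the repo README is the authority here.

```python
# Hypothetical usage sketch -- class/method names are assumptions, not the
# documented API; see github.com/blei-lab/treeffuser for the real interface.
import numpy as np
from treeffuser import Treeffuser  # assumed import path

rng = np.random.default_rng(0)
X, y = rng.normal(size=(500, 5)), rng.normal(size=500)

model = Treeffuser()              # gradient-boosted trees drive the diffusion
model.fit(X, y)

samples = model.sample(X[:10], n_samples=100)         # assumed signature
lo, hi = np.quantile(samples, [0.05, 0.95], axis=0)   # rough 90% intervals
```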
December 2, 2024 at 9:48 PM
Reposted by Ron Richman
TIL from the Hard Fork podcast that the transformer, the core of modern AI including LLMs, was inspired by the aliens in Arrival. That’s wild—and yet another reason to watch Arrival, easily one of the best films of the last decade. Great podcast, great movie!
December 1, 2024 at 6:47 AM
Reposted by Ron Richman
Did you know that attention across the whole input span was inspired by the time-negating alien language in Arrival? Crazy anecdote from the latest Hard Fork podcast (by @kevinroose.com and @caseynewton.bsky.social). HT nwbrownboi on Threads for the lead.
December 1, 2024 at 2:50 PM
Reposted by Ron Richman
I did a small test with TabM-mini and 5-fold bagging, using only default parameters with numerical embeddings. It seems roughly comparable with RealMLP. But then, maybe RealMLP can benefit more from additional ensembling, or the two could be combined. A fair comparison with ensembling is hard.
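For readers unfamiliar with the setup, a sketch of what 5-fold bagging means here (my reading): fit one model per cross-validation fold on that fold's training split and average the five predictions. A plain scikit-learn estimator stands in for TabM-mini below.

```python
# Sketch of 5-fold bagging with a generic estimator standing in for TabM-mini.
import numpy as np
from sklearn.base import clone
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.model_selection import KFold

def fit_bagged(base, X, y, n_splits=5, seed=0):
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    return [clone(base).fit(X[tr], y[tr]) for tr, _ in kf.split(X)]

def predict_bagged(models, X):
    return np.mean([m.predict(X) for m in models], axis=0)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(300, 8)), rng.normal(size=300)
models = fit_bagged(HistGradientBoostingRegressor(), X, y)
preds = predict_bagged(models, X[:5])
```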
November 26, 2024 at 2:13 PM
Reposted by Ron Richman
PyTabKit 1.1 is out!

- Includes TabM and provides a scikit-learn interface
- Some baseline NN parameter names were renamed (double underscores removed)
- Other small changes; see the README

github.com/dholzmueller...
Can deep learning finally compete with boosted trees on tabular data? 🌲
In our NeurIPS 2024 paper, we introduce RealMLP, a NN with improvements in all areas and meta-learned default parameters.
Some insights about RealMLP and other models on large benchmarks (>200 datasets): 🧵
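A minimal sketch of the scikit-learn interface mentioned above; the estimator name RealMLP_TD_Classifier is my assumption from memory of the README, so treat it as a placeholder and check the repo for the actual class names (including the TabM one).

```python
# Usage sketch -- the class name is an assumption; see the repo README for
# the actual estimator names and the TabM interface.
import numpy as np
from pytabkit import RealMLP_TD_Classifier  # assumed import

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)

clf = RealMLP_TD_Classifier()   # meta-learned default parameters
clf.fit(X[:150], y[:150])
print(clf.predict_proba(X[150:])[:5])
```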
November 25, 2024 at 10:49 AM
Reposted by Ron Richman
Bluesky really is the new #rstats twitter because we have the first base R vs tidyverse flame war 🤣
November 14, 2024 at 5:18 PM
Reposted by Ron Richman
Going to try to start posting more on here given the increasingly suffocating toxicity of the other place 👋
August 7, 2024 at 7:13 AM
Reposted by Ron Richman
When home heating prices are lower, fewer people die each winter, particularly in high-poverty communities. That's the punchline of my paper with Janjala Chirakijja and Pinchuan Ong on heating prices and mortality in the US, just published in the Economic Journal. 📉📈 academic.oup.com/ej/advance-a...
December 7, 2023 at 6:35 PM
Reposted by Ron Richman
“Computer Age Statistical Inference” by Efron and Hastie is great for learning the connections among all these (though not much on deep learning specifically). And it’s free! Can’t recommend this book highly enough:
hastie.su.domains/CASI_files/P...
October 3, 2023 at 9:34 PM