#regularization
There's not so much use for conjugate priors these days tbh. Better to spend the time figuring out posterior predictive checks. Any intuitions you have about marginal effects will probably be helpful too. And the loose equivalence between priors on parameters and regularization may also be helpful.
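That loose equivalence is easy to demonstrate numerically: a Gaussian prior w ~ N(0, τ²I) turns the MAP estimate into ridge regression with penalty λ = σ²/τ². A minimal sketch (all data and values invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([2.0, -1.0, 0.5])
sigma, tau = 0.5, 1.0          # noise std, prior std (assumed values)
y = X @ w_true + sigma * rng.normal(size=50)

# MAP under w ~ N(0, tau^2 I): minimize ||y - Xw||^2/(2 sigma^2) + ||w||^2/(2 tau^2),
# which is exactly ridge regression with lam = sigma^2 / tau^2
lam = sigma**2 / tau**2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# gradient descent on the MAP objective converges to the same point
w = np.zeros(3)
for _ in range(5000):
    grad = (X.T @ (X @ w - y)) / sigma**2 + w / tau**2
    w -= 1e-3 * sigma**2 * grad   # step rescaled for stability
print(np.allclose(w, w_ridge, atol=1e-6))  # True
```

Tightening the prior (smaller τ) raises λ and shrinks the estimate harder — the same knob, two vocabularies.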
November 7, 2025 at 11:04 PM
If there had been no border crisis with 100ks of migrants -- many from outside the Americas -- released into the interior, I think political rhetoric surrounding regularization of existing long-term migrants/Dreamers and more permissive rules for Mexicans/CentAms would have been *wayyy* less heated.
November 10, 2025 at 5:07 PM
(5/n) With this, we can run coarse-grained Langevin dynamics directly, without the need for any priors or force labels.

This works across biomolecular systems including fast-folding proteins like Chignolin and BBA.

Here is a comparison with and without our regularization:
November 6, 2025 at 2:41 PM
I'm also really curious what's happening here. A push for regularization has been a firm part of many advocacy campaigns, particularly for those on precarious temporary permits. We'll see who is included!
November 4, 2025 at 10:12 PM
Imaging-based models are being used to assess #BrainAge, but can they predict #NeurologicalDisease? @ritterkerstin.bsky.social &co show that simpler models, with lower age prediction accuracy, are paradoxically more sensitive to disease-related brain changes @plosbiology.org 🧪 plos.io/47NcIew
October 29, 2025 at 2:07 PM
I need to get used to the fact that Neural Networks with Dropout regularization can have a better validation loss than training loss. #machineLearning 🤖🧠
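The effect is mechanical: with inverted dropout, noise is injected while computing the training loss but switched off at validation, so the validation network is the stronger one. A toy illustration (made-up data, not any particular framework):

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(x, w, p_drop=0.0, train=False):
    """One linear layer with inverted dropout on the input."""
    if train and p_drop > 0:
        mask = rng.random(x.shape) > p_drop
        x = x * mask / (1 - p_drop)   # inverted scaling keeps E[x] unchanged
    return x @ w

x = rng.normal(size=(1000, 20))
w = rng.normal(size=(20, 1))
y = x @ w                             # targets from the noiseless model

train_pred = forward(x, w, p_drop=0.5, train=True)   # dropout active
val_pred = forward(x, w, train=False)                # dropout off
train_loss = np.mean((train_pred - y) ** 2)
val_loss = np.mean((val_pred - y) ** 2)
print(train_loss > val_loss)  # True: the noisy train-time network scores worse
```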
October 27, 2025 at 8:27 AM
pfft, ML people are way ahead of you - call it weight decay, or, if you want to be really old school Tikhonov regularization.
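For plain SGD the two names really are the same update, which a few lines verify (values invented; note the equivalence breaks for adaptive optimizers like Adam, hence decoupled "AdamW"-style weight decay):

```python
import numpy as np

rng = np.random.default_rng(2)
w = rng.normal(size=5)
grad = rng.normal(size=5)   # stand-in gradient of the unpenalized loss
lr, lam = 0.1, 0.01

# (a) "weight decay": shrink the weights, then take the plain gradient step
w_decay = (1 - lr * lam) * w - lr * grad

# (b) Tikhonov / L2 penalty: gradient step on loss + (lam/2) * ||w||^2
w_l2 = w - lr * (grad + lam * w)

print(np.allclose(w_decay, w_l2))  # True: identical under plain SGD
```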
October 25, 2025 at 3:18 AM
🧠 Regularization, Action, and Attractors in the Dynamical “Bayesian” Brain

direct.mit.edu/jocn/article...

(still uncorrected proofs, but they should post the corrected one soon--also OA is forthcoming, for now PDF at brainandexperience.org/pdf/10.1162-...)
Regularization, Action, and Attractors in the Dynamical “Bayesian” Brain
Abstract. The idea that the brain is a probabilistic (Bayesian) inference machine, continuously trying to figure out the hidden causes of its inputs, has become very influential in cognitive (neuro)sc...
direct.mit.edu
October 22, 2025 at 8:59 AM
Singular Value-based Atmospheric Tomography with Fourier Domain Regularization (SAFR). Lukas Weissinger et al. https://arxiv.org/abs/2510.19542
October 23, 2025 at 4:56 AM
New on arXiv: Why noise matters in learned regularization!
Data-driven regularizers fail without noise structure. 🎯
Tikhonov, Lavrentiev, quadratic — not equal under non-white noise.
🤫 Spoiler: Optimal reconstruction needs proper noise modeling.
👉 Regularizer choice = structure and noise.
Sebastian Banert, Christoph Brauer, Dirk Lorenz, Lionel Tondji
Why the noise model matters: A performance gap in learned regularization
https://arxiv.org/abs/2510.12521
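Two of the regularizers named there already differ as plain linear reconstructions — a minimal sketch of that (toy symmetric operator and white noise, not the paper's setting):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20
# symmetric positive definite forward operator (Lavrentiev requires this)
B = rng.normal(size=(n, n))
A = B @ B.T / n + 0.1 * np.eye(n)
x_true = rng.normal(size=n)
y = A @ x_true + 0.05 * rng.normal(size=n)
lam = 0.1

x_tik = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)  # Tikhonov
x_lav = np.linalg.solve(A + lam * np.eye(n), y)              # Lavrentiev
print(np.allclose(x_tik, x_lav))  # False: different filter factors, different estimates
```

In the eigenbasis of A, Tikhonov damps each component by a/(a² + λ) while Lavrentiev uses 1/(a + λ) — equal only at eigenvalue a = 1, so how each interacts with structured noise differs too.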
October 15, 2025 at 6:40 AM
7/8 The takeaway for the public: training choices like entropy regularization can make systems more robust, so there are fewer restarts and less costly retraining when the world shifts. This means your learning systems are more durable and efficient.
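Entropy regularization in that sense adds a bonus term (objective + β·H(π)) that rewards keeping the policy spread out rather than collapsing early. A toy sketch of the term itself, not the thread's actual setup:

```python
import numpy as np

def entropy(logits):
    """Shannon entropy of a softmax policy; added as a weighted bonus
    to the RL objective to discourage premature determinism."""
    z = logits - logits.max()            # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return -np.sum(p * np.log(p + 1e-12))

print(entropy(np.zeros(4)))                    # uniform policy: maximal, ln 4 ≈ 1.386
print(entropy(np.array([10., 0., 0., 0.])))    # near-deterministic policy: ≈ 0
```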
October 7, 2025 at 6:34 PM
How well can we map its surface without making any assumptions (often called regularization in the field of image reconstruction)? We inject different levels of noise into our simulated data and optimize to find the best-fitting map. At high signal-to-noise we basically recover a perfect map! (9/N)
October 1, 2025 at 7:42 AM
me, thinking about immigration these days:
September 26, 2025 at 10:11 PM
I tell students: You don’t need to memorize every ML algorithm. But you do need to understand:
— loss functions
— overfitting
— regularization
Master the “why,” not just the “how.”

#DataScience #MachineLearning #Statistics #AI #RStats
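The "why" linking those three fits in a few lines of numpy — a toy ridge-regression sketch (all values invented): a flexible model drives the training loss down by inflating its coefficients, and the regularizer pushes back.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(-1, 1, 15)
y = x**2 + 0.1 * rng.normal(size=15)    # quadratic truth plus noise
X = np.vander(x, 12)                    # degree-11 features: built to overfit

def ridge(lam):
    """Minimize the loss ||y - Xw||^2 + lam * ||w||^2 (closed form)."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_over = ridge(1e-8)   # essentially unregularized: huge, wiggly coefficients
w_reg = ridge(1e-2)    # regularized: coefficients shrunk toward zero
print(np.linalg.norm(w_over) > np.linalg.norm(w_reg))  # True: shrinkage
```

Same loss function, two fits: the penalty trades a little training error for much tamer behavior between the data points.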
September 18, 2025 at 6:08 PM
there’s no linguistics work on this since 1953 btw. wiktionary & others claim it’s an over-regularization following “eaten” but I have a theory rn that it’s to avoid confusion with “drunk,” as in alcohol, which seems not that different from the tomar/beber regional distinction (in motivation)
September 19, 2025 at 3:44 AM
“Equal rights”: thousands of immigrant workers demand equality and regularization
“Equal rights”: thousands of immigrant workers demand equality and regularization
Thousands of immigrants demonstrated in Lisbon to demand documents and an end to the injustices in the regularization process. Illegal detentions and problems with the Schengen area flagging system are two...
www.esquerda.net
September 17, 2025 at 9:21 PM
Michael Potter, Stefano Maxenti, Michael Everett
Robust Survival Analysis with Adversarial Regularization. (arXiv:2312.16019v1 [stat.ML])
http://arxiv.org/abs/2312.16019
December 27, 2023 at 5:00 AM
various regularization terms, we derive corresponding closed-form solutions while ensuring the $L$-smooth adaptable property still holds for any $L\ge 1$. Numerical experiments on graph-regularized clustering and sparse [7/8 of https://arxiv.org/abs/2503.02386v1]
March 5, 2025 at 6:17 AM
Ziyang Wu, Jingyuan Zhang, Druv Pai, XuDong Wang, Chandan Singh, Jianwei Yang, Jianfeng Gao, Yi Ma
Simplifying DINO via Coding Rate Regularization
https://arxiv.org/abs/2502.10385
February 17, 2025 at 6:19 AM
"Finally, by introducing several modifications (query specific regularization, disjoint encoders etc.), we are able to improve efficiency, achieving latency on par with BM25 under the same computing constraints."
github.com/naver/splade
GitHub - naver/splade: SPLADE: sparse neural search (SIGIR21, SIGIR22)
SPLADE: sparse neural search (SIGIR21, SIGIR22). Contribute to naver/splade development by creating an account on GitHub.
github.com
June 6, 2024 at 2:18 PM
That really shouldn't matter. Bonus if you have the expired passport, but what really matters is that you're probably in the system.
Go here:
soniadiazmexico.com/other-visas/

Scroll past most of it until you see Regularization. Read up on it.
February 18, 2025 at 3:37 AM
It is not a universal amnesty but a particular regularization for very established people who are part of our communities. It could help up to 500,000 adults and 50,000 children.
June 19, 2024 at 4:37 PM