J. Llarena
@ll4r3n4.bsky.social
Independent Researcher, NLP/ASR Engineer looking for a PhD position in Computational Neuro/Psycho/linguistics. He/him.

josellarena.github.io
This has always irked me too. The closest thing I can think of is the complex behaviour of some insects (like the Sphex wasp, en.wikipedia.org/wiki/Sphex), which is fine-tuned to the environment but doesn't result in permanent behavioural changes. D. Dennett called this mindless intelligence "sphexishness".
November 16, 2025 at 3:23 PM
Reposted by J. Llarena
Read the cookbook: arxiv.org/abs/2510.00368

Join us for weekly seminars on formal language theory, ML, NLP, and more: flannseminars.github.io
October 3, 2025 at 4:24 PM
Reposted by J. Llarena
There is no better way to understand what transformers can do than to get your hands dirty and construct them, weight-by-weight. The Transformer Cookbook provides a guide for anyone aiming to understand the expressive power of transformers on such a formal level.
October 3, 2025 at 4:24 PM
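To give a flavour of what "constructing transformers weight-by-weight" means, here is a toy sketch (not taken from the Cookbook): a single attention head whose weight matrices are chosen by hand so that every position attends to the first (BOS) token, a standard building block in formal analyses of transformer expressivity. All matrices and dimensions here are made up for illustration.

```python
import numpy as np

d = 4
# One-hot token embeddings; row 0 plays the role of a BOS token.
seq = np.array([[1, 0, 0, 0],
                [0, 1, 0, 0],
                [0, 0, 1, 0]], dtype=float)

# Hand-constructed weights: every query is the all-ones vector,
# and only the BOS embedding produces a large key, so after softmax
# essentially all attention mass lands on position 0.
W_q = np.ones((d, d))           # each one-hot row maps to [1, 1, 1, 1]
W_k = np.zeros((d, d))
W_k[0] = 10.0                   # large key only for the BOS embedding
W_v = np.eye(d)                 # values are just the inputs

Q, K, V = seq @ W_q, seq @ W_k, seq @ W_v
scores = Q @ K.T / np.sqrt(d)
weights = np.exp(scores) / np.exp(scores).sum(-1, keepdims=True)
out = weights @ V               # every row is (almost exactly) seq[0]
```

With the score gap of 20 between the BOS position and everything else, the softmax is effectively an argmax, so the head copies the BOS embedding to every position.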
Neural methods haven't taken over as much as you'd think, especially in smaller companies that can't afford large-scale data and compute. OpenFST is still used directly in hybrid DNN-HMM ASR models, and also indirectly, through Pynini, in (inverse) text normalisation, which both TTS and ASR need.
August 16, 2025 at 12:40 PM
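For readers unfamiliar with inverse text normalisation (ITN): it rewrites spoken-form ASR output ("twenty three") into written form ("23"). Production systems do this with weighted FSTs composed via OpenFST/Pynini; the dict-based rewriter below is only a toy sketch of the idea, with all vocabularies and function names invented for illustration.

```python
# Toy ITN: rewrite number words to digits, pass everything else through.
UNITS = {"zero": 0, "one": 1, "two": 2, "three": 3, "four": 4,
         "five": 5, "six": 6, "seven": 7, "eight": 8, "nine": 9}
TENS = {"twenty": 20, "thirty": 30, "forty": 40, "fifty": 50,
        "sixty": 60, "seventy": 70, "eighty": 80, "ninety": 90}

def itn(tokens):
    """Greedy left-to-right rewrite of spoken-form number words."""
    out, i = [], 0
    while i < len(tokens):
        t = tokens[i]
        if t in TENS:
            # Combine "twenty three" -> "23" when a unit follows.
            if i + 1 < len(tokens) and tokens[i + 1] in UNITS:
                out.append(str(TENS[t] + UNITS[tokens[i + 1]]))
                i += 2
                continue
            out.append(str(TENS[t]))
        elif t in UNITS:
            out.append(str(UNITS[t]))
        else:
            out.append(t)
        i += 1
    return " ".join(out)

itn("call me at twenty three past five".split())
# -> "call me at 23 past 5"
```

A real Pynini grammar would express the same rewrite as a transducer and compose it with verbalisation and punctuation rules, which is what makes the FST approach attractive: the rules stay declarative and invertible between TTS normalisation and ASR ITN.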
mirror.aclweb.org
June 6, 2025 at 6:57 PM