Dan Simpson
@danpsimpson.bsky.social
I don’t know man, I just don’t know.
Pinned
Anyway here’s a goat from an Amish dairy.
If the true parameter is far in the tails of the prior (e.g. your prior is scaled in km but your parameter is in mm), the prior will bias your inference quite strongly until you get enough data to overcome the misspecification (sketch below this post). So it’s an “efficient use of data” thing
Gelman writes...

"[a prior of normal(0, 10) is] a statement that you are only demanding that your method work well for problems in which the true theta is in that range"

What does he mean by "works well" here, specifically?

statmodeling.stat.columbia.edu/2025/05/21/p...
Prior as data, prior as belief, prior as soft constraint, prior as unconditional distribution in a generative model | Statistical Modeling, Causal Inference, and Social Science
February 16, 2026 at 7:02 PM
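A minimal sketch of what I mean, assuming a toy conjugate normal-normal model (this is not from the linked post, just a hypothetical illustration): the true theta sits far out in the tails of a unit-scale prior, and the posterior mean stays badly biased towards the prior until n gets large enough for the data to swamp the prior precision.

import numpy as np

rng = np.random.default_rng(0)

theta_true = 50.0      # true parameter, way out in the tails of the prior
sigma = 1.0            # known observation sd
mu0, tau0 = 0.0, 1.0   # prior N(0, 1): the "wrong units" prior

for n in [1, 10, 100, 1000]:
    y = rng.normal(theta_true, sigma, size=n)
    # closed-form posterior for a normal prior with a normal likelihood
    post_prec = 1 / tau0**2 + n / sigma**2
    post_mean = (mu0 / tau0**2 + y.sum() / sigma**2) / post_prec
    print(f"n={n:4d}  posterior mean = {post_mean:6.2f}  (truth = {theta_true})")

Widening the prior to normal(0, 10) cuts the n = 1 shrinkage towards zero from 50% to roughly 1%, which is one way to read "works well for problems in which the true theta is in that range".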
Comedy works in threes. So do prior taxonomies. It’s roughly the same as the first one, “prior as data”.
February 16, 2026 at 4:05 PM
Reposted by Dan Simpson
New open source: cuthbert 🐛

State space models with all the hotness: (temporally) parallelisable, JAX, Kalman, SMC
January 30, 2026 at 4:26 PM
Anyway. That’s enough AI thoughts
February 16, 2026 at 1:47 AM
To be really fucking specific, the industrial-strength agentic frameworks are specifically undoing this deadshit choice. They are hard to build. They are application-specific. And if you try to cheat you end up re-routing 70-90% of your Series C+ raises directly to OpenAI or Anthropic.
Classical NLP when combined with data, decision processes, and engineering was already pretty powerful. The idea that you could dump 3 of the 4 parts of that recipe for an attention mechanism was and is just dumb.
February 16, 2026 at 1:43 AM
Using natural language prompting almost certainly makes LLMs a) need to be larger than they could be, b) less efficient and reliable than they could be, and c) much harder to use than they could be.
Having a linguistics background, I think there's a huge misconception when it comes to prompting LLMs: the vast majority think that coding is "hard" compared with speaking a natural language. Well, it's actually the contrary: from a utilitarian POV, fewer verbs at your disposal => easier to be specific
Generative AI is a slop generation machine by default. You have to put in a lot of work to get something of quality from it.
February 16, 2026 at 1:24 AM
Reposted by Dan Simpson
Having a linguistics background, I think there's a huge misconception when it comes to prompting LLMs: the vast majority think that coding is "hard" compared with speaking a natural language. Well, it's actually the contrary: from a utilitarian POV, fewer verbs at your disposal => easier to be specific
Generative AI is a slop generation machine by default. You have to put in a lot of work to get something of quality from it.
February 15, 2026 at 9:29 PM
Reposted by Dan Simpson
I would proffer that part of the distinction is not techies versus non-techies, but staff that do much more hands-on work versus management.

VPs and directors think that it works, junior engineers and code maintainers want to have a word.
...and the other being "AI is absolute dogshit for programming but it lets you churn out little toy programs easily and that's why non-techies think it's so powerful"
February 16, 2026 at 1:14 AM
Reposted by Dan Simpson
this is so good even if getting me to hang out IRL is usually harder than standing in the corner of a circular room
need to make everyone in San Francisco watch this video every morning and every night
February 15, 2026 at 6:58 PM
Reposted by Dan Simpson
The thing is that this could have been otherwise. Software engineering as a practice is never as solitary as these chuds believe it is. But the professionalization of coding and the rebranding into "software engineering" took us down this road.
February 14, 2026 at 2:17 PM
I’m in a Yoko Ono mood tonight and it’s sad that her music isn’t recognised as the glorious thing it is. I’m not in the deep lore yet, but if anyone wants to start, this is a wonderful place: youtu.be/FDTp2XjoJHA?...
Season Of Glass - Yoko Ono (FULL ALBUM)
YouTube video by Yoko Queen Ono
February 13, 2026 at 4:27 AM
This is the closest AI has ever been to convincing me it’s human. Being a dick in OSS development is a time-honoured tradition.
OpenClaw AI Agent submits a PR to matplotlib, the maintainer closes it because they want actual contributors to fix the issue, and the AI agent extrudes an angry screed against the maintainer.

crabby-rathbun.github.io/mjrathbun-we...
Gatekeeping in Open Source: The Scott Shambaugh Story – MJ Rathbun | Scientific Coder 🦀
February 12, 2026 at 4:17 PM
I mean sure. But also "Un garçon pas comme les autres (Ziggy)" has an _iconique_ video (even when I taught in Canada this couldn’t be a pre-lecture vid) and is an absolute banger youtu.be/8ZXJijki9ik?...
February 11, 2026 at 6:29 AM
Tonight’s DnD session was basically extremely stressful social interactions followed by what was, for all intents and purposes, “Battleship but you’ve got daggers”.

Unrelated to this skill set, I successfully navigated my health insurance being dicks today in a series of phone calls.
February 11, 2026 at 4:17 AM
Reposted by Dan Simpson
Commenting on PRs is broken. Time to just merge PRs.
February 9, 2026 at 6:59 PM
The Now/Later/Soon sequence from A Little Night Music is truly so stunning. youtu.be/STsXHiA3J9U?...
A Little Night Music: Now / Later / Soon
YouTube video by Len Cariou - Topic
February 9, 2026 at 5:02 AM
Ventured out into the cold to get the bear essentials: PrEP and blood pressure meds
February 8, 2026 at 6:16 PM
Reposted by Dan Simpson
Truly, putting commas in a file name. What marvels that never cease to astound!
February 8, 2026 at 2:41 AM
Reposted by Dan Simpson
Just put me on a flaming raft and push it out to sea.
February 8, 2026 at 2:51 AM
What if people only think AI is good at coding because they’ve only ever seen the modern AI/ML codebases, which are, to a one, spectacularly bad. (This is a subtweet, but I’ll never tell)
February 6, 2026 at 11:08 PM
I’ve never felt older than when a colleague “had never seen a makefile used as a build system before”.
February 6, 2026 at 4:55 PM
As a child I was obsessed with umbrellas. Unrelated, I also used to use a broom to process down the side of the house like I was an altar boy holding a crucifix. Later I was an altar boy.
how did you know you were a lil fruity ??
February 5, 2026 at 6:14 AM
Reposted by Dan Simpson
I knew I was evil before I knew I was fruity. come to think of it the two are totally unrelated. well I've made a fine mess of this prompt. kindergarten, though. I knew I was host to something ancient and horrible during arts & crafts
how did you know you were a lil fruity ??
February 5, 2026 at 6:09 AM