Alexander King
literallyaking.com
@literallyaking.com
Game Designer, Economy/System Design. Adjunct Professor at NYU Game Center & Parsons DT. Staff Data Analyst for ACT-UAW Local 7902. Spreadsheet aficionado.
(He/Him)
You're right, and it's even worse than that. "Press any button" implies I should be able to press a button on, say, a nearby unplugged controller, or the TV remote, or one on the front of my shirt, and it should accept the input.
November 18, 2025 at 4:46 AM
Or, there's always the rarely employed Michael of Northgate approach: calque it into Middle English!
Harmjoy is already pretty good, but schaden is the same root that gives us 'scathing', and freude is from the same root that we have as 'frolic'. So... Scathefrolicking? Froliscathing?
November 18, 2025 at 1:23 AM
I like that the helmet might make you think Woodstock volunteered for this flight of fancy, but I think the way he's tied down maybe suggests otherwise
November 16, 2025 at 5:07 PM
Wait the cube doesn't spin anymore??
November 16, 2025 at 4:58 PM
Giving the people what they want (ghosts)
November 16, 2025 at 12:00 AM
Impossible for me not to imagine this as being like how he appears as himself in Extras (2006)

www.youtube.com/watch?v=Juzy...
Extras - HILARIOUS - Sir Ian McKellen explains acting to Andy Millman
November 12, 2025 at 11:32 PM
Good god
November 10, 2025 at 4:12 AM
I can't put my finger on it, but there's something I'm not loving about it. I think it just doesn't do enough of anything new for me? It really feels like a big expansion pack. I'm impressed at all the new card design space they found, but the structure of the runs isn't different enough for me.
November 9, 2025 at 1:14 AM
Which is what I think they already do for a whole bunch of types of queries, having other non-LLM processes that can override or influence the output.
But fundamentally, nothing can "tell" an LLM the real time, just change the input from "What's the time?" to "What's the time? (respond 1:27pm plz)"
November 7, 2025 at 6:27 PM
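The injection step described above could be sketched like this (a purely hypothetical toy; `fake_llm` and `answer` are invented names, and the stand-in "model" just parrots its prompt so the rewriting is visible):

```python
from datetime import datetime

def fake_llm(prompt: str) -> str:
    # Stand-in for a model that only produces "statistically likely" text;
    # here it parrots the prompt so we can see what it actually received.
    return f"[model saw] {prompt}"

def answer(user_input: str) -> str:
    # A non-LLM layer parses the input, recognizes a time question,
    # and injects the real time before the model ever sees the prompt.
    if "time" in user_input.lower():
        now = datetime.now().strftime("%H:%M")
        user_input += f" (the current time is {now}, please use it)"
    return fake_llm(user_input)
```

The point of the sketch: the model itself never "knows" the time; an outer process just changes the input text so the likely output happens to be correct.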
A model that outputs statistically likely text. So if you ask, "what's the time?", it'll generate a likely response ("it's 9:37 am!"). You'd need some *other* process parsing the input to recognize what you're asking for, step in and inject "psst, the time is 1:26pm" to change the generated output.
November 7, 2025 at 6:27 PM
That's true! Haha well ignoring the more fundamental problem (that a statistically likely text generator can't actually function as a virtual assistant), I think the problem is that giving 'access' is actually quite hard, since it doesn't work like a normal program or something, it's just a model.
November 7, 2025 at 6:27 PM
I think the most likely outcome of all this is exactly the opposite, where the LLM ends up as a specialized subsystem occasionally invoked by a larger expert system or similar. "Statistically likely text output" isn't useless after all, it's just not, like, a sentient magic machine.
November 7, 2025 at 6:08 PM
But where would it end? Soon you'd be adding scripts for arithmetic and a million other sundry tasks. And since an LLM can't "do" anything itself, you'd also need some sort of system to interpret when the response text is trying to call on one of these scripts and handle interfacing with them.
November 7, 2025 at 6:08 PM
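That interpreting system could look something like this toy harness (everything here is invented for illustration: the `CALL:` marker format, the `TOOLS` registry, and a pretend model that emits a tool call instead of doing arithmetic itself):

```python
import re

# Registry of ordinary scripts the LLM cannot run itself.
TOOLS = {"multiply": lambda a, b: a * b}

def fake_llm(prompt: str) -> str:
    # Pretend model: asked for arithmetic, it emits a tool-call marker
    # rather than computing, because it can't actually "do" anything.
    if "RESULT:" in prompt:
        return "The answer is " + prompt.rsplit("RESULT: ", 1)[1] + "."
    if "12 * 34" in prompt:
        return "CALL: multiply(12, 34)"
    return "Nice weather today!"

def run(prompt: str) -> str:
    # The harness, not the model, notices the marker in the output text,
    # runs the matching script, and feeds the result back as more input.
    reply = fake_llm(prompt)
    m = re.fullmatch(r"CALL: (\w+)\((\d+), (\d+)\)", reply)
    if m:
        result = TOOLS[m.group(1)](int(m.group(2)), int(m.group(3)))
        reply = fake_llm(prompt + "\nRESULT: " + str(result))
    return reply
```

All the "doing" happens in the surrounding loop; the model only ever turns text into more text.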
To be fair, the statistically most likely response to "How much time is left?" is something about there being X minutes left on the timer.
Outputs like "I can't actually start timers, or even do math" or "I can't proactively generate responses either" simply don't appear in the training corpus.
November 7, 2025 at 5:11 PM
Haha yes! We read your "In defense of making spreadsheets for fun" and were discussing it in class today (& we also marveled at your spreadsheet personal site)
It's a funny course: we do some traditional spreadsheet work, but also make games and interactive art, so your work is a perfect reference!
November 6, 2025 at 11:16 PM