fatherbarry.bsky.social
@fatherbarry.bsky.social
It’s recognizing that LLMs’ facsimile of intelligence is smoke & mirrors, and that real progress would need a different kind of model, one not based on text and able to contextualize arbitrary data. It’s a positive step: jumping off the hype train and re-exploring traditional ML techniques
November 20, 2025 at 3:40 PM
LLMs are “sequence transduction” models designed to do math on a sequence of tokens, like text. People frequently *represent* info as text, but the text is not the info itself.

LLM = given these tokens, predict the next token
Hypothetical world model = given raw info, predict unknown raw info
November 20, 2025 at 3:37 PM
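The “given these tokens, predict the next token” framing above can be sketched with a toy bigram model. This is a minimal illustration, not anything from the thread: the corpus, the whitespace tokenization, and the `predict_next` helper are all invented for the example, and real LLMs use learned neural representations rather than raw counts.

```python
from collections import Counter, defaultdict

# Toy corpus, tokenized by whitespace (invented for illustration).
corpus = "the cat sat on the mat the cat ran".split()

# Count how often each token follows each preceding token.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent successor of `token` in the corpus."""
    return following[token].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" twice, "mat" once
```

The point of the contrast in the post is that this machinery only ever maps token sequences to token sequences; a world model in the hypothesized sense would predict unobserved raw information rather than the next symbol in a text stream.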