Jules
julienkrs.bsky.social
Software, management, products and usability. German/French/English. Lover of all things UX, guitar, and Arsene Wenger.
I agree that as of today, the difference is massive. It's more of a general philosophical musing on my end; I wonder whether the industry will try to go beyond transformers or earnestly look at their relative efficiency. I fear it's conceptually easier to spend that VC money on data centers, though.
January 21, 2026 at 4:46 PM
I'm saying this from a Mac that struggles to answer large-context questions in Ollama with a Qwen 14B and similar models. But the results are good-ish. Go small, I say.
January 21, 2026 at 4:38 PM
I'm wondering if the end goal shouldn't be to scale down LLMs to the point where they run competently on a standard dev laptop, with inference happening there. That way, you still sell a product license (Cursor and the like) and model upgrades, but defer the energy cost to the end user.
January 21, 2026 at 4:37 PM
Fantastic, welcome!
January 23, 2025 at 8:50 AM
The ultimate class act, in a way. Being the example you want to set.
January 9, 2025 at 10:07 AM