antirez
@antirez.bsky.social
Reproducible bugs are candies 🍭🍬
I like programming too much not to like automatic programming.
Software is created to accumulate knowledge. AI is not going to cancel this fact. Forget the idea that programs will be prompts (specifications). The details are what really matter, and they are harder to capture textually than in code. New projects will be spec + code, evolving.
February 10, 2026 at 4:40 PM
You see Anthropic releasing a "fast mode". At the same time you see Codex slowly doing in 5 minutes something that Claude Code spent 2 hours rushing to get done. Speed is very relative.
February 9, 2026 at 3:47 PM
Now Flux2.c also supports the base model (if you have the time, a >= M3, and the inclination), and different schedulers, and it is quite fun to run, also at reduced steps, since 50 are... a bit too much to wait for.
February 7, 2026 at 11:28 AM
Now flux2.c is consistently faster than official PyTorch MPS and not far from Draw Things (14s vs 19s at 1024x). DT uses 6bit quants while flux2 uses BF16 (2.7x the weights). Happy with the result so far. github.com/antirez/flux...
February 7, 2026 at 11:02 AM
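The "2.7x the weights" figure follows directly from the bit widths; a quick sanity check:

```python
# Draw Things uses 6-bit quantized weights, flux2.c uses BF16 (16 bits per
# weight), so the BF16 model streams about 2.7x the bytes per inference step.
bf16_bits = 16
quant_bits = 6
ratio = bf16_bits / quant_bits
print(f"{ratio:.1f}x")  # -> 2.7x
```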
voxtral.c decoding speed is now at ~80% of the Metal hardware limits.
February 6, 2026 at 7:00 PM
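A "% of hardware limits" figure for token decoding is usually a memory-bandwidth roofline: each generated token has to stream all the weights from memory at least once. A hedged sketch of the estimate, with illustrative numbers (these are assumptions, not measurements from the post):

```python
# Bandwidth-bound decode roofline: the best possible time per token is roughly
# (bytes of weights) / (memory bandwidth). Efficiency is ideal / measured.

def roofline_efficiency(weight_bytes, bandwidth_bytes_s, measured_ms_per_token):
    """Fraction of the memory-bandwidth roofline actually achieved."""
    ideal_ms = weight_bytes / bandwidth_bytes_s * 1000.0
    return ideal_ms / measured_ms_per_token

# Hypothetical example: a 4B-parameter model in BF16 (~8 GB of weights) on a
# GPU with 400 GB/s memory bandwidth, measured at 25 ms/token.
eff = roofline_efficiency(8e9, 400e9, 25.0)
print(f"{eff:.0%}")  # ideal is 20 ms/token -> 20/25 = 80%
```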
The HN commenters who simultaneously argue "it just decompressed GCC" and "the output is way worse than GCC" don't seem to notice they're refuting themselves. I didn't expect better than that, given the normie-alization of the site in recent times.
February 6, 2026 at 2:42 PM
I wrote a SPEED markdown file with instructions for Claude Code (Opus 4.6), with a loop process to improve the speed of the Voxtral Metal backend, and left home to pick up my daughter, have lunch, ... Back home, the code is 2x faster, and it is still going.
February 6, 2026 at 2:33 PM
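A file like that might look something like this (a hypothetical sketch of the benchmark-driven loop, not the actual SPEED file from the post):

```markdown
# SPEED.md — optimization loop for the Voxtral Metal backend

Repeat until the improvement per iteration drops below ~2%:
1. Run the benchmark and record tokens/sec as the baseline.
2. Profile to find the hottest kernel or copy.
3. Propose ONE optimization and implement it.
4. Run the full test suite; if any output changes, revert immediately.
5. Re-run the benchmark; keep the change only if it is measurably faster.

Append one log line per iteration (change, speedup, kept/reverted)
at the end of this file, so progress survives context resets.
```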
No mercy for abstractions that suck. Don't let your LLM be tempted.
February 6, 2026 at 10:53 AM
Voxtral (from @MistralAI) transcription quality is quite incredible: the way it handles the punctuation and all the rest makes transcribed audio messages so much more understandable. I implemented a few fixes in the FFT and now there is no longer a skipped tokens issue in voxtral.c
February 6, 2026 at 10:11 AM
Maybe you were wondering why serious, grown-up folks would spend nights writing a well made Commodore 64 demo or game in recent years. Now that your dear LLM can write a serious C compiler in Rust, but can't write a well made C64 game/demo, maybe you are starting to get it.
February 5, 2026 at 11:45 PM
Yesterday @MistralAI released an open weights transcription model able to work in real time, Voxtral Mini 4B. Today, following the Whisper.cpp lesson, here is a C inference pipeline ready to use as a library, I hope you'll enjoy it:

github.com/antirez/voxt...
February 5, 2026 at 9:00 PM
Mmmm... 44ms per token. Not stellar but nice.
February 5, 2026 at 3:10 PM
Anthropic ads post: they realized that with Claude Code they have a solid business model and that an ads model doesn't compare. They made a business decision, and used what they refused (ads) to improve the company image. As simple as that.
February 4, 2026 at 10:37 PM
I'm pretty confident that Claude Code, since it spawns multiple sub-agents for sub-tasks, performs worse.
February 4, 2026 at 9:02 PM
Italian is an intrinsically understandable language, by people and machines.
February 4, 2026 at 4:12 PM
Explain one thing to me: how can a sane human being who has seen Claude Code / Codex / ... at work for a few months have *any* doubt about the ability of those systems to work as a personal assistant, given enough access to your stuff? It requires far fewer capabilities than the other main task.
February 4, 2026 at 2:21 PM
With this prompt trick, you can use flux.2 4b klein (distilled) as a handy and powerful super resolution model. In the example 128x128 -> 1024x1024.
February 3, 2026 at 11:28 PM
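The post doesn't show the trick itself, but an img2img-style pipeline first needs the 128x128 input brought up to the model's 1024x1024 working resolution (a factor of 8). A dependency-free nearest-neighbor upscale sketch of that preparatory step; the prompt and the flux.2 call are assumptions, not shown here:

```python
def upscale_nearest(pixels, factor):
    """Nearest-neighbor upscale of a 2D pixel grid (list of rows) by an
    integer factor. E.g. a 128x128 image with factor=8 becomes 1024x1024,
    matching the resolutions mentioned in the post."""
    out = []
    for row in pixels:
        wide = [p for p in row for _ in range(factor)]
        # copy the widened row `factor` times (fresh lists, no aliasing)
        out.extend([list(wide) for _ in range(factor)])
    return out

img = [[0, 1],
       [2, 3]]
big = upscale_nearest(img, 2)
# big is 4x4: [[0,0,1,1],[0,0,1,1],[2,2,3,3],[2,2,3,3]]
```

The diffusion model then refines this blocky enlargement into real detail; the upscale just gives it the right canvas to start from.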
Welcome to TgTerm: run your coding agents via Telegram for fun and profit:

github.com/antirez/tgterm
February 3, 2026 at 10:45 AM
Remembering to run stuff with tmux, tunnels, Termux or another SSH client on the phone is way too much. So I thought I could build a Telegram bot that gives me access to terminals running on my computer when I only have my phone around. Security must be handled. I believe I'm going to use it.
February 2, 2026 at 10:45 PM
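A hedged sketch of the core idea: a Telegram bot that relays allowlisted messages to a local shell over the standard Bot API long-polling endpoints (`getUpdates`, `sendMessage`). The function names and the single-user allowlist are assumptions for illustration; the real project is at github.com/antirez/tgterm.

```python
# Minimal Telegram -> local shell relay sketch (stdlib only).
# SECURITY: anyone who can message an unprotected bot could run commands,
# hence the hard allowlist check before anything is executed.
import json, subprocess, urllib.parse, urllib.request

API = "https://api.telegram.org/bot{token}/{method}"

def is_allowed(chat_id, allowlist):
    """Only explicitly allowlisted chat IDs may run commands."""
    return chat_id in allowlist

def run_command(cmd, timeout=30):
    """Run a shell command locally; return combined output, truncated to fit
    under Telegram's 4096-character message limit."""
    r = subprocess.run(cmd, shell=True, capture_output=True,
                       text=True, timeout=timeout)
    return (r.stdout + r.stderr)[:4000]

def poll_loop(token, allowlist):
    """Long-poll getUpdates and reply to each allowed message with the
    output of the command it contains."""
    offset = 0
    while True:
        url = API.format(token=token, method="getUpdates") \
              + f"?timeout=50&offset={offset}"
        with urllib.request.urlopen(url) as resp:
            updates = json.load(resp)["result"]
        for u in updates:
            offset = u["update_id"] + 1
            msg = u.get("message", {})
            chat_id = msg.get("chat", {}).get("id")
            if not is_allowed(chat_id, allowlist):
                continue  # silently ignore strangers
            out = run_command(msg.get("text", "")) or "(no output)"
            data = urllib.parse.urlencode(
                {"chat_id": chat_id, "text": out}).encode()
            urllib.request.urlopen(
                API.format(token=token, method="sendMessage"), data)
```

In practice you would also want per-session terminals (the "terminals running on my computer" part) rather than one-shot commands, but the allowlist-then-execute-then-reply loop is the skeleton.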
Anthropic may be close to releasing a new model; however, as things stand today, after weeks of testing, I'm totally sure Codex with GPT 5.2-xhigh is a model capable of handling way more complex tasks.
February 2, 2026 at 4:37 PM
Reposted by antirez
A video about the life and works of Norbert #Wiener, great scientist, inventor of #cybernetics and superb mathematician, told by leafing through some of his books.

youtu.be/dSPd7excxC4?...
Norbert Wiener: scienza e vita di un grande matematico sfogliandone qualche libro
YouTube video by Paolo Caressa
February 2, 2026 at 11:07 AM
The new AGENT markdown file in the Flux2.c project -> github.com/antirez/flux...
February 2, 2026 at 11:35 AM
As I did a while ago with the Claude account, I'm also canceling the ChatGPT subscription I started by mistake via iPhone, to redo it via the web site. I don't think the 30% cut is fair and, as a customer, I want to express that with my choices.
February 2, 2026 at 10:05 AM
Second time my macOS crashes because of a combination of Ghostty memory leaks + the macOS kernel's terrible handling of OOM conditions.
February 1, 2026 at 12:41 PM
"What's your workflow with coding agents?"

Well, this, among the many others.
January 31, 2026 at 3:36 PM