Rob Lowcock
@robl.bsky.social
Building great teams who build great products. Blogging on hashtagwebscale.com
After a brief exchange with a friend who can actually read documentation (unlike yours truly, who failed completely), it turns out there is an explanation in the docs (www.prisma.io/docs/orm/ref...). The LLMs haven't quite caught up yet though
Prisma CLI reference | Prisma Documentation (www.prisma.io)
August 10, 2025 at 5:27 PM
In fact it's weirder than that. Claude Sonnet gave me a detailed rundown that didn't seem to reflect what the command actually does, then when I quizzed it, it admitted it made the whole thing up and said the command doesn't exist 🤷‍♂️
August 10, 2025 at 4:25 PM
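A quick way to ground-check a claim like this is to ask the installed CLI itself rather than the model. The sketch below is only an illustration: the post doesn't name the command in question, so the subcommand argument here is a placeholder. It shells out to `npx prisma --help` and checks whether a given subcommand appears in the CLI's own listing.

```ts
// check-prisma-command.ts
// Minimal sketch: verify that a Prisma CLI subcommand actually exists before
// trusting an LLM's description of it. The default subcommand below ("db") is
// just a placeholder — the original post doesn't say which command was involved.
import { execSync } from "node:child_process";

const subcommand = process.argv[2] ?? "db"; // hypothetical example argument

// `npx prisma --help` prints the top-level commands the installed CLI supports.
const helpText = execSync("npx prisma --help", { encoding: "utf8" });

if (helpText.includes(subcommand)) {
  console.log(`"prisma ${subcommand}" appears in the CLI's own help output.`);
} else {
  console.log(`"prisma ${subcommand}" is not listed — the description may be hallucinated.`);
}
```

Run as `npx tsx check-prisma-command.ts <subcommand>` in a project with Prisma installed; the CLI's help output (or the linked docs page) is the source of truth, not the model's summary.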
Thanks! Will give it a go
August 10, 2025 at 4:22 PM
What's particularly fun is when it tries to implement something, runs the tests, and then, if they fail, goes back on what it implemented and tries something else. It's fascinating. I've seen it wipe out entire files that it literally just created.
August 9, 2025 at 9:47 PM
I'm sure there are plenty more examples! Certainly not wanting to diminish what LLMs do – they're incredible and fun to work with, but I'd be surprised if they've sped things up by 10x in the time we've had them
July 8, 2025 at 12:16 PM
Depends on the situation, as they can vary wildly, but here are a few examples I've encountered: constant context switching, sprawling codebases developed with high dev churn, and new features piled on without a chance to clean up. A lot comes down to management rather than just tooling.
July 8, 2025 at 12:15 PM
Ultimately LLMs are just tooling - it's still going to take a while before they're used to their full potential. But there are definitely some interesting questions as to whether they solve the real problems that slow software development down
July 8, 2025 at 11:50 AM