Deb Sahu
@debashishsahu.com
🧲NMR/Biophysics PhD🧬Science🧪Nerd🫙Freshnox - Cofounder🧊S3 Tech Innovations - Founder & CEO🍛Foodie📡IoT enthusiast👨🏽‍💻Coder👨🏽‍🔧Tinkerer/maker▶️Ex-YouTuber〽️Amateur Trader⚡️EV🚗Dog Dad🐶Smart Home🏠

🌏 https://social.debashishsahu.com

🗣️ Opinions are my own 🫢
People are upset about Opus weekly limits, and a tone-deaf Anthropic is saying this:
🔗 From the Anthropic community on Reddit (www.reddit.com)
October 2, 2025 at 1:57 AM
Yes, that was my second choice!
October 2, 2025 at 1:07 AM
If you’re tired of swapping summer/winter tires, the CrossClimate 2 is the answer. Summer tire handling, winter tire snow grip, all-weather confidence. Best tire upgrade I’ve made for the Mach-E. Highly recommend!
October 2, 2025 at 12:25 AM
The engineering is wild: advanced rubber compound stays flexible in extreme heat AND cold, V-shaped directional tread for snow/rain performance, plus PIANO noise reduction technology. They’re shockingly quiet on the highway!
October 2, 2025 at 12:25 AM
It’s so refreshing to see it actually brainstorm ideas, not just say “You are absolutely right…” There’s a lot of pushback on bad ideas, the conversations are constructive, and it defends its actions well! Dare I say it feels like talking to a senior SWE!
September 30, 2025 at 4:27 PM
Sonnet 4.5 scores 70.9% on SWE-bench, above Opus 4. Wow! It’s blazing fast as well.
September 29, 2025 at 11:06 PM
🎯 Key takeaway: Sometimes the simplest solution is the best solution.

While everyone talks about complex RAG pipelines with Pinecone & massive models, a well-designed lightweight system delivers better UX.

Perfect for startups, edge deployments, or anyone who values simplicity over complexity.
August 14, 2025 at 3:34 AM
📊 Performance results that surprised everyone:
- Response time: <200ms average
- Memory usage: <2GB RAM
- Cost: Nearly zero vs API-based solutions
- Accuracy: Comparable to heavy vector DB setups for our use case

Why this works: For small-medium datasets, JSON embeddings are perfectly sufficient!
August 14, 2025 at 3:34 AM
💡 The ultra-lightweight setup that actually works:
- Pre-computed embeddings stored as simple JSON files
- Tiny LLM for response generation (sub-1B parameters)
- Cosine similarity search in memory
- Total deployment size under 500MB
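
For anyone curious what this looks like in practice, here’s a minimal Python sketch of the “JSON embeddings + in-memory cosine similarity” step. The file name (embeddings.json), the JSON shape, and the numpy usage are my assumptions for illustration, not details from the post:

```python
# Minimal sketch of the retrieval step: pre-computed embeddings in a JSON
# file, cosine similarity computed in memory. File name and JSON layout
# ([{"text": ..., "embedding": [...]}, ...]) are assumptions for this example.
import json

import numpy as np


def load_corpus(path: str = "embeddings.json"):
    """Load pre-computed embeddings from a plain JSON file into memory."""
    with open(path) as f:
        records = json.load(f)
    texts = [r["text"] for r in records]
    matrix = np.array([r["embedding"] for r in records], dtype=np.float32)
    # Normalize rows once so cosine similarity reduces to a dot product.
    matrix /= np.linalg.norm(matrix, axis=1, keepdims=True)
    return texts, matrix


def top_k(query_embedding, texts, matrix, k=3):
    """Return the k corpus texts most similar to the query embedding."""
    q = np.asarray(query_embedding, dtype=np.float32)
    q /= np.linalg.norm(q)
    scores = matrix @ q                  # cosine similarity against every row
    best = np.argsort(scores)[::-1][:k]  # indices of the top-k scores
    return [(texts[i], float(scores[i])) for i in best]
```

The query embedding would come from the same encoder that produced the stored vectors, and the retrieved texts would then be passed to the small LLM as context for response generation.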
August 14, 2025 at 3:34 AM
That’s ~$10k of helium in one picture!
August 7, 2025 at 10:21 PM