Rahul Sethi
@sethi.dev
Londoner. Previously Principal Engineer @Cloudflare Workers
Then “millions of retirees” is surely an objective lie
November 7, 2025 at 6:29 PM
I see, it’s too late for _Labour_ to start, in their own interests. That’s less horrifying.
November 7, 2025 at 11:30 AM
I refuse to believe it’s too late to start (countering the narrative) now because what’s the alternative??
November 7, 2025 at 11:25 AM
Catchy
October 31, 2025 at 5:55 PM
At this stage of their power grab, the two are plausibly identical. By the time they prove they’re fascists beyond doubt, it’ll be too late
October 31, 2025 at 1:04 PM
Billion. With a B.
October 28, 2025 at 2:43 PM
Watch how the supplier is a shell company connected to their mates/donors
October 24, 2025 at 3:03 PM
By pulling themselves up by their own bootstraps. Brainlessly.
October 7, 2025 at 11:42 AM
@princecharlescinema.com get this screening in the UK
September 25, 2025 at 4:09 PM
Have you heard of the “no true Scotsman” fallacy?
September 23, 2025 at 9:30 PM
Wind, Nuclear, France really got me 😂 came down to say the same thing!
September 6, 2025 at 10:49 PM
This was more fun than I thought it’d be haha
August 20, 2025 at 3:23 PM
I really do wish we prioritised privacy and running these models locally, on the device
June 27, 2025 at 12:35 PM
For anyone getting close to considering it: youtu.be/FuGiqCXKDlk?...
Bret Weinstein and Tucker Carlson Have a Playdate
YouTube video by Professor Dave Explains
June 18, 2025 at 10:49 AM
Reposted by Rahul Sethi
this might be orthogonal to the actual OP, but i really worry how often people are sold on the performance of intellect rather than its outcomes, and maybe this worry is misplaced
June 18, 2025 at 4:54 AM
Confidence could be improved by footnote-style citations for *each point* within the paragraph, with links including `#:~:text=…` fragments pointing at the specific passage. But the (current) vague citations and 404s indicate they’re optimising for the perception of confidence instead
May 29, 2025 at 2:50 PM
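For concreteness, a rough sketch of the kind of deep link I mean – the `citationLink` helper and the example URL are made up for illustration:

```ts
// Build a link that jumps to – and highlights – the quoted passage in
// browsers that support URL text fragments (the #:~:text= syntax).
function citationLink(pageUrl: string, quote: string): string {
  return `${pageUrl}#:~:text=${encodeURIComponent(quote)}`;
}

// citationLink("https://example.com/report", "emissions fell 4% in 2023")
// => "https://example.com/report#:~:text=emissions%20fell%204%25%20in%202023"
```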
When ChatGPT has a source at the end of a paragraph, they know they’re leaning on this pattern. Several times I’ve clicked these links and could not find the cited info. Often they’re just 404s, which is laughable for a *live* source (not training material) and demonstrates their abuse of the pattern
May 29, 2025 at 2:50 PM
Search engines have implemented many assurances for source confidence: they strip HTML content hidden by CSS, reindex often enough to avoid rug pulls, and don’t rephrase (beyond tense), i.e. don’t hallucinate. That is the expected standard now – it could be better, but I don’t want to undersell it either
May 29, 2025 at 2:50 PM
I hate that my brain needed to work this out *sigh* Infinity[3] true[0] false[3]
May 29, 2025 at 1:50 PM
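A quick sketch of how I read it, assuming the puzzle indexes into the string forms of the values (indexing the raw number/booleans directly would just give `undefined`):

```ts
// Stringify first, then index – each expression picks out one character.
String(Infinity)[3]; // "i"  ("Infinity": I n f i …)
String(true)[0];     // "t"
String(false)[3];    // "s"  ("false": f a l s e)
```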
Exactly! No gloves, intentional and more athletic
May 15, 2025 at 1:28 AM
Not even ‘the’ government. Just government in general. Screams of internal delusion – they’ll never allow themselves to see they’re wrong
May 9, 2025 at 9:51 AM
This is one of the least controversial things in maths. There are at least 3 proofs that are simple and intuitive like your example. Wait until she hears 1 + 2 + 3 + 4 + … = -1/12
April 25, 2025 at 5:39 PM
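(For anyone curious where the -1/12 comes from: the series itself diverges; the value is the analytic continuation of the Riemann zeta function evaluated at s = -1, not an ordinary sum.)

```latex
% Zeta is defined by the series only for Re(s) > 1; the -1/12 is its
% analytically continued value at s = -1.
\[
  \zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^{s}} \quad (\operatorname{Re} s > 1),
  \qquad
  \zeta(-1) = -\frac{1}{12}
\]
```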
What I love about that episode is that it holds a mirror up to real human behaviour. It’s like: given this sci-fi future tech exists, where does that existing behaviour lead? There are some great episodes from this season of flawed humans using tech awfully. I personally think every episode was amazing
April 13, 2025 at 7:38 PM
Doesn’t everyone do this and only sometimes catch themselves before posting, or is this another ADHD thing I don’t know about…
April 3, 2025 at 11:30 PM