Hervé Eulacia
@eulacia.bsky.social
Promote the growth of knowledge and honor the institutions that safeguard the freedom to criticize.

eulacia.com
At this point you may also object that this is not clear enough to read. And if you do, I suspect your definition of legibility may be a bit too puritanical 😊
July 25, 2025 at 7:01 PM
yes!
April 14, 2025 at 12:45 AM
They quote books that they never really understood and lack the culture to grasp that the reason why intelligence is, to the best of our current knowledge, substrate independent is also the reason why AGIs will necessarily be people.

Most of these guys are plain dorks.
April 14, 2025 at 12:44 AM
Altman never understood Deutsch with any depth. His invoking one of Deutsch's books to mix up creativity with induction is a good example of how shallow he is… same for Thompson. These guys don't know anything about epistemology, and clearly were never interested in it.
April 14, 2025 at 12:36 AM
It would be hard to go back to Google. Kagi offers a significant improvement in the quality and clarity of search results, which makes searching more efficient.
March 11, 2025 at 4:39 PM
I see, but it's a solvable issue. Correlation engines face two intractable issues: inexplicit knowledge and the limits of induction.
March 7, 2025 at 10:46 AM
No, a lot of the things we know are not explicit. Nor are they even conscious, actually. Language is just a distillation of what we know. And you can't seriously hope to reverse-engineer human intelligence from what humans say or write.
March 7, 2025 at 2:57 AM
Even the early web was fraught with liabilities for hosts.
February 26, 2025 at 2:42 AM
Building it would result in way more downsides than upsides for the owner.
February 26, 2025 at 2:19 AM
Some people do miss you on X, btw… I told them you're here. The best will follow!
February 22, 2025 at 12:08 PM
The bad epistemology and weak science of Jonathan Haidt win again 🙄
February 1, 2025 at 5:14 PM
Ever looked into Kagi.com? They offer most LLMs. You can tweak each one for specific uses and compare their results instantly… the dream!
February 1, 2025 at 12:57 PM
Also remarkable:

1^3+2^3+3^3+4^3+5^3+6^3+7^3+8^3+9^3=2025
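
One way to see why this works, via the classical sum-of-cubes identity (a quick check added here, not part of the original post):

\[ \sum_{k=1}^{n} k^3 = \left(\sum_{k=1}^{n} k\right)^2 \;\Longrightarrow\; \sum_{k=1}^{9} k^3 = (1+2+\cdots+9)^2 = 45^2 = 2025 \]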
January 2, 2025 at 12:25 PM
2025 will be the only square year of our lifetime; the next one is 2116.
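
A quick check of the arithmetic:

\[ 45^2 = 2025, \qquad 46^2 = 2116 \]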
January 1, 2025 at 6:19 PM
Communicating with anyone in the world is about as hard as convincing your best friend to read a single paragraph of David Deutsch.
December 9, 2024 at 11:30 AM
Select the file in the iCloud folder and choose Keep Downloaded. It's now an option.
December 7, 2024 at 3:18 PM
Come back to podcasting soon!
December 4, 2024 at 10:43 PM