it's funny to me that the same folks who yell about how people in tech need to take classes in the humanities simultaneously get upset whenever the humanities start getting applied to tech
February 11, 2026 at 11:37 PM
I understand why people are exhausted by AI hype, and why those of us squarely in the corner of "human dignity uber alles" see AI doomerism as self-serving hype, but I *really* think people on the left broadly need to start thinking seriously about the possibility of the hype being... true.
imho the denialists are actually, literally in denial, with the consequence that they are constantly throwing off statements that might as well be cooked in a lab to generate maximum friction with reality
February 11, 2026 at 10:35 PM
it’s Not Great that one of the leading AI denialists is an academic coping with her entire life’s work being a waste
my position is that intelligence is decomposable into families of capabilities, a few of which LLMs have but humans do not and many of which humans have but LLMs do not.
I have very strong and, I believe, well-justified scientific opinions about what LLMs have and don't have and how that relates to human cognition, but to claim that anything about this is settled, inarguable, or not debatable is in fact super wild, and people should be real careful about it.
"LLMs raise absolutely no philosophical questions whatsoever, because the issue of what intelligence consists of was settled a long before they existed" is an insane statement even if you cede the incorrect "it was a settled question" point to the other side.
February 11, 2026 at 4:43 PM