NateA
@kurosawafan.bsky.social
Hiking, Movies, Philosophy. I teach philosophy, film studies, and the humanities at Eckerd College.
“Trying to fix the proper meaning in our minds is like coaxing the gull to settle in the rigging, with the rule that the gull must be alive when it settles: one must not shoot it and tie it there.” Collingwood, Principles of Art, 2/2
May 31, 2025 at 1:40 PM
“"the proper meaning of a word … is never something upon which the word sits perched like a gull on a stone; it is something over which the word hovers like a gull over a ship's stern…” 1/2
May 31, 2025 at 1:39 PM
LOL
May 12, 2025 at 3:01 PM
Pope John Paul II wrote his dissertation on Scheler and was deeply influenced by phenomenology.
May 9, 2025 at 12:04 PM
Reposted by NateA
And nobody laments the loss of thinking. It's just on me to invent ever more creative ways to trick them into critical thinking and away from using the machine.
May 6, 2025 at 12:29 PM
Ear Window, Tar Wars
May 3, 2025 at 1:51 PM
But LLMs (assuming that’s what we are talking about) don’t relate to the world. They are algorithms producing strings of text based on their training data, which lets them find statistical correlations in word patterns and make next-token predictions. There is no entity there to have an experience.
April 28, 2025 at 2:44 PM
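A minimal toy sketch of the next-token prediction the post describes (Python, with a made-up bigram corpus; this is an illustration of the general idea, not any real model's implementation): count which words follow which in the training text, then emit the statistically most likely follower.

from collections import Counter, defaultdict

# Toy "training data": bigram counts stand in for learned word-pattern correlations.
corpus = "the gull hovers over the ship and the gull settles on the stern".split()
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next token after `word`."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(predict_next("the"))   # "gull" -- the most frequent follower of "the"
print(predict_next("gull"))  # "hovers" -- ties resolve by first occurrence

The point of the sketch: the prediction comes entirely from frequencies in the text, with nothing in the loop that perceives or relates to a world.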
Animals with a similar range of sensory capacities are likely to share a roughly similar set of sensory experiences. Animals like bats that have sensory capabilities we lack are likely to have very different experiences—but there is no good reason to think they don’t experience. 2/3 (need one more!)
April 28, 2025 at 2:37 PM
How can I be sure there is something it is like to be you? I can’t, but from analogy with my own experience, from the similarity in our physiology, and from our capacity to communicate, I can be reasonably sure that you also encounter a world roughly like mine. From there I extend the analogy to animals. 1/2
April 28, 2025 at 2:33 PM
I don’t think that’s what Nagel’s article says. Rather: we CAN be reasonably confident there is something it is like to be a bat, but what we can’t know is what that is like. In the case of AI chatbots we have absolutely no reason to assume there is anything it is like to be them.
April 28, 2025 at 1:34 PM