Gail TB
@gailtb.bsky.social
Based in London, UK. She / her
Reposted by Gail TB
When a chatbot gets something wrong, it’s not because it made an error. It’s because on that roll of the dice, it happened to string together a group of words that, when read by a human, represents something false. But it was working entirely as designed. It was supposed to make a sentence & it did.
June 19, 2025 at 11:28 AM
Reposted by Gail TB
Chatbots — LLMs — do not know facts and are not designed to be able to accurately answer factual questions. They are designed to find and mimic patterns of words, probabilistically. When they're "right", it's because correct things are often written down, so those patterns are frequent. That's all.
June 19, 2025 at 11:21 AM
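To make the "roll of the dice" in these two posts concrete, here is a deliberately tiny sketch in Python: a bigram model that only counts which word tends to follow which in a corpus, then samples. The micro-corpus, the counts, and the generate helper are all hypothetical illustrations, nothing like how a production LLM is built, but the sample-from-a-frequency-distribution step captures the idea.

```python
import random
from collections import defaultdict

# A hypothetical micro-corpus: a true statement that is written down often,
# plus a false one that appears once.
corpus = (
    "the capital of france is paris . " * 9
    + "the capital of france is lyon . "
)
tokens = corpus.split()

# Count bigram frequencies: how often each word follows each other word.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(tokens, tokens[1:]):
    counts[prev][nxt] += 1

def generate(start, n_words=6):
    """Sample a continuation one word at a time, weighted by frequency."""
    out = [start]
    for _ in range(n_words):
        followers = counts[out[-1]]
        if not followers:
            break
        words = list(followers)
        weights = [followers[w] for w in words]
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

# Most rolls reproduce the frequent (true) pattern; now and then the dice
# land on the rare (false) one. Either way the model did its job: it made
# a plausible-looking sentence.
for _ in range(10):
    print(generate("the"))
```

Run it a few times: roughly nine rolls in ten yield the true sentence, only because truth happened to be frequent in the text it was fed; the occasional tenth roll yields a fluent falsehood, and nothing in the code can tell the difference.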
Reposted by Gail TB
A little while ago, my parents' cat Bridget went missing. As the weeks dragged on, they became extremely worried. My dad devised a way to distract himself: he began to paint Bridget's adventures, imagining her travelling through time and popping up in some of art and music's most iconic scenes.
April 6, 2025 at 10:24 AM