Jonny Williams
@delivervalue.bsky.social
Platform Specialist at Red Hat & Author of "Delivery Management: Enabling Teams to Deliver Value"

https://delivervalue.uk
Which approach do you think your organisation should take?
December 4, 2024 at 12:26 PM
This isn't just a technical challenge; it's a fundamental issue for society.

AI should expand our understanding of human potential, not restrict it. 🚀
December 4, 2024 at 12:25 PM
I strongly believe that the benefits of open source speak for themselves:
- Transparent development processes
- Community-driven bias detection
- Representation that reflects true diversity
- Democratised technology that empowers, rather than marginalises
December 4, 2024 at 12:25 PM
This is why I'm making the case for open source AI, and I hope you will too.
December 4, 2024 at 12:25 PM
🚧 Accessibility Barriers: Closed source AI results in technological gatekeeping that disproportionately impacts underrepresented communities. When AI systems are controlled by a handful of privileged tech companies, who gets to define what "normal" looks like?
December 4, 2024 at 12:25 PM
👨🏻‍⚕️ Professional Stereotyping: These AI models consistently default to narrow, discriminatory representations. A doctor is assumed male. A nurse, female. A tech leader, white and male. These aren't just harmless defaults. They're powerful examples of exclusion that reinforce workplace inequalities.
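This kind of defaulting is easy to check for yourself when a model's weights are open. Here's a minimal sketch, assuming the Hugging Face transformers library is installed; bert-base-uncased stands in as just one example of an openly released fill-mask model:

```python
from transformers import pipeline

# Probe an openly released model for gendered occupation associations.
# bert-base-uncased is used purely as an illustration; any open
# fill-mask model could be swapped in.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

sentences = [
    "The doctor said [MASK] would be there shortly.",
    "The nurse said [MASK] would be there shortly.",
]

for sentence in sentences:
    # Restrict predictions to the pronouns of interest and compare scores.
    for result in unmasker(sentence, targets=["he", "she"]):
        print(f"{sentence} -> {result['token_str']}: {result['score']:.3f}")
```

Comparing the scores for "he" and "she" across occupation templates is a crude probe, but it's one you can only run because the model is open enough to download and inspect.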
December 4, 2024 at 12:25 PM
🤫 Erased Communities: Models don't just reflect bias; they can actively erase and marginalise diverse identities. The recent David Mayer story shows how even a single name can be arbitrarily suppressed, highlighting how easily systems can be manipulated to control narratives and visibility.
December 4, 2024 at 12:25 PM
Consider the profound ways closed source AI might perpetuate harmful stereotypes...
December 4, 2024 at 12:24 PM
AI is not neutral. It's a mirror reflecting and amplifying our deepest societal biases.

Unfortunately, we have no way to interrogate most AI models or uncover what training data they consumed. There's plenty of hidden ugliness in closed source AI that we simply don't get to see. 🫠
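When training data is published openly, anyone can go and look at what's actually in it. A minimal sketch, assuming the Hugging Face datasets library; allenai/c4 is used only as an example of an openly released corpus, streamed so nothing huge is downloaded:

```python
from itertools import islice

from datasets import load_dataset

# Stream a small slice of an openly released web corpus and count how many
# documents mention a given term. allenai/c4 (English config) is only an
# example of a corpus whose contents can actually be inspected.
corpus = load_dataset("allenai/c4", "en", split="train", streaming=True)

term = "doctor"
sample_size = 1_000
hits = sum(term in record["text"].lower() for record in islice(corpus, sample_size))

print(f"'{term}' appears in {hits} of the first {sample_size} documents sampled")
```

Swap in any term or any open corpus you like; the point is that this kind of audit is simply impossible when the training data stays behind closed doors.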
December 4, 2024 at 12:24 PM
Nice post!
December 2, 2024 at 11:54 AM