Matt Blake
@mattblake.uk
No code app development tutoring at 👉 https://www.planetnocode.com

Habitual side project starter 🪴

X @mattblake_uk
The future of support might not be human versus AI. It might be human plus AI.
#AICoaching #FutureOfWork #HumanConnection
September 5, 2025 at 8:46 PM
For millions without access to human coaching, this could mean 24/7 affordable support.
AI may not be ready to replace deep therapeutic relationships yet.

But the question isn't whether it will happen. It's how quickly we adapt to working alongside these tools rather than competing with them.
September 5, 2025 at 8:46 PM
This points to a fascinating division of labour emerging.

AI handling structured, goal-focused support that follows established frameworks.
Humans managing the complex, adaptive work requiring cultural sensitivity and emotional nuance.
September 5, 2025 at 8:46 PM
The research revealed something unexpected about our relationship with AI. We don't need to build rapport with machines like we do with humans. What matters most is whether we believe the technology actually works.
September 5, 2025 at 8:46 PM
Students felt psychologically safe with AI. They shared personal information without fear of judgement.

But here's where it gets interesting.

AI only worked for narrow targets. It couldn't improve broader measures like resilience or overall wellbeing, which human coaches influenced significantly.
September 5, 2025 at 8:46 PM
Maybe the real breakthrough isn't eliminating AI confabulations.

Maybe it's teaching AI to catch itself in the act, just like we do.

#ArtificialIntelligence #CognitiveScience #MachineLearning
September 4, 2025 at 7:30 PM
It's that we're better at recognising when we're making things up.

But here's the uncomfortable question: if AI is developing the same cognitive patterns that make us human, how long will our self-awareness advantage last?
September 4, 2025 at 7:30 PM
We construct memories when needed, influenced by everything we've learned since, filling gaps with what feels most plausible in the moment.

Both humans and AI construct rather than retrieve information.

The difference isn't that we're more intelligent.
September 4, 2025 at 7:30 PM
Yet the overall gist was perfectly accurate: there was a cover-up happening.

He was confabulating plausible memories based on what seemed right to him.

That's exactly what AI does when it "hallucinates."

We don't store memories like computer files either.
September 4, 2025 at 7:30 PM
During Watergate, John Dean testified about Oval Office meetings with incredible confidence and detail.

He was genuinely trying to tell the truth, but got a huge amount of it wrong - meetings that never happened, the wrong people saying the wrong things.
September 4, 2025 at 7:30 PM
In psychology, confabulations are false memories that people genuinely believe to be true - not lies, but plausible content our brains create to fill gaps, with zero awareness the details are wrong.

Sound familiar?
September 4, 2025 at 7:30 PM