Dr J. Rosenbaum
@jrosenbaum.com.au
Artist and researcher working with AI perceptions of gender. PhD, nerd, muso, they/them pronouns 🏳️‍⚧️🇦🇺
links page at minxdragon.com
Phew, fixed my handle; my domain provider moved TXT settings to an “advanced DNS” tier.
Luckily I could shoehorn in my hosting provider's DNS instead.
It’s *always* DNS
November 11, 2025 at 3:50 AM
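For context on the handle fix above: Bluesky verifies a custom-domain handle by looking up a TXT record at _atproto.<domain> whose value points at the account's DID (did=did:plc:...), which is why losing access to TXT settings breaks verification. Below is a minimal sketch of checking that record, assuming the third-party dnspython package and a placeholder domain rather than the actual account details.

```python
# Minimal sketch: look up the _atproto TXT record Bluesky uses for handle
# verification. Assumes the third-party dnspython package (pip install dnspython);
# "example.com" and the did=... value are placeholders, not real account details.
import dns.resolver


def atproto_txt_records(handle_domain: str) -> list[str]:
    """Return the TXT values published at _atproto.<handle_domain>."""
    answer = dns.resolver.resolve(f"_atproto.{handle_domain}", "TXT")
    # TXT rdata may be split into multiple character strings; join them.
    return [b"".join(rdata.strings).decode() for rdata in answer]


if __name__ == "__main__":
    # Handle verification expects a record of the form "did=did:plc:...".
    for value in atproto_txt_records("example.com"):
        print(value)
```

If the record can't be published (for example because TXT management sits behind a paid DNS tier), pointing the domain's nameservers at a host that does expose TXT records is the usual workaround, which is what the post above describes.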
Reposted by Dr J. Rosenbaum
art
November 10, 2025 at 2:45 PM
I work with AI, I teach AI. When the bubble bursts I will find something else, I will pivot. It's ok. I would rather the bubble just burst thanks.
Maybe I can get back to using proper machine learning and science again.
me when the AI bubble bursts
November 10, 2025 at 11:47 PM
Reposted by Dr J. Rosenbaum
me when the AI bubble bursts
November 10, 2025 at 6:04 PM
Reposted by Dr J. Rosenbaum
Ballet [1/4]
November 10, 2025 at 10:20 PM
I use it a lot as shorthand. I started calling myself an AI artist ~10 years ago because no one understood what a machine learning artist was. Now I usually say machine learning for science stuff, AI for general use, and generative AI when I'm specifically talking about transformer-architecture harms.
When you are using the "AI" framing, does the way you talk about "AI" carry water for the Sam Altmans of the world? That is, are you helping to paint a picture of their tech as inevitable, all powerful, and/or anything other than commercial products?

>>
November 10, 2025 at 11:41 PM
Reposted by Dr J. Rosenbaum
When you are using the "AI" framing, does the way you talk about "AI" carry water for the Sam Altmans of the world? That is, are you helping to paint a picture of their tech as inevitable, all powerful, and/or anything other than commercial products?

>>
November 10, 2025 at 8:35 PM
Reposted by Dr J. Rosenbaum
Love the phrases "Cognitive Cost" and "Executive Function Theft" to pinpoint how exhausting it is to be constantly bombarded with apps telling you to use AI. I've been calling it "Corporate pressure" and "Force feeding". It's quite a lot right now. Seems a bit desperate tbh.
The #ExecutiveFunctionTheft of having to opt out.
Feeling annoyed at the cognitive cost of having to dismiss all the offers of "AI" assistance every time I use Acrobat to provide feedback on documents my students wrote. (NO I do NOT want an "AI" summary of this. In what world???)

Decided to check settings:
November 10, 2025 at 6:53 PM
This was always going to happen. AI should NEVER be responsible for anything involving high-risk situations, ESPECIALLY INCLUDING healthcare. Chronic pain kills if mishandled.
a few months ago I said "having generative AI handle absolutely anything with regards to healthcare is a nightmare and should be banned" and a bunch of people made fun of me and called me stupid. anyways,
November 10, 2025 at 11:30 PM
Reposted by Dr J. Rosenbaum
we should not be letting any company get away with blaming AI for their failures. I'm not even sure how we got to this point. people are like "it's just a tool": okay, so is a forklift, and we still hold somebody responsible when it goes wrong and hurts someone. this is YOUR AI!
November 10, 2025 at 4:05 PM
Reposted by Dr J. Rosenbaum
An awful, awful example of what happens when you use AI scribe systems in healthcare without any meaningful form of redress for their erroneous output.

Setting a "compliance" policy that doesn't allow clinical staff to change AI-generated fields without intervention is definitely a money decision.
a few months ago I said "having generative AI handle absolutely anything with regards to healthcare is a nightmare and should be banned" and a bunch of people made fun of me and called me stupid. anyways,
November 10, 2025 at 7:41 PM
Reposted by Dr J. Rosenbaum
AI systems can unintentionally worsen mental health issues — including eating disorders. CDT’s new report, From Symptoms to Systems, maps out 6 key ways generative AI may contribute to eating disorder-related harms and offers guidance for developers, clinicians & caregivers. cdt.org/insights/fro...
November 10, 2025 at 8:55 PM
Reposted by Dr J. Rosenbaum
Humans: capable of building entire skyscrapers but still afraid to put the scary number on one of the floors
November 10, 2025 at 2:35 AM
Reposted by Dr J. Rosenbaum
What the fuck was the point of shutting down the government then??
November 10, 2025 at 11:32 AM
Reposted by Dr J. Rosenbaum
Whenever some sanctimonious Dem announces that America needs a functioning Republican Party, I always think “I’d settle for a functioning Democratic Party.”
November 10, 2025 at 9:36 PM
Reposted by Dr J. Rosenbaum
Breaking: competition predicated on measuring physical advantages… comes out against physical advantages*??

*advantages may or may not exist. terms and conditions apply.
The International Olympic Committee (IOC) is moving towards a complete ban on transgender women in female events following a review of the evidence on the long-term physical advantages of being born male.

🔗 theathletic.com/6795023/?sou...
November 10, 2025 at 8:40 PM
Reposted by Dr J. Rosenbaum
November 9, 2025 at 2:25 PM
Reposted by Dr J. Rosenbaum
Two articles that articulate the dehumanising reality and the brilliantly human possibilities of AI (and other tech), and the importance of where we direct our attention and energy.

Guardian piece on medical AI below in 🧵 and a great interview with the relational tech project in the US.
“The problem is that when it is installed in a health sector that prizes efficiency, surveillance and profit extraction, AI becomes not a tool for care and community but simply another instrument for commodifying human life.”
What we lose when we surrender care to algorithms | Eric Reinhart
A dangerous faith in AI is sweeping American healthcare – with consequences for the basis of society itself
www.theguardian.com
November 9, 2025 at 6:03 PM
Reposted by Dr J. Rosenbaum
Well this by @eric-reinhart.com is excellent.

I was struck by the way he describes patients coming for consultations having rehearsed and refined their stories using ChatGPT.

And how he describes the meaning in silences that don't make it into transcriptions.

www.theguardian.com/us-news/ng-i...
What we lose when we surrender care to algorithms | Eric Reinhart
A dangerous faith in AI is sweeping American healthcare – with consequences for the basis of society itself
www.theguardian.com
November 9, 2025 at 6:52 PM
Reposted by Dr J. Rosenbaum
Bleakly hilarious state of affairs that "the heaviest AI users are thought leadership writers (84%)."
November 9, 2025 at 6:43 PM
Reposted by Dr J. Rosenbaum
Marc Andreessen, 🥚, picked a fight with *the Pope* over a pretty basic call to exercise discernment when building AI.

Marc truly believes that any constraint on anything he does is fundamentally illegitimate.
November 9, 2025 at 10:35 PM
Reposted by Dr J. Rosenbaum
You didn’t have to hear anybody’s opinion about literally anything unless they were within slapping distance
November 9, 2025 at 3:36 PM
Reposted by Dr J. Rosenbaum
Incredibly bittersweet to be published in Teen Vogue this weekend after devastating layoffs, including my editor for this piece. It’s about how beauty influencers have fallen into and become part of the alt-right pipeline targeting girls and women
www.teenvogue.com/story/womano...
The 'Womanosphere' Is Coming for Teen Girls
How beauty and wellness influencers are part of a misinformation ecosystem pushing traditional values on girls.
www.teenvogue.com
November 9, 2025 at 9:56 PM
Reposted by Dr J. Rosenbaum
Don’t Cave for a Promise.
November 9, 2025 at 11:38 PM
Reposted by Dr J. Rosenbaum
I keep warning that so many of our systems are still built around the assumption that quality writing and analysis are costly and therefore meaningful signals.

Our systems are very much not ready for the revelation that this is no longer true, as this planning-objection AI shows.
November 9, 2025 at 11:39 PM