Jed Brown
jedbrown.org
@jedbrown.org
Prof developing fast algorithms, reliable software, and healthy communities for computational science. Opinions my own. https://hachyderm.io/@jedbrown

https://PhyPID.org | aspiring killjoy | against epistemicide | he/they
This is a very favorable study design (the doctors wrote clean scenarios and recruited participants to play along). Real patients will mention irrelevant/distractor/somatic symptoms, and may use language related to conspiracy theories. Chatbots will tend to reinforce those.
doi.org/10.1038/s415...
February 10, 2026 at 1:32 AM
Products that lull regulators into complacency by assuming that tests valid for assessing the correctness of conventional devices and people are also valid for products built by fitting data without a mechanism. A system that seems trustworthy, then mistakes the carotid artery during surgery.
“Researchers from Johns Hopkins, Georgetown and Yale universities recently found that 60 FDA-authorized medical devices using AI were linked to 182 product recalls, according to a research letter published in the JAMA Health Forum in August.”
As AI enters the operating room, reports arise of botched surgeries and misidentified body parts
Medical device makers have been rushing to add AI to their products. While proponents say the new technology will revolutionize medicine, regulators are receiving a rising number of claims of patient ...
www.reuters.com
February 9, 2026 at 6:51 PM
Thread is absolutely correct. The "fooled you" dollars are not good and will only get worse as the MLM scam spreads. It won't displace established authors, but it is a denial-of-service attack on new authors finding an audience: their new gem is lost in a sea of slop as readers stick to known authors.
I'm going to win.
February 8, 2026 at 5:41 PM
Reposted by Jed Brown
The main conceit of "AI" (namely chatbots and "agents") is the twin promises of control and productivity.

The mechanism of control is that you can be replaced more effectively by a machine, or someone who wields the machine more adroitly than you.
The bosses are forcing their employees to use it. Politicians are competing to shovel money & infrastructure at it. The billionaires who control media are constantly lecturing & hectoring people to accept it & use it more.

They don't care it's losing money. They don't care people hate it.
February 8, 2026 at 12:15 AM
Reposted by Jed Brown
I do want to shout-out my fellow sociologists, who have collectively created a discipline so woke that not a single one of our introductory textbooks can make it past Florida's censors.

Great work everyone.
February 6, 2026 at 3:00 PM
Reposted by Jed Brown
No joke: I got angry hate mail today for writing an obituary of a Black woman scientist—because the person felt she didn't deserve the recognition.

Which just makes me want to share it again: www.nature.com/articles/d41...
Gladys Mae West obituary: mathematician who pioneered GPS technology
She made key contributions to US cold-war science despite facing huge barriers as a Black woman.
www.nature.com
February 6, 2026 at 9:09 AM
Internal search launched in Dec and closed in Jan with no shared governance or faculty representation on the committee, despite the fact that the current chancellor's term doesn't end until Jun 2027. This isn't a good-faith search, but a shady ploy to shut the university community out of the process.
CSU faculty raise alarm over how the university is finding its next chancellor
The representative body for academic faculty and a local chapter of the American Association of University Professors have both raised objections to Colorado State University's search process for Tony...
www.kunc.org
February 6, 2026 at 7:08 AM
Caught with fingerprints all over the Rule 11 Violation Machine and the Rule 11 Violation Obfuscator. 14 of 60 cited cases do not exist and other cases were misquoted, but this man thought he could talk his way out of sanctions in 97 pages of testimony under oath. Get his ass, Judge Failla.
I’m getting nauseous just reading this
February 6, 2026 at 4:58 AM
Reposted by Jed Brown
The Trump administration is sending a loud and clear message that they will not stop schools from discriminating.

This is a necessary step on the path to resegregating schools. Which has been their openly stated goal all along.
Between March and December of 2025, the department received more than 9,000 civil rights complaints and dismissed 90% of the roughly 7,000 complaints it resolved.
UPDATE: $38 million spent on laid-off civil rights staff, while complaints were dismissed
The latest education news updates from EdSource.
edsource.org
February 4, 2026 at 8:04 PM
The asymmetry of bullshit: You can make an anti-vax conspiracy bot (with incidental factual statements), but you can't make an accurate-science bot using LLM methods (they don't represent facts and will reinforce false claims even if all training data were strictly factual in narrative voice).
Using tech to manufacture BS anti-vax claims
February 5, 2026 at 4:28 AM
Reposted by Jed Brown
The Wayback is now so load-bearing we should be protecting it with our actual lives
They closed the CIA World Factbook and deleted it entirely.
February 5, 2026 at 12:07 AM
Also very true of universities: from the HUAC and McCarthy era to today, ordinary faculty, staff, and students are far more committed to the university charter than are upper administrators and regents.
one thing the trump era has made clear, i think, is that the american people themselves are far more committed to the values of our founding documents than our elites
February 5, 2026 at 12:26 AM
Reposted by Jed Brown
1. I am tracking graphics and figures in academic journals made with gen AI here: bit.ly/academic-slop

2. Keeping out gen AI of the sort described here should be a completely solvable problem with run-of-the-mill peer review and editorial oversight.

3. Change research assessment incentives.
February 4, 2026 at 9:42 PM
Reposted by Jed Brown
@biblioracle.bsky.social has a great interview with @mattseybold.bsky.social on techno-feudalism, mass surveillance, Ed Tech, and what this means for academic freedom, and higher ed more generally. A chilling read. @aaup.org

academicfreedomontheline.substack.com/p/the-techno...
The Technology That's Taking Your Freedom
It's more than AI. A Q&A with Matt Seybold.
academicfreedomontheline.substack.com
February 3, 2026 at 6:49 PM
Reposted by Jed Brown
Thank you, Victor. I spent a few weeks on this. As usual, it’s always a negotiation with length. So a few other bon mots:

1. The admin has issued DOJ guidance to challenge state laws that would regulate AI

www.wired.com/story/ai-sup...
February 3, 2026 at 1:58 PM
This is overstating the decision. Alsup only addressed training, not uses of the trained model. Using it for search is fair use, but it remains to be adjudicated whether distributing the model weights or output is fair use when those include near-verbatim reproduction of books.
February 2, 2026 at 3:18 PM
Reposted by Jed Brown
EdTech, it doesn't work... How long until we can just accept the science on this? www.felienne.nl/2026-05/#edt...
AI in week 5
By popular request... a shorter newsletter! (Click here for English!) I'm really going to try to limit myself to two topics from now on, so hopefully I'll have time left over for my book. And this wee...
www.felienne.nl
February 1, 2026 at 11:07 AM
Reposted by Jed Brown
As these teens describe, AI can diminish human relationships; devalue art; threaten the environment; lead to laziness; give unreliable results; pose privacy concerns; and be misused.

So, please, stop with the narratives of inevitability and let's embrace a pedagogy and politics of refusal.
7 Reasons Teens Say No to AI
Some young people only turn to artificial-intelligence chatbots as a last resort, citing concerns about relationships, creativity, the environment and more.
www.wsj.com
February 1, 2026 at 10:24 PM
University administrators across the country are holding this "compact", drafted by one Epstein associate to give another Epstein associate leverage over their institutions, hoping that silence will protect them from public scrutiny. "No" is a complete sentence and faculty were correct to demand it.
This is the U Penn donor who helped push the President out, pushed to eliminate arts & science offerings, and drafted Trump's "compact" with universities.
People like Rowan repeatedly claimed the moral high ground to impose their agenda on students and faculty.
Top Apollo Global Management executives including chief Marc Rowan held wide-ranging discussions over the firm’s tax arrangements with Jeffrey Epstein throughout the 2010s www.ft.com/content/092d...
February 1, 2026 at 9:30 PM
The question should never be "is fraud profitable" or "is plagiarism good for my h-index". A call for dispassionate study fails to confront power structures that got us here, cedes the framing, and ignores the fact that decision-makers won't fund and won't listen to results of critical studies.
Why limit this to empirical effects rather than rejecting in principle? We reject things as inappropriate (or worse) when they present a conflict of interest, such as ed tech from fascist companies. No data affect such a judgment once made. Conflicts of interest call for common sense, not the creation of new databases.
Triggered by claims that the "AI in education debate" is caught in a deadlock and by some academics worried that "AI critics" are over-anxious about *possible* rather than empirically documented effects, here are what I think are some worthwhile critical projects about AI in education 🧵
February 1, 2026 at 7:24 AM
Reposted by Jed Brown
Faculty did not ask for this. The chancellor did this while claiming austerity and collecting a million dollars in salary and benefits. While TAMU is closing women's and gender studies programs out of pointed cruelty, the CSU is doing the same through willful starvation of resources and layoffs.
The CSU/OpenAI contract is set to expire June 30, 2026.

Sign this petition: https://actionnetwork.org/petitions/cancel-chatgpt-edu-invest-in-humans/ for the CSU NOT to renew the contract and to use the savings to protect jobs at CSU campuses facing layoffs.
February 1, 2026 at 12:25 AM
A lot of institutions overrepresented in the Epstein files should be thinking about whether this is the reputation they want. And if not, what they can do now to change culture and hire people of such character that they won't be in the Epstein files of 2050. MeToo barely scratched the surface.
for whatever reason you were being chummy in emails with a convicted pedophile and sex trafficker, it's not a "pile on" or people "judging" you when they are grossed out by it.
You made your choice, you knew who Epstein was, and still wanted to be in his orbit. How do you take it back now?
January 31, 2026 at 11:07 PM
Reposted by Jed Brown
beyond words to express how much i hate this... looks like "AI summary" not the author's written abstract is displayed by default for all ACM digital library publications. i, for one, didn't ask for it and hate it to my core
January 23, 2026 at 2:50 PM
Imagine if we relied on oil companies to publish evidence that CO2 emissions cause climate change. This statement against interest by Anthropic illustrates the epistemic vulnerability in which funding agencies and universities have uncritically accepted vibes-based claims.

arxiv.org/pdf/2601.20245
January 31, 2026 at 4:05 AM
Reposted by Jed Brown
“Most of the reports the charity receives involve wearable health technology like smartwatches and rings, but she said abusers are also using smart locks, heating technology, and even fertility trackers to manipulate and control victims.”
Abuse involving smartwatches and rings rising at ‘alarming’ rate, charity warns
Abusers are using smartwatches, rings, and home technology to ‘stalk, surveil and control survivors’, the charity said
www.independent.co.uk
January 30, 2026 at 2:36 PM