Anna Bacciarelli
@abacci.bsky.social
220 followers 230 following 40 posts
Human rights nerd trying to make AI OK. Human Rights Watch Senior Researcher + REAL ML Executive Director.
Google is doubling down on its investment in military AI, and Israeli military tech in particular. Wiz is founded and staffed by Unit 81 and Unit 8200 alumni; last week's reports about Google's Nimbus contract describe a company ingratiating itself to Israeli authorities at any cost.
Google takes a big step closer to acquiring Israeli cloud security company Wiz, after clearing US antitrust review. archive.ph/3cNQF
archive.ph
Reposted by Anna Bacciarelli
This weekend, governments will head to Hanoi to sign the deeply flawed UN Convention on Cybercrime. @hrw.org and partners issue new recommendations on how governments can prevent and mitigate the human rights risks posed by the treaty.

www.hrw.org/news/2025/10...
Joint Statement on the Signing of the UN Convention on Cybercrime
We, the undersigned organizations, remain deeply concerned that the UN Convention Against Cybercrime (UNCC) will facilitate human rights abuses across borders.
www.hrw.org
Reposted by Anna Bacciarelli
#UK's @phsombudsman.bsky.social finds:
1/ deep flaws in management of Home Office's #Windrush Compensation Scheme
2/ compensation offers should account for private pension losses. #WindrushScandal

@hrw.org will say it again: Windrush claimants need access to legal aid.

www.hrw.org/news/2025/09...
WTF? Thought it was a @ledbydonkeys.org job at first glance. In central London today.
Starmer going all the way by deploying a bot to head up DSIT 🤖
Reposted by Anna Bacciarelli
"ChatGPT's sycophantic design led it to validate his most dangerous thoughts. When he expressed suicidal ideation, instead of challenging these thoughts or redirecting the conversation, the system would affirm & even romanticize his feelings." centerforhumanetechnology.substack.com/p/the-raine-...
The Raine v OpenAI Case: Engineering Addiction by Design
The Deliberate Design Patterns That Made ChatGPT Dangerous
centerforhumanetechnology.substack.com
Police forces across England are following the Met's lead and increasing their use of live facial recognition. This tech is classed as creating 'unacceptable risk' and is effectively banned in the EU, but it is increasingly commonplace in the UK, despite serious human rights concerns. www.bbc.co.uk/news/article...
Government expands police use of live facial recognition vans
The Home Office says the technology helps locate suspects but civil liberties groups warn of heightened surveillance.
www.bbc.co.uk
Reposted by Anna Bacciarelli
The Uber regulatory arbitrage playbook all over again: "We are not a transportation company, so those laws don't apply. We are Tech. Disrupting your silly laws is our MO." I discuss this strategy--and ways to try and counter it--in "Disrupting the Disruption Narrative" www.nae.edu/19579/19582/...
Reposted by Anna Bacciarelli
Palantir ❤️ America's military-industrial complex www.washingtonpost.com/technology/2...
The UK government plans to use facial age estimation technology on child refugees at the UK border from 2026 - because they see it as the cheap option. But the tech wasn't designed or tested for this, and will create a host of serious risks for vulnerable young people. www.hrw.org/news/2025/07...
UK Plans AI Experiment on Children Seeking Asylum
The United Kingdom’s announcement on July 22 that it would use AI face-scanning technology to evaluate whether an asylum seeker is under age 18 threatens to harm children seeking refuge.
www.hrw.org
Reposted by Anna Bacciarelli
Israeli killings of Palestinians seeking food are war crimes. US-backed Israeli forces and private contractors have put in place a flawed, militarized aid distribution system that has turned aid distributions into regular bloodbaths. www.hrw.org/news/2025/08...
Gaza: Israeli Killings of Palestinians Seeking Food Are War Crimes
Israeli forces at the sites of a new US-backed aid distribution system in Gaza have routinely opened fire on starving Palestinian civilians in acts that amount to serious violations of international l...
www.hrw.org
Reposted by Anna Bacciarelli
THREAD: What does it mean to seek food in Gaza today? For many Palestinians, it means risking death. These are the stories behind the hundreds killed. #Gaza #HumanRights www.hrw.org/news/2025/08...
Reposted by Anna Bacciarelli
🚨 BREAKING: Israeli forces and US-backed contractors have turned Gaza aid distribution into a death trap. Hundreds of Palestinians have been killed while seeking food. Our new @hrw investigation exposes the system behind the slaughter. 1/6
🔗 www.hrw.org/news/2025/08...
Reposted by Anna Bacciarelli
The Online Safety Act assumes that all tech, social networks, etc, are run by big tech, raking in profit from illegal advertising. Community-run and funded networks/sites — real non-Big Tech alternatives — are left in the dirt. I have never seen OSA proponents consider their needs or their value.
Amid all the noise of the last week, this is a vital reminder from Chris Sherwood @nspcc.bsky.social as to why we have the #onlinesafetyact, why parents and campaigners backed it and why it had broad cross-party support in Parliament when it passed. ⬇️ www.politicshome.com/opinion/arti...
If anything, the Online Safety Act doesn't go far enough
It’s deeply concerning to see the rhetoric around the Online Safety Act shift toward loss of free expression, writes the NSPCC chief executive
www.politicshome.com
The UK gov plans to experiment with facial age estimation tech on vulnerable young people. But refugee children need care, not algorithms. This tech isn't the solution - instead, it creates new potentially life-changing risks, as @techchildrights.bsky.social and I write. www.hrw.org/news/2025/07...
UK Plans AI Experiment on Children Seeking Asylum
The United Kingdom’s announcement on July 22 that it would use AI face-scanning technology to evaluate whether an asylum seeker is under age 18 threatens to harm children seeking refuge.
www.hrw.org
Reposted by Anna Bacciarelli
Did you know that the UK gov't is planning an AI experiment on children seeking asylum? 🤯

Experimenting with unproven tech to decide whether to protect a child is cruel and unconscionable. Do better, UK.

Here's my latest with @abacci.bsky.social:

www.hrw.org/news/2025/07...
UK Plans AI Experiment on Children Seeking Asylum
The United Kingdom’s announcement on July 22 that it would use AI face-scanning technology to evaluate whether an asylum seeker is under age 18 threatens to harm children seeking refuge.
www.hrw.org
Live facial recognition tech is incompatible with human rights. Yet the Met police will double its use while installing London’s first permanent LFR cameras. This is powerful tech that rights groups & the UN have repeatedly said should not be used in public spaces www.theguardian.com/technology/2...
Met police to more than double use of live facial recognition
Technology will now be used up to 10 times a week across five days, up from four times a week across two days
www.theguardian.com