The police are creating biased and discriminatory intelligence profiles based on individuals' pasts, assuming mistakes will be repeated and ignoring the possibility of change. #WeCopWatch #StopAutomatedRacism
www.amnesty.org.uk/predictive-p...
Stop Automated Racism
Amnesty International UK found out that 3/4 of police forces across the UK are using technology to try to “predict crime”, but it is having racist and discriminatory impacts. Almost no one knows about it. Learn more:
www.amnesty.org.uk
October 23, 2025 at 2:30 PM
Predictive policing systems exacerbate racism and discrimination against people from lower socio-economic groups.
ORG supports Amnesty in calling for predictive policing systems to be BANNED.
Sign the petition to #StopAutomatedRacism TODAY ⬇️
www.amnesty.org.uk/actions/ban-...
Stop Automated Discrimination
We’ve found out that 3/4 of police forces across the UK are using technology to try to “predict crime” - and almost no one knows about it. But we’re trying to stop it. Sign the petition:
www.amnesty.org.uk
February 20, 2025 at 9:37 AM
In 2021, four residents sued the county, resulting in a settlement in which the sheriff's office acknowledged it had violated their rights. It's the same story with policing here: the Offender Management App violates human rights. #StopAutomatedRacism
theconversation.com/predictive-p...
Predictive policing AI is on the rise − making it accountable to the public could curb its harmful effects
AI that anticipates where crimes are likely to occur and who might commit them has a troubling track record. Democratic accountability could shine a light on the technology and how it’s used.
theconversation.com
September 19, 2025 at 9:42 AM
www.openrightsgroup.org/campaign/res...
Data and content are being misused to unjustly criminalise individuals, driven by facial recognition, AI, and surveillance technologies. #stopautomatedracism #WeCopWatch
End Pre-Crime
Data and content are being weaponised to criminalise people without cause, fuelled by facial recognition technology, AI and surveillance.
www.openrightsgroup.org
September 10, 2025 at 2:01 PM
www.amnesty.org.uk/blogs/human-...
In today's world, advanced technologies and algorithms are enhancing the predictive capabilities of policing while entrenching and amplifying institutional racism and discrimination. #StopAutomatedRacism
Modern policing has always been predictive.
We are Amnesty International UK. We are ordinary people from across the world standing up for humanity and human rights.
www.amnesty.org.uk
October 10, 2025 at 4:00 PM
copwatchersorg.wordpress.com/2024/04/23/p...
We’re revisiting the Offender Management App and paths to community justice this Saturday. The police have "risk-scored" over 364,000 of us. First steps toward accountability and justice are in the article below. #StopAutomatedRacism #WeCopWatch
Predictive Policing and Subject Access Requests
By John Pegram, case worker, public speaker, founder If you’ve been keeping tabs on all things police in Bristol and across Avon and Somerset you’d be right in thinking there’s a …
copwatchersorg.wordpress.com
September 11, 2025 at 10:21 AM
justice-equity-technology.org/predictive-p...
This article is worth your time if you want to learn how to mobilise your community against predictive policing. Stay tuned for updates on our blog this weekend! #WeCopWatch #EndThinkSurveillance #StopAutomatedRacism
Predictive Policing and Community Organising – Justice, Equity & Technology
justice-equity-technology.org
November 8, 2025 at 4:03 PM
Look out this weekend for our latest blog post, "Road mapping paths to accountability", a follow-up to "Predictive policing and subject access requests". #WeCopWatch #stopautomatedracism
copwatchersorg.wordpress.com/2024/04/23/p...
Predictive Policing and Subject Access Requests
By John Pegram, case worker, public speaker, founder If you’ve been keeping tabs on all things police in Bristol and across Avon and Somerset you’d be right in thinking there’s a …
copwatchersorg.wordpress.com
November 8, 2025 at 5:17 PM
www.openrightsgroup.org/blog/why-pre...
The UK Government is attempting to use algorithms to forecast which individuals may become violent offenders, utilising the sensitive personal data of hundreds of thousands of people. #automatedinjustice #stopautomatedracism
Why ‘Predictive’ Policing Must be Banned
The UK Government is trying to use algorithms to predict which people are most likely to become killers using sensitive personal data of hundreds of thousands of people.
www.openrightsgroup.org
October 14, 2025 at 2:01 PM
publiclawproject.org.uk/latest/publi...
"A new project from Public Law Project, funded by the Nuffield Foundation, will investigate how transparency mechanisms can be adapted to facilitate fair, lawful, and non-discriminatory automated decision-making in the public sector." #stopautomatedracism
"A new project from Public Law Project, funded by the Nuffield Foundation, will investigate how transparency mechanisms can be adapted to facilitate fair, lawful, and non-discriminatory automated decision-making in the public sector." #stopautomatedracism
Public law litigation in the automated state - Public Law Project
How can we use transparency mechanisms to make sure the public sector's automated decision-making is fair and lawful?
publiclawproject.org.uk
September 21, 2025 at 2:24 PM
www.theguardian.com/uk-news/2025...
We join Amnesty International in recognising that algorithms and data are fuelling discrimination in UK policing. #stopautomatedracism
UK use of predictive policing is racist and should be banned, says Amnesty
Exclusive: rights group says use of algorithms and data reinforces discrimination in UK policing
www.theguardian.com
September 10, 2025 at 4:20 PM
chuffed.org/project/1285...
Police have no place in schools, and neither do predictive surveillance systems. Support our film to shine a light on Think Family Education and the harms of predictive policing. #StopAutomatedRacism
Surveillance is not safeguarding
Think Family Education (TFE) is an app that has been in use in schools for several years; in fact, The Bristol Cable first reported on TFE in 2021. The app is connected to a council and police database.
chuffed.org
September 2, 2025 at 11:00 AM
copwatchersorg.wordpress.com/2025/11/08/r...
If you're considering action against predictive policing or believe you've been risk-scored by the Offender Management App, check out our Sunday long read, "Road mapping paths to accountability." You'll never walk alone. #StopAutomatedRacism
Road mapping paths to accountability
By John Pegram, founder, caseworker, public speaker Afternoon, fellow copwatchers! As the end of 2025 approaches at a rate of knots, it seems only right to update you all on current developments wi…
copwatchersorg.wordpress.com
November 9, 2025 at 12:12 PM
"There is the impact such technology has on our lives, and the harms of policing we must consider. Perpetual criminalisation is something I went through in my youth.." #stopautomatedracism
copwatchersorg.wordpress.com/2025/11/08/r...
Road mapping paths to accountability
By John Pegram, founder, caseworker, public speaker Afternoon, fellow copwatchers! As the end of 2025 approaches at a rate of knots, it seems only right to update you all on current developments wi…
copwatchersorg.wordpress.com
November 11, 2025 at 10:46 AM
"There is the impact such technology has on our lives, and the harms of policing we must consider. Perpetual criminalisation is something I went through in my youth.." #stopautomatedracism