Amnesty International's report "Automated Racism" finds that UK police forces deploy predictive policing systems that disproportionately target poor and racialized communities. These systems rely on historical policing data that is itself biased, so they direct officers toward the same groups before any crime has been committed. The result is a self-reinforcing feedback loop: intensified policing generates more records about these communities, which the algorithms then read as evidence of higher risk, entrenching the original bias. Amnesty emphasizes that these practices violate both domestic and international human rights obligations.
Predictive policing tools severely undermine the presumption of innocence by targeting individuals on the basis of historical data rather than evidence of actual criminal conduct.
Reliance on biased data thus locks in a self-reinforcing cycle of discrimination, driving ever more over-policing of communities that are already marginalized.
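A minimal toy simulation, not drawn from the Amnesty report, can make the feedback-loop mechanism concrete. In the sketch below, two areas have identical true offence rates, but patrols are allocated in proportion to previously recorded incidents, and recording intensity grows with patrol presence; the starting disparity in records and the superlinear recording rate are illustrative assumptions, not findings from the report.

```python
import random

random.seed(0)

RESIDENTS = 1_000        # residents per area
TRUE_RATE = 0.05         # identical real offence probability in both areas
TOTAL_PATROLS = 100      # patrol units split between the areas each cycle

# Hypothetical starting records: area A carries more historical entries
# (e.g. legacy over-policing), even though true offence rates are equal.
records = {"A": 60.0, "B": 40.0}

for year in range(1, 11):
    total = sum(records.values())
    new = {}
    for area, past in records.items():
        share = past / total
        patrols = TOTAL_PATROLS * share            # patrols follow past records
        # Stylized assumption: recording scales superlinearly with patrol
        # density, since concentrated patrols also generate proactive stops.
        detection = min(1.0, (patrols / TOTAL_PATROLS) ** 2 * 2)
        offences = sum(random.random() < TRUE_RATE for _ in range(RESIDENTS))
        new[area] = past + offences * detection    # only detected offences enter the data
    records = new
    share_a = records["A"] / sum(records.values())
    print(f"year {year:2d}: area A holds {share_a:.0%} of all recorded incidents")
```

Running this, area A's share of recorded incidents climbs year over year even though both areas offend at exactly the same rate: the dataset amplifies the initial policing disparity rather than correcting it, which is the dynamic the report describes.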