A number of Freedom of Information (FOI) requests made by The Canary revealed which police forces in the UK use crime prediction software or predictive policing models. Our coverage of predictive policing, also known as pre-crime, is part of our #ResistBigBrother series. Data Justice Lab researcher Fieke Jansen specialises in data-driven police operations. Jansen told The Canary that predictive policing has several components:
the first is location-based policing. It examines where crime is most likely to occur in the near future. And the other is predictive identification, which tries to determine who is potentially most likely to be involved in a certain criminal activity in the near future.
This type of policing strategy is known to be discriminatory and to set a disturbing trend for data harvesting. Civil liberties organisation Liberty explains:
Predictive programs are not neutral. They are trained by people and rely on existing police data. They therefore reflect patterns of discrimination and embed them further into police practice.
Andrew Ferguson, author of The Rise of Big Data Policing, points out that police could hold information on people who have not committed an offence, and that use of this data could encourage police violence, among other problems:
the growing network of surveillance threatens to chill freedoms of association, political expression and expectations of privacy by eroding public anonymity… even with the best use policies in place, officers have access to vast amounts of personal information about people not suspected of any crime… with carelessly chosen data inputs, long-standing racial, societal and other biases will be reified in the data.
Predictive policing is relatively sophisticated in the United States. The Los Angeles Police Department has recently been criticized for its use of predictive policing:
recently revealed public documents detail how PredPol and Operation Laser, the department’s flagship data-driven programs, validated existing policing models and reinforced decisions to patrol certain people and neighborhoods over others, leading to excessive surveillance of black and brown communities in the metropolis.
What is the situation in the UK? Our investigation unit sent freedom of information requests to police forces asking if they were using data-driven techniques, what type of software they were using and how much it cost.
West Yorkshire Police
West Yorkshire Police told us they were receiving support from the College of Policing to use a ‘cutting edge technology’ called Patrol-Wise, funded by the Home Office and developed by University College London. They explained to us that crime data is analyzed in Patrol-Wise and communicated to officers on handheld devices.
Humberside Police
Humberside Police told us that:
As part of a predictive policing program, Humberside Police are currently testing a predictive algorithm.
There is currently no other information available.
West Midlands Police
West Midlands Police told us:
WMP has developed (and is in the process of developing) predictive models – in particular around most serious violence (place and volume); knife crime used to cause injury (location and volume); and to estimate the likelihood of individuals moving from low/medium harm levels to high harm.
Knife crime policing is notorious for racially targeting Black and brown people. The incorporation of data algorithms into an already racist structure is a prime example of how data-driven policing strategies entrench racism even further in the forces.
Avon and Somerset Police
Avon and Somerset Police told us they use:
– Data visualization / dashboards
– Production reporting / ad hoc interrogation
– Predictive analysis / risk models / ETL (extract, transform, load)
– Social media analysis
Hampshire Police
Hampshire Police told us they are using Demar Forecasting, owned by Process Evolution.
Police forces in Scotland, Cumbria, Lancashire, Cheshire, Kent, North Wales, Gloucestershire, South Yorkshire, Staffordshire, Warwickshire, Derbyshire, Bedfordshire, West Mercia, Surrey, Northumbria, Hertfordshire, Sussex, Wiltshire, Lincolnshire, South Wales, Thames Valley, and the Metropolitan Police all returned responses indicating either that they held no information about predictive policing or that they were not using it. A number of police forces had not responded to the FOI request at the time of publication – despite public authorities being required to respond to FOI requests within 20 working days.
In a 2019 report from Liberty, a number of the police forces above are listed as using predictive policing. The Canary contacted those forces to ask why their responses to our FOI requests claimed they do not use predictive policing when Liberty's report listed them as doing so. Police in Kent, Warwickshire and Cheshire all explained that they had used predictive policing in the past but no longer do.
Kent Police said:
The article you mention says we used predictive policing in 2013, but in 2018 we decided not to renew the contract; this information is correct.
Warwickshire Police said:
Historically, Warwickshire Police have been briefly involved in a project as part of their strategic alliance with West Mercia. However, we no longer have an active role as an autonomous force.
Cheshire Police said:
Cheshire Constabulary participated in a short trial in 2015 and it was decided not to use the software.
This tallies with what data expert Jansen told us:
Police cuts in the UK have resulted in many predictive policing models frequently being shut down and started up again.
Level of sophistication
It is clear that predictive policing in the UK may not be as widespread or sophisticated as it is in the US. There is also a pattern of forces adopting predictive policing models for the short term and then abandoning them due to funding. This stop-start landscape means that forces depend on government funding and on development by university departments. This in turn means that shifts in policy can change the threat to civil liberties.
As Kevin Blowe of the Network for Police Monitoring (Netpol) points out:
predictive technology is better able to identify patterns than individuals and is only likely to work if it involves collecting large amounts of data over a long period of time. Since people from poorer communities are more likely to have data held about them by public services, this likely means that they are more likely to be classified as a risk.
Predictive policing by definition requires police to hold data on individuals; however, the abuse of the technology stems not from the technology itself but from racist structures within the police. As Blowe explains:
The problem is, these are just tools – and biases arise from choices made about which crimes to focus on and where. In essence, if the police's perception of a problem (and therefore the data they seek) relates to "gang crime", then they are likely to reproduce pre-existing racial biases on this issue.
Why is all this important?
The events of 2021 highlighted institutional sexism and racism within British policing: the murder of Sarah Everard by a serving police officer; police officers taking photos with the bodies of murder victims Nicole Smallman and Bibaa Henry; the crackdown on protest in the Police, Crime, Sentencing and Courts Bill; and the promise of further restrictions in amendments to the Official Secrets Act.
The restrictions come against the backdrop of growing privatization of the NHS, deep cuts to the welfare state, botched handling of the coronavirus (Covid-19) pandemic and anti-immigrant rhetoric. A shift towards far-right policies causes serious problems for all of our civil liberties.
The level of sophistication is not necessarily the issue when it comes to predictive policing. The problem is that an institutionally racist and corrupt police force is using data as another tool in its arsenal to quell dissent and to control and monitor citizens. There are already concerns that the police are relying on the NHS and mental health services for data. Crackdowns on civil liberties almost always smooth the path to even more regressive policies. Resisting police violence means resisting the use of data and technology that violates the right to privacy.
Featured image via Wikimedia Commons / Tony Webster