Predictive Policing: Is It Biased?

By Freya Graham

It’s an action movie trope that everyone’s familiar with — a brazen police officer or
undercover spy is entrusted with a host of gadgets and gizmos to help them fight crime.
Often, it borders on the ridiculous — anyone remember the exploding chewing gum in
Mission: Impossible? Sophisticated police technology is no longer the preserve of bombastic
summer blockbusters, though. Policing and technology are becoming ever more closely
entwined with the rise of predictive policing software.


The most common form of predictive policing is place-based. A wide range of data is
input into computer systems. The data is analysed, and the software produces
predictions of where and when crime is likely to take place. Usually, this is in the form of
an annotated map. Police forces can use this information to deploy officers to areas
where crime is predicted to take place (Lau, 2020). US company PredPol — recently
rebranded as ‘Geolitica’ — supplies police forces across the world with place-based
predictive policing technology. The PredPol website claims to ‘predict critical events and
gain actionable insight’.
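
To illustrate the place-based idea in the simplest possible terms, here is a rough sketch: count past recorded incidents per map grid cell and flag the busiest cells for patrol. This is an invented toy, not PredPol's actual algorithm (the company has described its model as treating new crime like aftershocks of earlier crime); the coordinates below are made up.

```python
# Toy sketch of place-based "hotspot" prediction: count past
# recorded incidents per grid cell, then flag the top cells.
# Coordinates and counts are invented for illustration.
from collections import Counter

# (x, y) grid cells of past reported incidents
incidents = [(1, 2), (1, 2), (1, 3), (4, 4), (1, 2), (4, 4), (0, 0)]

counts = Counter(incidents)
hotspots = [cell for cell, _ in counts.most_common(2)]

print(hotspots)  # -> [(1, 2), (4, 4)]: where patrols would be sent
```

Even this crude version shows where the trouble starts: the 'prediction' is really just a summary of where crime has been recorded before.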


Person-based predictive policing identifies individuals who are more likely to commit a
crime (Lau, 2020). The Harm Assessment Risk Tool (HART), for instance, is a machine-learning
algorithm that predicts how likely an individual is to commit a crime in the next two years,
based on variables such as criminal history and age (Jacobs, 2021).
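
As a purely hypothetical sketch of how person-based risk scoring works (HART has been reported to use a random forest, so that is the model type used here, but the features, data, and labelling rule below are all invented, not HART's real ones):

```python
# Hypothetical sketch of person-based risk scoring. The features,
# data, and labels are synthetic; this is not HART's actual model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic records: [age, prior_convictions, years_since_last_offence]
X = rng.integers(low=[18, 0, 0], high=[70, 20, 15], size=(1000, 3))
# Synthetic labels: 1 = "reoffended within two years" (invented rule)
y = ((X[:, 1] > 5) & (X[:, 2] < 3)).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Risk score for a new individual, according to the model
print(model.predict_proba([[25, 7, 1]])[0, 1])
```

The crucial point is that the model's 'truth' is whatever the training labels say, and those labels come from police records.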
Proponents of predictive policing technology say that it can help to prevent crime. It could
also help police forces with limited resources clearly identify how they can best help their
community. According to human rights group Liberty, at least 14 UK police forces have
used or intend to use both place-based and person-based predictive policing (Liberty,
2021).

The Feedback Loop
While the technology may be new, the concept of predictive policing isn’t. Its roots can be
traced back to the ‘broken windows’ theory, proposed by social scientists James Q. Wilson
and George L. Kelling in 1982 and later popularised by Malcolm Gladwell’s 2000 book The
Tipping Point. The theory linked physical and social disorder in a neighbourhood (with
broken windows metaphorically representing disorder) to higher crime rates (McKee, 2018). It
influenced police policy in the USA in the 1990s and 2000s. In New York, for instance, the
police clamped down on minor crimes such as public drinking and jaywalking in order to
create ‘harmonious’ communities and therefore, according to the broken windows theory,
reduce serious crime (McKee, 2018).


Many critics, however, have said that the Broken Windows theory, when implemented,
results in over-policing in minority communities (Childress, 2016). In New York, the theory
is closely linked with the controversial stop-and-frisk program, which disproportionately
targets African American and Hispanic American individuals in the city (Munn, 2018).
Areas which look ‘more criminal’ are policed more intensively. More policing means more
minor crime detection: the policy becomes self-fulfilling.

Predictive policing technology creates a similar feedback loop. The software is more likely
to direct police forces to areas where crimes have been previously committed. With a
larger police presence, more minor crimes are likely to be detected in the area. These newly detected crimes will then be processed as data in the predictive policing algorithm.
The algorithm will then, based on this new data, determine that the area is at higher risk
of future crime. As a result, police officers will continue to be directed to the area. The
loop continues.
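
A toy simulation (all numbers invented) shows how quickly the loop locks in. Two districts have identical true crime; district A merely starts with more recorded incidents, and each period the patrols, and therefore the extra detections, go wherever the records are highest:

```python
# Toy feedback-loop simulation. True crime is identical in both
# districts; only the historical records differ. Numbers invented.
recorded = {"A": 60, "B": 40}   # historical recorded incidents
BASE, PATROLLED = 10, 30        # detections per period without/with patrols

for period in range(5):
    hotspot = max(recorded, key=recorded.get)  # the algorithm's "prediction"
    for district in recorded:
        recorded[district] += PATROLLED if district == hotspot else BASE
    print(period, hotspot, recorded)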

Bad Data Prevails
Predictive policing software also relies on the assumption that data is objective and
factual. The data used by predictive policing software is gathered by the police, such as
the time and place of every reported crime over the course of a month. Reported crime is
different from committed crime, though: some communities are more likely than others to
report a crime, and some crimes are rarely reported at all.
Indeed, statisticians Kristian Lum and William Isaac wrote in a 2016 paper that ‘police
databases are not a complete census of all criminal offences, nor do they constitute a
representative random sample.’


They added that ‘empirical evidence suggests that police officers – either implicitly or
explicitly – consider race and ethnicity in their determination of which persons to detain
and search and which neighbourhoods to patrol.’ This reflects statements made by
human rights group Liberty, who said that predictive policing algorithms ‘are trained by
people and rely on existing police data, and so they reflect patterns of discrimination and
further embed them into police practice.’ Police bias is legitimised and amplified by
predictive policing technology.
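
Lum and Isaac's point can be made with back-of-envelope numbers (invented here): give two neighbourhoods identical offence counts but different chances that an offence ever reaches the police database, and the database alone tells a very different story.

```python
# Invented numbers: identical true offence counts, but different
# chances that an offence is ever reported and recorded.
true_offences = {"Neighbourhood 1": 500, "Neighbourhood 2": 500}
capture_rate = {"Neighbourhood 1": 0.6, "Neighbourhood 2": 0.2}

database = {n: int(true_offences[n] * capture_rate[n]) for n in true_offences}
print(database)  # -> {'Neighbourhood 1': 300, 'Neighbourhood 2': 100}
```

Any algorithm trained on such a database will 'learn' that the first neighbourhood has three times the crime, even though the underlying rates are equal.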


The Academic Response
Criticism of predictive policing is echoed throughout the academic community. In July
2020, a group of leading mathematicians wrote an open letter urging academics to cut
ties with predictive policing technology (Linder, 2020). The letter said that ‘given the
structural racism and brutality in US policing, we do not believe that mathematicians
should be collaborating with police departments in this manner. It is simply too easy to
create a “scientific” veneer for racism.’ Over 1500 mathematicians signed the petition
accompanying the open letter.


Speaking to Popular Mechanics in 2020, mathematician Tarik Aougab said that ‘the
problem with predictive policing is that it’s not merely individual officer bias.’ He believes
that the algorithms intensify existing bias, adding that ‘there’s a huge structural bias at
play, which amongst other things might count minor shoplifting, or the use of a counterfeit
bill, which is what eventually precipitated the murder of George Floyd, as a crime to which
police should respond to in the first place’ (Linder, 2020).


A June 2020 blog post on the PredPol website directly addresses George Floyd’s murder.
The marketing team wrote that ‘the best place to start is with dialogue and with data. We
need to have objective discussions of the problems of racial inequality in America.’ They
added that ‘We believe that the starting point is data: objective, agreed-upon facts that
can be used to guide the discussion’ (PredPol, 2020).


This statement doesn’t acknowledge that the data itself is a huge part of the problem,
though. According to the mathematicians who penned the open letter, the data used in
predictive policing is neither ‘objective’ nor ‘agreed-upon’. Ultimately, predictive policing
is an act of ‘tech-washing’: using buzzwords like ‘data’ and ‘algorithms’ to veil biased and
discriminatory practices.

Bibliography

Hao, 2020, ’A US government study confirms most face recognition systems are racist’, MIT Technology Review. Available at: https://www.technologyreview.com/2019/12/20/79/ai-face-recognition-racist-us-government-nist-study/

Haskins, 2019, ‘Academics Confirm Major Predictive Policing Algorithm is Fundamentally Flawed’, Vice. Available at: https://www.vice.com/en/article/xwbag4/academics-confirm-major-predictive-policing-algorithm-is-fundamentally-flawed

Haskins, 2019b, ‘Dozens of Cities Have Secretly Experimented With Predictive Policing Software’, Vice. Available at: https://www.vice.com/en/article/d3m7jq/dozens-of-cities-have-secretly-experimented-with-predictive-policing-software

Jacobs, 2021, ‘The radical idea to reduce crime by policing less, not more’, Wired. Available at: https://www.wired.co.uk/article/evidence-based-policing

Lau, 2020, ‘Predictive Policing Explained’, Brennan Center for Justice. Available at: https://www.brennancenter.org/our-work/research-reports/predictive-policing-explained

Liberty Human Rights, 2021, ’Predictive Policing’. Available at: https://www.libertyhumanrights.org.uk/fundamental/predictive-policing/

Linder, 2020, ‘Why Hundreds of Mathematicians Are Boycotting Predictive Policing’, Popular Mechanics. Available at: https://www.popularmechanics.com/science/math/a32957375/mathematicians-boycott-predictive-policing/

McKee, 2018, ‘Broken windows theory’, Encyclopedia Britannica. Available at: https://www.britannica.com/topic/broken-windows-theory

Munn, 2018, ‘This Predictive Policing Company Compares Its Software to “Broken Windows” Policing’, Vice. Available at: https://www.vice.com/en/article/d3k5pv/predpol-predictive-policing-broken-windows-theory-chicago-lucy-parsons

PredPol website, 2021. Available at: https://www.predpol.com/

PredPol, 2020, ‘Are We at a Tipping Point in Police-Community Relations?’. Available at: https://blog.predpol.com/are-we-at-a-tipping-point-in-police-community-relations

Published by Impala Global

Our goal is to ensure that the global health and human rights implications of technology are considered, in service of an inclusive future.
