Alex Caviness
Prof. Campbell
UWRT 1103
November 9, 2016
When Big Brother Comes Knocking: The Case Against Predictive Policing
Howard Marks came home one day to the sight of his wife sleeping with another man.
Blinded by anger, he grabbed a pair of scissors and was prepared to take his revenge against his spouse when the police burst into his home and said, "You [are] under arrest for the future murder of Sarah Marks, that was to take place today." How did the police know to show up? All because of a clairvoyant who envisioned this crime taking place before it ever happened (qtd. in Mayer-Schönberger and Cukier). Now, this story is not actually real; it is from the movie Minority Report. And while it is probably safe to say the police are unlikely to ever base arrests on clairvoyants, replace the psychics with big data and suddenly that anecdote becomes shockingly possible. In fact, it could be a reality much sooner than we may think. Faced with this imminent problem, we have to ask: are the benefits of predictive policing worth the threats to civil liberties?
To answer this question, we must first know what exactly predictive policing is, which in
turn requires a knowledge of the larger context of big data. Basically, big data is a term used to describe "the ability of society to harness information in novel ways to produce useful insights," as big data experts Viktor Mayer-Schönberger and Kenneth Cukier put it. In other words, big data is the use of algorithms to analyze huge amounts of information to find trends otherwise
impossible to see. Its applications are as far-reaching as the mind can imagine, from medicine to astronomy to economics, but most relevantly, big data is increasingly being used in the field of policing. Methods of predicting where crimes will occur have existed for some time now, many involving police officers' gut feelings, but only with big data have the predictions been of real significance (Perry et al.). To predict crimes, information on previous crimes in an area is collected and fed into software, which predicts where future crimes will occur; hence, "predictive policing." More specifically, according to the website of the predominant predictive policing software, PredPol, historical data is combined with information on day-to-day crimes as they come in, and the software predicts where crimes are most likely to
happen for the next few hours. In some ways, this is similar to the traditional methods of
predicting crime where police will more heavily patrol areas they know from experience will
have more crimes committed at some time. Part of the training for crime analysts, after all, is in
hot spot identification (Perry et al.). The big difference is that big data allows these predictions to be founded on a more scientific basis: data. As a result, hot spots can be identified faster, with greater accuracy, and within precise time frames. Would any cop be able to tell that on New Year's Eve in Times Square there will be an increase in crime? Of course. But could he tell which street would likely have the most crime between 5 and 6 o'clock next Tuesday? Not a chance.
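To make those mechanics concrete, here is a minimal sketch, in Python, of recency-weighted hot spot scoring. PredPol's actual algorithm is proprietary, so everything below (the grid cells, the decay rate, the incident data) is an invented illustration of the general idea, not the real software.

import math
from collections import defaultdict

# Toy model of recency-weighted hot spot scoring (illustrative only;
# PredPol's actual algorithm is proprietary). Each incident is recorded
# as (grid_x, grid_y, hours_ago); all data here is invented.
incidents = [
    (3, 7, 2), (3, 7, 5), (3, 6, 30),
    (8, 1, 1), (8, 1, 4), (8, 1, 70),
]

def hot_spot_scores(incidents, decay=0.05):
    """Score each map cell; recent crimes count more than old ones."""
    scores = defaultdict(float)
    for x, y, hours_ago in incidents:
        # Exponential decay: a crime from 70 hours ago contributes far
        # less to a cell's score than one from an hour ago.
        scores[(x, y)] += math.exp(-decay * hours_ago)
    return scores

scores = hot_spot_scores(incidents)
# Direct patrols to the highest-scoring cells for the next few hours.
print(sorted(scores, key=scores.get, reverse=True)[:2])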
On the face of it, predictive policing sounds like a powerful tool for making policing more efficient, and in some ways it is. For one, crime analysts have an important job gathering intelligence: analyzing long-term crime trends and finding solutions that will
address the root causes. This task is essential to actually improving the community and reducing crime. However, instead of finding long-term solutions to local crime problems, analysts spend much of their time predicting hot spots, or places where crimes are more likely to occur. With predictive policing software, police departments can determine hot spots immeasurably faster than people can and with far greater accuracy, which frees crime analysts to focus on their more lasting and impactful work. Furthermore, PredPol can
attest to drops in crime as high as 30% in various cities after its implementation. The theory is that
police departments can send their officers to hot spots when they have extra time, and their
presence as a deterrent will prevent crimes that would otherwise have happened. However, it is
worth noting here that correlation does not imply causation, so just because crime dropped after
PredPol was implemented does not mean that predictive policing caused crime to drop. While it
assuredly could have contributed to the drop in crime in these various places, there have been no
scientific, controlled experiments to see how much predictive policing actually contributes to
crime reduction.
One of the other hopes with predictive policing is that it will help reduce bias in policing
and help heal communities divided by racial profiling. The general idea, as Santa Cruz Police
Department crime analyst Zach Friend reports, is that by relying solely on data and taking out
some of the human factor of where to patrol, racial prejudice can be eliminated. Since computer
programs cannot be biased, the predictive policing software must not be biased, right? According
to Science writer Mara Hvistendahl, however, this is far from true. She fears that predictive
policing is a scientific curtain behind which racial prejudice will lie. At first it seems nigh on
impossible for algorithms to perpetuate racial bias, but it makes sense once one realizes that all of the crime statistics a predictive policing program uses to make its predictions come from a biased police force. Minorities are disproportionately more likely to be stopped on the street, be
stopped in their cars, be searched or have their car searched, have excessive force used on them,
and be fined for minor offenses. More stops means more offenses like possession of drugs are
caught, and more fines means more minor offenses are made official. As a result, crimes by minorities appear far more numerous in police departments' data, relative to crimes committed by whites, than they actually are ("Here's How Racial Bias Plays Out in Policing"). The software then reflects a bias towards areas where more crimes are recorded, which likely skews towards areas with many minorities. A Chicago Police Department (CPD) program
used a complex algorithm to provide officers with a list of likely offenders and victims, usually
people of color, and officers were basically told to keep an eye on those people. While crime was
not reduced as a result of this and supposedly at-risk victims were no more likely to be victims
than anyone else, people on the victim lists were more likely to be arrested than average citizens.
Crime was not reduced by this program because biased data led the algorithm to consider innocent people more suspicious than they actually were: the list was generated using information that was skewed to show more minorities as criminals and victims. This is compounded by officers, who carry their own prejudices, being told to watch those people, all of it justified by predictive policing. Thus, we are forced to conclude that predictive policing does not keep police officers unbiased so much as provide a convenient cover for systemic prejudices (Hvistendahl).
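A toy simulation makes this feedback loop concrete. All numbers below are hypothetical: two neighborhoods are given identical true crime rates, but one starts out more heavily patrolled, so more of its crimes enter the database.

import random

random.seed(0)

# Toy simulation of the feedback loop described above. Two neighborhoods
# have the SAME true crime rate, but neighborhood A starts out more
# heavily patrolled, so more of its crimes enter the database. All
# numbers are hypothetical.
TRUE_CRIMES_PER_DAY = 10                 # identical in both neighborhoods
patrol_share = {"A": 0.8, "B": 0.2}      # initial, biased patrol allocation
recorded = {"A": 0, "B": 0}

for day in range(100):
    for hood in ("A", "B"):
        for _ in range(TRUE_CRIMES_PER_DAY):
            # A crime only enters the data if officers are around to see it.
            if random.random() < patrol_share[hood]:
                recorded[hood] += 1
    # "Data-driven" step: reallocate patrols in proportion to recorded crime.
    total = recorded["A"] + recorded["B"]
    patrol_share = {h: recorded[h] / total for h in recorded}

# Despite equal true crime rates, A's recorded crime dwarfs B's, and the
# patrol allocation "confirms" the initial bias instead of correcting it.
print(recorded, patrol_share)

Because crimes only enter the data where officers are present to record them, the seemingly objective reallocation preserves the initial bias indefinitely.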
Another equally tricky issue with predictive policing arises when considering the concept
of reasonable suspicion. Reasonable suspicion, as NPR law enforcement correspondent Martin
Kaste reports, is the suspicion required to stop somebody on the street. In other words, if a cop
sees someone with furtive movements, he has grounds to stop the person, whereas he could not
stop someone just because he felt negatively about the other person. Why is this a problem? The
issue arises when a police officer is sitting in his squad car having driven to a hot spot marked by
the computer for a likelihood of car theft. As he sits there, he sees a man go up to a car and
take a moment before getting inside. It could be car theft. But, then again, it could also
just be a man who forgot which pocket he put his keys in, and odds are the officer is a little more
paranoid just from being told that he is somewhere where car theft is likely. So, does that
computer-drawn hot spot provide the reasonable suspicion necessary to stop him? Law professor
Andrew Ferguson says in the NPR article that police departments have told officers not to use
predictive policing as reasonable suspicion, but it is likely only a matter of time before hot spots
are used as justification. No laws currently dictate whether predictive
policing can be used in establishing reasonable suspicion, so it is worth considering the civil
rights we would be giving up by letting a computer determine if we are suspicious or not. Since
the CPD has already started a program that uses predictive policing methods to identify
individuals believed to be risky or at risk, it is not a far step to say police departments will try to
use computers to decide who is worthy of suspicion. Really, this situation is not very different
from someone getting stopped because a cop thinks the neighborhood is suspect. It is profiling
just done with the aid of a computer program, and it demonstrates an unjust disregard for civil liberties.
An even scarier thought towards the future goes back to the opening anecdote from
Minority Report. It may have sounded like science fiction at the time, but we could be quite close
to that being a reality. Viktor Mayer-Schönberger and Kenneth Cukier tell how most state parole
boards already use big data to determine the likelihood that a prisoner will commit a crime if
released. The Ohio Risk Assessment System (ORAS) has been implemented in prisons
throughout the state to identify low-risk criminals who are more likely to commit another crime
the longer they are in prison ("Prison breakthrough"). While most would agree that this category of prisoner should be granted parole, it goes without saying that, given the imperfections of any prediction, this type of software could lead to a lot of faulty decisions. In other words, no matter
how good the prediction algorithms are, some prisoners who would never commit another crime will be denied parole, and some who will commit a crime will be set free. This is only
the beginning of the path, however. Police departments are already seeking to go beyond the
current state of predictive policing to where algorithms can predict exact locations of crimes and
who will commit them. Imagine how perfect that would be, knowing who was going to commit a
crime before they did and being able to show up at the scene and intervene before anyone got
hurt. That all works fine until society wants to punish the would-be criminal for his would-be
actions. No action could be more misguided.
While assuredly society just wants to hold people accountable for the crimes they commit
and prevent them from committing crimes in the future, locking people up on the basis of crimes
they have not yet committed is a grave mistake. For starters, our entire system of justice is based
on the idea that one is innocent until it can be proven beyond a reasonable doubt that he committed the action of which he is accused. If the act has never been performed, there is obviously no way to prove that he did it, because it never happened. Then, there is the fact that, as mentioned
earlier, predictions are never perfect (if they were, then there would be no such thing as free
choice because everything would occur as predicted), and therefore, no matter how good a
predictive policing software is, there will be people jailed based on faulty predictions. On top of
this, linking people to actions they have not yet committed not only comes dangerously close to
thought crime reminiscent of Big Brother, but it also denies people the freedom of choice that is
fundamental to the human condition (Mayer-Schönberger and Cukier). People often have thoughts of performing wrongdoings of different levels of severity, from fleeting curiosities to serious considerations. What sets them apart from
criminals is that they choose not to act on those thoughts. We, as humans, have the power to
decide what thoughts to make reality, and holding people guilty for crimes they have not yet
chosen to commit is a violation of that very principle.
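The scale of those faulty decisions follows from simple base-rate arithmetic. The figures in this sketch are hypothetical, but the logic applies to any rare event: even a predictor that is right the vast majority of the time will flag far more innocent people than actual offenders.

# Back-of-the-envelope arithmetic with hypothetical numbers: even an
# impressively accurate predictor flags mostly innocent people when the
# predicted event is rare.
population = 100_000
base_rate = 0.01            # assume 1% would actually commit the crime
sensitivity = 0.90          # predictor catches 90% of true future offenders
false_positive_rate = 0.10  # and wrongly flags 10% of everyone else

true_offenders = population * base_rate                                # 1,000
caught = true_offenders * sensitivity                                  # 900
wrongly_flagged = (population - true_offenders) * false_positive_rate  # 9,900

precision = caught / (caught + wrongly_flagged)
print(f"{wrongly_flagged:.0f} innocent people flagged")
print(f"only {precision:.0%} of flagged people are actual offenders")  # ~8%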
Clearly, predictive policing can lead to some grim outcomes, so how do we move
forward? Its potential benefit to crime reduction can surely outweigh the downsides as long as
the pitfalls are avoided, so let us take a moment to examine the following necessities for good
predictive policing. The first is almost obvious: use good data. While on its face this seems ludicrously simple, omissions in the data can distort crime maps so that they inaccurately reflect how risky
various areas actually are. We have already examined the way bias can be introduced into the
data, and such biased data wreaks similar havoc on predictions. Also important is understanding
the factors that go into predicting hot spots. Knowing these factors matters in finding long-term
crime solutions and in preventing false associations. The RAND report gives the example that
while the location of crimes and where police officers spend time are strongly correlated, basing predictions on officer locations would be misguided because the police tend to show up wherever
crimes have been committed. On top of this, police departments using predictive policing
software must be wary of infringing civil liberties. Apart from concerns with systemic bias in the
predictive policing program, this is not usually a problem when identifying hot spots, where the data that goes into the algorithms is stripped of identifying information, or when sending police to those hot spots. The problem comes with identifying "hot people," individuals who are determined
suspicious through computer algorithms. While reasonable suspicion may be more relaxed when
dealing with a hot spot, applying the same analysis used when determining whether or not to
grant parole to free citizens steps into dangerous legal territory (Perry et al.). Identifying specific
people at risk is a misstep that invariably leads to stripping away civil liberties.
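That circularity can be illustrated with synthetic data. Nothing in this sketch is real; it simply shows how a variable like patrol time can correlate almost perfectly with future crime while adding no independent information.

import random

random.seed(1)

# Toy illustration of the RAND caution: patrol time correlates strongly
# with future crime only because both echo past crime, since officers
# are sent to past crime scenes. All numbers are synthetic.
n_blocks = 1000
past_crime = [random.randint(0, 10) for _ in range(n_blocks)]
patrol_time = [c + random.gauss(0, 1) for c in past_crime]   # patrols follow crime
future_crime = [c + random.gauss(0, 1) for c in past_crime]  # crime persists

def corr(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# High correlation, but patrol_time adds no causal information beyond
# past_crime; a model trained on it would just launder old crime data.
print(round(corr(patrol_time, future_crime), 2))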
Where does this leave us on predictive policing? It may reduce crime rates by a fair bit,
but embracing predictive policing in full force by predicting specific events and arresting
hypothetical criminals comes at the price of perpetuating racial bias, opening the gateway to
baseless police stops, and stripping away the freedom of choice. It is a tool that can be utilized to society's advantage, but only as a supplement to guide police on where to patrol. If society instead decides to use predictive policing to identify individuals to target or arrest, then it threatens to undermine our civil liberties as we know them today.

Works Cited
Friend, Zach. "Predictive Policing: Using Technology to Reduce Crime." FBI, 9 Apr. 2013, leb.fbi.gov/2013/april/predictive-policing-using-technology-to-reduce-crime. Accessed 15 Oct. 2016.
"Here's How Racial Bias Plays Out in Policing." The New York Times, The New York Times Company, 10 Aug. 2016, www.nytimes.com/2016/08/11/us/heres-how-racial-bias-plays-out-in-policing.html?_r=0. Accessed 7 Dec. 2016.
Hvistendahl, Mara. "Crime Forecasters." Science, vol. 353, no. 6307, 2016, pp. 1484-1487, doi:10.1126/science.353.6307.1484. Accessed 18 Oct. 2016.
Kaste, Martin. "Can Software That Predicts Crime Pass Constitutional Muster?" NPR, 26 July 2013, www.npr.org/2013/07/26/205835674. Accessed 15 Oct. 2016.
Mayer-Schönberger, Viktor, and Kenneth Cukier. Big Data: A Revolution That Will Transform How We Live, Work and Think. John Murray, 2013, pp. 157-163.
Perry, Walter L., et al. Predictive Policing. RAND Corporation, www.rand.org/content/dam/rand/pubs/research_reports/RR200/RR233/RAND_RR233.pdf. Accessed 7 Dec. 2016.
"Predictive Policing Software." PredPol, 2015, www.predpol.com. Accessed 19 Oct. 2016.
"Prison breakthrough." The Economist, The Economist Newspaper Limited, 19 Apr. 2014, www.economist.com/news/united-states/21601009-big-data-can-help-states-decide-whom-release-prison-prison-breakthrough. Accessed 7 Dec. 2016.