Margo McGehee, Associate Member, University of Cincinnati Law Review
Police have always striven to predict the people and places involved in criminal activity in order to prevent crime from happening in the first place. In recent years, software companies have started developing new technology to help law enforcement with this goal. “Predictive Policing Technology” (“PPT”) is software programming that analyzes large sets of crime data to identify the most likely locations, perpetrators, and victims of future crime. For example, PredPol, one of the most dominant predictive policing technologies on the market, runs crime data through a series of algorithms and pinpoints 10 to 20 “hot spot” areas where an officer is most likely to encounter crime during his or her next shift.
Dubbed a “holy grail of law enforcement,” PPT has become a multi-million dollar business, praised for its cost-effectiveness, progressivism, and ability to reduce crime. However, many of the nation’s largest police departments have recently withdrawn their support of these programs, and numerous independent organizations, such as the American Civil Liberties Union, have openly criticized PPT for its perpetuation of racial bias and profiling in law enforcement, questionable efficacy, and, most notably, its potential to infringe on people’s Fourth Amendment rights. PPT has not been extensively considered by courts, as the technology is still relatively new; however, the Fourth Circuit Court of Appeals recently issued an en banc decision discussing the issues surrounding PPT and its relation to the Fourth Amendment.
Part II of this article will provide a brief overview of the Fourth Amendment and its exceptions. Part III will summarize the Fourth Circuit’s decision in United States v. Curry and its discussion of PPT. Part IV will explore the dangers of PPT as it relates to the “reasonableness” requirement of the Fourth Amendment and various public policy concerns and will advocate for greater regulation and monitoring of PPT.
II. The Fourth Amendment
The Fourth Amendment protects the “right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures” and states that “no Warrants shall issue, but upon probable cause.” The Amendment’s protection extends not only to unreasonable searches of personal property, such as homes and cars, but also to the search and seizure of one’s person, including “brief investigatory stops” by law enforcement.
The Supreme Court has articulated certain exceptions to the Fourth Amendment’s warrant requirement that center around the “reasonableness” of a search or seizure. One exception is the “Terry stop,” which allows officers to conduct brief investigatory stops without securing a warrant if they have a reasonable belief that “criminal activity may be afoot” and that the person they stop “may be armed and presently dangerous.”
Another exception is the “exigent circumstances” doctrine, which applies when an emergency arises and the needs of law enforcement are so compelling that a warrantless search is objectively reasonable under the Fourth Amendment. The Supreme Court recognizes only a few “emergency conditions” that rise to the level of “exigent,” one of which is the need to “protect individuals who are threatened with imminent harm.” This is known as the “emergency aid” exception and applies when an urgent situation arises that affects someone’s health or safety (such as the need to break down the door of a house to rescue the occupants). The exigent circumstances doctrine typically applies only to the warrantless entry and search of private property, as opposed to the search and seizure of a person, but courts have allowed suspicionless searches of a person when officers can narrowly target their searches based on specific information about a known crime and a controlled geographic area. The Fourth Circuit relied on this exception when examining PPT.
III. United States v. Curry
In July 2020, the Fourth Circuit Court of Appeals issued an en banc decision addressing whether the Fourth Amendment’s exigent circumstances doctrine justified the suspicionless search of Billy Curry, Jr. Curry was charged with possession of a firearm by a convicted felon after police officers found a revolver on his person during a search. The police officers who searched Curry were responding to reports of gunfire in Curry’s apartment complex, and these officers were specifically assigned to monitor this area after the department’s PPT registered an uptick in shootings and homicides within the preceding months. Instead of stopping everyone close to the location of the reported gunshots, the officers focused their search on a public park at the rear of the apartment complex and only stopped people who were acting “suspicious.” Curry was walking alone in the park when he was ultimately stopped and searched by the officers acting in accordance with the PPT.
In an 8-6 decision, the Fourth Circuit agreed with the district court’s decision to grant Curry’s motion to suppress evidence of his revolver based on the unreasonable search that led to its discovery. The court determined that the officers did not conduct a valid Terry stop, nor did the exigent circumstances doctrine apply because the police’s searches were not isolated to a geographic area with clear boundaries or to a discrete group of people.
The concurring and dissenting opinions in Curry address the use of PPT and the various constitutional and public policy concerns it raises. The concurring judges argue that reliance on PPT may have a detrimental impact on the Fourth Amendment rights of people living in high crime areas and perpetuate racial bias and profiling within the criminal justice system. The dissenting judges not only disagree with the court’s holding that the conduct of the officers fell outside the scope of the exigent circumstances doctrine, but also argue that PPT makes communities safer and ensures that poorer, high-crime areas are not abandoned by law enforcement.
IV. The Dangers of Predictive Policing Technology
“Predictive policing” in the broadest sense has always been a technique used by police, even if subconsciously. Police departments and their officers develop an idea of which parts of town are more prone to crime than others, what times of the day, week, and year crimes most often occur, and other factors learned through time and experience. However, there is a natural check on this system because people know that human intuition is not always accurate or perfectly objective. The danger of PPT is that it operates under a veil of objectivity. Programs like PredPol analyze decades of crime data, weather patterns, and other variables and then spit out the coordinates of a city block where the next burglary is likely to occur. This result is more easily accepted as accurate and less biased than an officer acting on a “hunch.” But the very data these programs use is riddled with human error and biases.
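At its core, the kind of hot-spot prediction described above can be sketched in a few lines of code: bin historical incident reports into grid cells and rank the cells by count. Everything in this sketch is invented for illustration (the grid size, the coordinates, and the count-and-rank approach); commercial products like PredPol use far more elaborate, proprietary models, but the dependence on historical report data is the same.

```python
from collections import Counter

def predict_hot_spots(incidents, cell_size=0.005, top_k=10):
    """Rank map grid cells by historical incident count.

    incidents: list of (latitude, longitude) pairs from past reports.
    cell_size: grid resolution in degrees (illustrative value).
    Returns the top_k grid cells containing the most recorded incidents.
    """
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size))
        for lat, lon in incidents
    )
    return [cell for cell, _ in counts.most_common(top_k)]

# Toy data: six reports cluster on one block, three are scattered.
reports = [(39.103, -84.512)] * 6 + [(39.120, -84.500)] * 2 + [(39.150, -84.470)]
print(predict_hot_spots(reports, top_k=2))
```

Note what the sketch makes plain: the output is only a restatement of where incidents were previously recorded. If the input reports are skewed, the “predictions” are skewed in exactly the same way.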
One concern of PPT is that it will compromise people’s Fourth Amendment rights. This technology also raises significant public policy concerns, including the risk of promoting racial bias in policing. The following parts will explore each concern in turn.
A. Fourth Amendment Concerns
Arguably the most pressing danger of PPT is the effect its perceived objectivity and accuracy may have on Fourth Amendment interpretation, specifically the Fourth Amendment’s reasonableness requirement. Predictive policing tools may make it easier for police to find that individuals meet the reasonable suspicion standard, justifying more stops.
The Supreme Court recognizes that a predictive profile can be a relevant, if not controlling, factor in finding reasonable suspicion, and a computerized profile might give more credence to an officer’s finding of reasonable suspicion if the officer’s target happens to be in a crime “hot spot.” For example, absent predictive technology, if a patrolling officer saw a person looking into a parked car’s window, this alone would not be deemed sufficient activity to warrant a search of the person for suspected car theft. However, if the officer was patrolling the street based on a predictive program’s “tip” that a car theft would likely occur in the area, the officer would have a stronger argument that he had reasonable suspicion to stop and search the person.
The court in Curry was concerned about the deterioration of the Fourth Amendment under these technological advancements, especially for those living in areas with higher crime rates. Although PPT had indicated that a shooting was likely in Curry’s neighborhood and gunfire had in fact been reported, the court declined to find that the officers had reasonable suspicion to search Curry. The court maintained that a person’s presence in a high-crime area cannot alone create reasonable suspicion. Concurring, Judge Wynn went on to say that if the court had decided otherwise, it would give officers too much discretion to make suspicionless stops in similar circumstances.
Further, Judge Gregory conceded in his concurring opinion that PPT may become an effective tool for law enforcement, but noted that it creates greater tension between police performing their duties and the Fourth Amendment rights of those being policed. The Constitution demands a balance between keeping society safe and protecting individual liberty. In Gregory’s words, “[i]f merely preventing crime was enough to pass constitutional muster, the authority of the Fourth Amendment would become moot.” As technology changes rapidly, law enforcement, courts, and society as a whole must be prepared to ensure that the changes do not detrimentally impact already-existing rights.
B. Public Policy Concerns
Another danger of PPT is that it will reinforce and perpetuate racial bias in the criminal justice system. Proponents of PPT argue that the algorithm can predict future crimes more accurately and objectively than police officers relying on intuition alone, helping to combat racial bias in the system. However, as Judge Thacker stated in Curry, “[t]echnology cannot override human flaws. It stands to reason that any computer program or algorithm is only as good as the data that goes into it.”
The data used by PPT is far from objective, as historical crime data is infected with years of racial bias. Racial and ethnic minorities comprise a disproportionate share of the population in urban and traditionally “high crime” areas, meaning that they are disproportionately represented in the crime statistics of those areas. The use of this data produces a biased and inaccurate picture of a city’s crime landscape and assigns officers to “high crime” areas that hold a disproportionate number of racial and ethnic minorities. Further, recent studies show that some police departments rely on “dirty data”—data “derived from or influenced by corrupt, biased, and unlawful practices,” including discriminatory policing and manipulated crime statistics—to run their predictive policing systems. This creates a circularity problem: if historical crime data indicates that a low-income, urban area is a crime “hot spot,” and an increased number of officers are assigned to constantly patrol that area, any arrests or convictions in that area will feed back into the system and reinforce the conclusion that the area is at high risk for crime. As the stereotypes of these communities are reinforced by data, officers may feel more justified in their search and seizure of members of these communities.
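The circularity problem can be made concrete with a deliberately oversimplified simulation. Every number here is invented for illustration (four “cells,” an identical true crime rate everywhere, 90% detection where officers patrol versus 30% elsewhere); the point is only that a tiny skew in recorded data, fed back through patrol assignments, compounds into a large one.

```python
def simulate_feedback(rounds=50, cells=4):
    """Toy model of the feedback loop: patrols go to the cell with the
    most *recorded* incidents, and patrols record more of the crime they
    are present to observe. Every cell has the same true crime rate
    (2 incidents per round); only the records differ."""
    recorded = [10] + [9] * (cells - 1)   # one-incident head start in the data
    for _ in range(rounds):
        patrolled = recorded.index(max(recorded))   # the model's "hot spot"
        for cell in range(cells):
            true_incidents = 2                      # identical everywhere
            detect = 0.9 if cell == patrolled else 0.3
            recorded[cell] += round(true_incidents * detect)
        # next round's "prediction" is driven by these inflated records
    return recorded

print(simulate_feedback())  # [110, 59, 59, 59]
```

Even though every cell experiences exactly the same crime, the cell with a one-incident head start ends the simulation with nearly twice the recorded total of its neighbors, and the model would keep sending patrols there indefinitely.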
Crime data is also notoriously incomplete and inaccurate. The Department of Justice reports that half of crimes with victims go unreported, leading to incomplete police records. Police officers are also susceptible to human error and inevitably make mistakes in their paperwork, and these mistakes are fed into the predictive policing system. Further, arrest data is inconsistently used in these programs. Independent audits of several major police departments revealed that their predictive policing programs used arrest statistics to determine crime “hot spots” regardless of whether the arrests ultimately resulted in charges or convictions—a practice that overstates the presence of crime in heavily arrested areas.
A final concern of PPT is that too little is known about its effectiveness at preventing crime. Early implementers of PPT have praised it for its ability to decrease crime rates in the departments’ cities. However, very little independent data exists to verify the methodology of predictive technology. Of the few independent studies that examined PPT methodology, researchers found that the software had no statistically significant impact on crime reduction. Numerous departments across the country have ended their use of PPT after determining that it did not help reduce crime and provided information already being gathered by officers patrolling the streets.
Upon the Santa Cruz Police Department’s initial adoption of PPT in 2011, the department’s crime analyst stated that “[t]he worst-case scenario is that [the PPT] doesn’t work and we’re no worse off.” Despite best intentions, PPT has the potential to dramatically impact Fourth Amendment interpretation, as well as the day-to-day lives of many Americans. Predictive technology relies on inherently biased data that may lead to over-policing of areas disproportionately populated by racial and ethnic minorities. PPT may also influence courts’ interpretation of reasonable suspicion under the Fourth Amendment, resulting in more frequent suspicionless and discretionary searches and seizures by police.
If programmed and used correctly, PPT has the potential to be a useful supplementary tool to police officers as they strive to achieve their goal of keeping communities safe. But as it stands now, the costs of this technology outweigh the benefits. Predictive programming must do a better job of controlling for bias in data, and independent researchers must take on a greater role in monitoring these programs for accuracy. Finally, courts must remain alert to these rapid technological changes and ensure that constitutional rights are not compromised by rigorously applying existing Fourth Amendment standards to determine whether PPT stops are, in fact, justified.
 Andrew G. Ferguson, Policing Predictive Policing, 94 Wash. U. L. Rev. 1109, 1123 (2017).
 Nate Berg, Predicting crime, LAPD-style, The Guardian (Jun. 25, 2014), https://www.theguardian.com/cities/2014/jun/25/predicting-crime-lapd-los-angeles-police-data-analysis-algorithm-minority-report.
 Tim Lau, Predictive Policing Explained, Brennan Center for Justice (Apr. 1, 2020), https://www.brennancenter.org/our-work/research-reports/predictive-policing-explained.
 Ellen Huet, Server And Protect: Predictive Policing Firm PredPol Promises To Map Crime Before It Happens, Forbes (Mar. 2, 2015), https://www.forbes.com/sites/ellenhuet/2015/02/11/predpol-predictive-policing/?sh=590d8ca04f9b.
 Andrew G. Ferguson, Predictive Policing and Reasonable Suspicion, 62 Emory L. J. 259, 269-70 (2012).
 Lau, supra note 3; Leila Miller, LAPD will end controversial program that aimed to predict where crimes would occur, Los Angeles Times (Apr. 21, 2020), https://www.latimes.com/california/story/2020-04-21/lapd-ends-predictive-policing-program?eType=EmailBlastContent&eId=f3aa6ff4-fdc5-4596-b96a-2c0fe443df39.
 Predictive Policing Today: A Shared Statement of Civil Rights Concerns, American Civil Liberties Union (Aug. 31, 2016), https://www.aclu.org/other/statement-concern-about-predictive-policing-aclu-and-16-civil-rights-privacy-racial-justice.
 United States v. Curry, 965 F.3d 313 (4th Cir. 2020) (en banc) (8-6 decision).
 U.S. Const. amend IV.
 United States v. Kehoe, 893 F.3d 232, 237 (4th Cir. 2018).
 Terry v. Ohio, 392 U.S. 1, 30 (1968).
 Mincey v. Arizona, 437 U.S. 385, 393-94 (1978).
 Welsh v. Wisconsin, 466 U.S. 740, 749–50 (1984).
 Carpenter v. United States, 138 S. Ct. 2206, 2223 (2018) (citing Kentucky v. King, 563 U.S. 452, 460 & n.3 (2011)).
 King, 563 U.S. at 460; Wayne v. United States, 318 F.2d 205, 212 (D.C. Cir. 1963).
 United States v. Yengel, 711 F.3d 392, 396 (4th Cir. 2013) (citing Mincey, 437 U.S. at 392–94).
 Curry, 965 F.3d at 315.
 Id. at 318.
 Id. at 325.
 Id. at 316.
 Id. at 344-45 (Thacker, J., concurring); Id. at 334 (Gregory, J., concurring); Id. at 336-37 (Wynn, J., concurring).
 Id. at 350-51, 355 (Richardson, J., dissenting); Id. at 346, 349 (Wilkinson, J., dissenting).
 Chandler Harris, Richmond, Virginia, Police Department Helps Lower Crime Rates with Crime Prediction Software, Government Technology (Dec. 21, 2008), https://www.govtech.com/public-safety/Richmond-Virginia-Police-Department-Helps-Lower.html.
 Lau, supra note 3.
 Reid v. Georgia, 448 U.S. 438, 441 (1980).
 Ferguson, supra note 5, at 308.
 Curry, 965 F.3d at 316.
 Id. at 331.
 Id. at 337 (Wynn, J., concurring).
 Id. at 334 (Gregory, J., concurring).
 Lau, supra note 3.
 Curry, 965 F.3d at 345 (Thacker, J., concurring).
 United States v. Black, 707 F.3d 531, 542 (4th Cir. 2013).
 Lau, supra note 3.
 Ferguson, supra note 1, at 1146.
 Id. at 1146-47.
 Lau, supra note 3.
 Ferguson, supra note 5, at 270.
 Justin Jouvenal, Police are Using Software to Predict Crime. Is it a ‘Holy Grail’ or Biased Against Minorities?, The Washington Post (Nov. 17, 2016), https://www.washingtonpost.com/local/public-safety/police-are-using-software-to-predict-crime-is-it-a-holy-grail-or-biased-against-minorities/2016/11/17/525a6649-0472-440a-aae1-b283aa8e5de8_story.html.
 Erica Goode, Sending the Police Before There’s a Crime, N.Y. Times (Aug. 15, 2011), https://www.nytimes.com/2011/08/16/us/16police.html.