by Leah Luckett, Associate Member, University of Cincinnati Law Review Vol. 94
I. Introduction
Facial recognition technology is not new to law enforcement. As early as 1998, the United States government used grants to develop facial recognition technology to identify subjects on the internet, identify and locate missing and exploited children, and fight child pornography on the web.1 As technology has advanced since 1998, artificial intelligence (“AI”) has elevated the capabilities of facial recognition technology. In fact, facial recognition could become one of the most powerful applications of AI for law enforcement and surveillance practices.2 Facial recognition through AI automates the comparison of human faces, and law enforcement can use this technology to identify individuals connected to criminal activity.3 While this ability brings many advantages, the information the government uses to advance this technology often comes from an extensive range of sources. For instance, the software can compile facial images from social media platforms like Facebook or Instagram as well as from video cameras in public or private settings.4 Using technology to recognize individuals and subsequently pursue them for possible criminal activity raises privacy issues. Although there is a right to privacy under the Fourth Amendment, it is unclear whether that right includes a right to be identified only by those who know or recognize us and no one else.5 If that question is answered in the affirmative, facial recognition through AI could potentially be a search of the person that violates constitutional rights.
The Supreme Court has given guidelines on what constitutes a search in the context of certain technologies and data compilation.6 While this guidance is important, the Court has not addressed limits on AI or facial recognition in criminal investigations or arrests and has consistently veered away from such topics.7 With the vast amount of data now available to law enforcement, AI offers a uniquely fast way to search through that data. However, the ability to search through this data and use it to solve crimes opens the door to potential Fourth Amendment issues.
This Article explores the potential impact of the intersection between AI, specifically facial recognition technology, and the Fourth Amendment, particularly as it pertains to searches. Part II provides background on the Fourth Amendment and details the Court’s Fourth Amendment jurisprudence as it relates to new technology. Next, Part III discusses the implications of prior case law for the limits of AI and facial recognition. Part III also discusses current legislative actions governing AI in the United States as well as in Europe. Finally, Part IV offers a brief conclusion on the current state of facial recognition as it pertains to the Court.
II. Background
The Fourth Amendment to the Constitution states that people have the right:
. . . to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, [which] shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.8
The standard for assessing violations of the Fourth Amendment revolves around one’s reasonable expectation of privacy.9 As technology has advanced and the ability to conduct searches has changed, the Court and lawmakers have attempted to shape Fourth Amendment doctrine to reflect these developments while also affirming the traditional principles of the Fourth Amendment and the reasonable expectation of privacy.10 A few landmark Supreme Court cases illustrate the considerations at play when advancing technology allows the government to retrieve information in a way previously unavailable to it.
Kyllo v. United States is one of the turning-point Supreme Court cases addressing law enforcement’s use of advanced technology to conduct searches.11 The case involved the use of thermal imaging on a home to detect marijuana plants.12 Emphasizing the important protection that the home affords, the Court ruled that using sense-enhancing technology to gain information that could not otherwise have been obtained without a physical intrusion into the home was a search under the Fourth Amendment.13 The Court reasoned that there is a reasonable expectation of privacy in a home, including an expectation that it will not be subjected to thermal imaging by the government.14 Kyllo reinforced the commitment to preserve the degree of privacy against the government that existed at the time of the Amendment’s adoption and emphasized the need to ensure citizens are not left at the mercy of advancing technology.15
Riley v. California established additional limits on law enforcement’s use of technology under the Fourth Amendment.16 The Court unanimously held that police officers could not, without a warrant, search digital information on cell phones seized from defendants during arrests.17 Although the case is particular to searches incident to arrest, the Court expressed fundamental ideas about advancing technology.18 The Court found that privacy-related concerns could be enough to outweigh the benefit of a government search, or at least require a warrant, when large amounts of personal data are involved, even in the case of an arrest.19 Riley acknowledged that a cell phone search would typically expose to the government far more than the most exhaustive search of a house.20 The Court stated, “The sum of an individual’s private life can be reconstructed through a thousand photographs labeled with dates, locations, and descriptions; the same cannot be said of a photograph or two of loved ones tucked into a wallet.”21 This underscores the Court’s recognition that information of such detail is not something that should be freely available to law enforcement.
Most recently, the Supreme Court addressed third-party data and the reasonable expectation of privacy an individual has when exposing personal information to a third party. In Carpenter v. United States, the Court found that the government’s acquisition of the defendant’s historical cell site location information (“CSLI”) from wireless carriers was a search under the Fourth Amendment.22 Although the information had been exposed to a third party, which ordinarily reduces the privacy afforded to an individual, that exposure was not enough to deny the defendant Fourth Amendment protection.23 The case is an important step in deciding the extent to which third-party information can be searched, or whether it should be searched at all. The Court explained that because the case involved digital data maintained by a third party, it did not fit neatly under existing precedents.24 Departing from the traditional third-party doctrine, the Court found that an individual maintains a legitimate expectation of privacy in the record of their physical movements, as captured through CSLI, and therefore the information obtained from Carpenter’s wireless carriers was the product of a search.25
Location records, like many other records, hold the privacies of life, the Court noted.26 Even considering how easy and efficient these tracking techniques were for law enforcement, the Court concluded that this convenience did not outweigh the privacy concerns their use presented.27 The lack of limitations on the use of CSLI, given its revealing nature, heavily influenced the Court and provides a framework for thinking about limitations on AI.28
These three cases, in conjunction with many others, define the last few decades of privacy and technology jurisprudence under the Fourth Amendment. The Court has consistently emphasized the importance of maintaining privacy even in the evolving sphere of technology.29 As time passes, reconciling these principles with arguments for efficiency and safety in police practices becomes increasingly difficult. AI is so commonly used in so many different practices that it will be difficult to argue the government or police cannot take advantage of it in some way.30 In many ways, the government already has.31
III. Discussion
A. Application of Fourth Amendment Jurisprudence to AI
The government has made clear that it uses facial recognition and face capture as powerful AI technologies to support critical law enforcement investigations.32 These investigations can include comparing faces captured at crime scenes or by surveillance cameras against databases of known faces to identify suspects or even missing persons.33 The problem with using AI facial recognition is that the model often pulls from sources like public video cameras, social media, and mugshot databases.34 If this continues unchecked, individuals could reach a point where there is surveillance footage of their every movement. Even within their own homes, many Americans choose to install cameras, both inside and outside, to protect against theft.35 When the government uses this data for facial recognition, little privacy remains. Applying the Court’s past jurisprudence to AI-driven facial recognition, each case reflects the tension between established Fourth Amendment doctrines and the new challenges that AI presents.36
While the use of AI in criminal investigations is not identical to the thermal imaging in Kyllo, if the Court applies the principles underlying Kyllo, individuals would be afforded constitutional protection similar to what they enjoy in the home. Kyllo stressed that the home, a constitutionally protected area, had been violated, and one could argue that using facial recognition without someone’s consent is a search of their person.37 Further, individuals tend to have a reasonable expectation that when they go places or do things in public, their faces are not being captured constantly with each step they take.38 As with the thermal imaging in Kyllo, most citizens reasonably expect that, although they may be recorded at times, their faces are not being continually captured and used for facial recognition in criminal investigations.39
If the Court chose to extend the rationale from Riley, facial recognition through AI would likely qualify as a search because of the vast amount of personal information that the technology uses on an everyday basis.40 Like a cell phone, facial recognition technology presents images with corresponding dates, locations, and times.41 In Riley, this was a major reason the Court concluded that the search of the cell phone violated Fourth Amendment rights, requiring a warrant even in the case of an arrest.42 A similar argument could be made for facial recognition and the type and amount of information it so readily possesses.
Lastly, the Court could choose to extend the third-party limitations in Carpenter to facial recognition data. The type of data that AI can produce through facial recognition requires the collection of a wide range of information.43 This information is frequently taken from third parties, such as public video cameras or social media websites, which could allow the Court to place restrictions on the information being used in AI models for facial recognition.44 Like the CSLI in Carpenter, facial recognition data is compiled frequently, and if this data is compiled over time, it can quickly intrude upon a reasonable expectation of privacy.45 The Court could find that the ability to track someone through recognition so frequently must require a warrant or probable cause.46
People do not meaningfully choose to expose themselves to facial recognition technology. Stepping out in public does not, and should not, imply that your facial information will be collected consistently over time such that the sum of your private life could be reconstructed.47 The argument that individuals expose themselves to third parties, or assume this risk simply by being in public, does not have strong footing.48 As in Carpenter, where a cell phone tracked the defendant’s every movement, surveillance is now so common that, if allowed to continue without regulation, facial recognition would be everywhere and akin to an ankle monitor.49
B. Legislation
Cases of law enforcement misuse of facial recognition technology have already raised concerns that the value of this technology may not be worth the cost.50 Research shows that facial recognition technology has much higher error rates for people of color, women, and older adults.51 Many states have begun to put facial recognition laws in place to limit the invasion and disruption of privacy that this technology presents.52 In Oregon, for instance, Portland has a limited ordinance that restricts police use of facial recognition with body cameras.53 Maine, Massachusetts, Montana, and Utah all require a warrant, probable cause, or a court order to use facial recognition techniques.54 Other states allow facial recognition only when investigating a serious crime.55 Additionally, some states require that the defendant receive notice of law enforcement’s use of facial recognition.56 States like Colorado and Virginia require that facial recognition be subject to independent testing and accuracy standards to ensure the technology is not biased across demographics.57 Moreover, the European Union has already enacted its AI Act to prevent the misuse of this technology by law enforcement.58 In fact, the Act almost completely bans real-time facial recognition by law enforcement.59 It is likely that states will continue to regulate facial recognition through individual legislation until the Supreme Court sets definitive limits on facial recognition and addresses the true burden it places on privacy and the Fourth Amendment in relation to crime.
Both state legislation and broader worldwide legislation make clear that limitations on this technology are needed. However, it may make more sense for legislative bodies to pursue this topic than for the Court to do so. Legislation is likely to be much quicker and more malleable over time than case law. Further, states are likely to learn from other states and countries implementing these laws, leading to the gradual development of an acceptable standard for AI and facial recognition.
IV. Conclusion
The Court has not directly addressed or ruled on a case implicating invasive use of AI or facial recognition in the name of government safety or regulation. On top of that, it is not certain that the Court ever will. As addressed above, the Court may allow legislation to tackle the issue. However, if the Court chooses to address the issue, cases like Riley, Carpenter, and Kyllo provide a small snapshot of the framework the Court may use when weighing privacy interests against facial recognition and AI. It is clear the Court recognizes the fundamental issues that advancing technology raises, as in each case it addressed the importance of understanding technology’s ability to provide the government with more information than is reasonably necessary or reasonably expected. As in those cases, when the government decides to use data analysis and facial recognition technologies to solve crimes, it implicates the privacy interests of individuals. Although many questions linger, it is clear that AI and facial recognition improve efficiency but may just as easily impair our privacy in a way the Court has not yet assessed.
References
1. History of NIJ Support for Face Recognition Technology, Nat’l Inst. of Just. (Mar. 5, 2020), https://nij.ojp.gov/topics/articles/history-nij-support-face-recognition-technology [https://perma.cc/VFN9-F9QG].
2. Monika Simmler & Giulia Canova, Facial Recognition Technology in Law Enforcement: Regulating Data Analysis of Another Kind, Comp. L. & Security R., Mar. 2025, at 1, 1, https://www.sciencedirect.com/science/article/pii/S0267364924001572?via%3Dihub [https://doi.org/10.1016/j.clsr.2024.106092].
3. Id.
4. M. Qandeel, Facial Recognition Technology: Regulations, Rights and the Rule of Law, Frontiers in Big Data, June 4, 2024, at 1, 2, https://www.frontiersin.org/journals/big-data/articles/10.3389/fdata.2024.1354659/full [https://doi.org/10.3389/fdata.2024.1354659].
5. Carpenter v. United States, 585 U.S. 296, 303 (2018).
6. See id.
7. See Riley v. California, 573 U.S. 373 (2014).
8. U.S. Const. amend. IV.
9. Katz v. United States, 389 U.S. 347, 360 (1967).
10. See Carpenter v. United States, 585 U.S. 296, 316 (2018).
11. Kyllo v. United States, 533 U.S. 27, 35 (2001).
12. Id.
13. Id. at 40.
14. Id. at 35.
15. Id.
16. Riley v. California, 573 U.S. 373 (2014).
17. Id. at 381.
18. Id.
19. Id. at 392.
20. Id. at 394.
21. Id.
22. Carpenter v. United States, 585 U.S. 296, 320 (2018).
23. Id. at 304.
24. Id. at 318.
25. Id. at 310.
26. Id. at 350.
27. Id. at 311.
28. Id. at 320.
29. See Riley v. California, 573 U.S. 373 (2014); Carpenter v. United States, 585 U.S. 296, 320 (2018).
30. Jon L. Mills & Caroline S. Bradley-Kenney, Surveillance and Policing Today: Can Privacy and the Fourth Amendment Survive New Technology, Artificial Intelligence and a Culture of Intrusion?, 33 U. Fla. J. L. & Pub. Pol’y 183, 185 (2023).
31. Will Nesbit, Comment, A Matter of Time: Artificial Intelligence, the Fourth Amendment, and Changing Privacy Expectations, 73 Am. U. L. Rev. F. 237, 247 (2024).
32. 2024 Update on DHS’s Use of Face Recognition & Face Capture Technologies, Dep’t of Homeland Sec. (Jan. 16, 2025), https://www.dhs.gov/archive/news/2025/01/16/2024-update-dhss-use-face-recognition-face-capture-technologies [https://perma.cc/Y2JV-TUXB].
33. Id.
34. Id.
35. Mills & Bradley-Kenney, supra note 30, at 207.
36. Id. at 240.
37. Kyllo v. United States, 533 U.S. 27, 35 (2001).
38. Mills & Bradley-Kenney, supra note 30, at 192.
39. Kyllo, 533 U.S. at 34.
40. Riley v. California, 573 U.S. 373, 397 (2014).
41. Mills & Bradley-Kenney, supra note 30, at 192.
42. Riley, 573 U.S. at 394.
43. Mills & Bradley-Kenney, supra note 30, at 188.
44. Id. at 184.
45. Id.
46. Nesbit, supra note 31, at 262.
47. Riley, 573 U.S. at 394.
48. Mills & Bradley-Kenney, supra note 30, at 199.
49. Carpenter v. United States, 585 U.S. 296, 312 (2018).
50. Paige Gross, Facial Recognition in Policing is Getting State-by-State Guardrails, Va. Mercury (Feb. 11, 2025), https://virginiamercury.com/2025/02/11/facial-recognition-in-policing-is-getting-state-by-state-guardrails/ [https://perma.cc/CR9V-AW2A].
51. Id.
52. Jake Laperruque, Status of State Laws on Facial Recognition Surveillance: Continued Progress and Smart Innovations, Tech Pol’y Press (Jan. 6, 2025), https://www.techpolicy.press/status-of-state-laws-on-facial-recognition-surveillance-continued-progress-and-smart-innovations/ [https://perma.cc/M3DG-MC6T].
53. Portland, Or., City Code § 34.10 (2021).
54. Laperruque, supra note 52.
55. Id.
56. Id.
57. Id.
58. Prathi Chowdri, Facial Recognition in Law Enforcement: Promises and Pitfalls, Lexipol (May 23, 2025), https://www.lexipol.com/resources/blog/facial-recognition-in-law-enforcement-promises-and-pitfalls/ [https://perma.cc/U8DN-B8A9].
59. Id.
