Facial Recognition Technology: How Will Lawmakers and the Courts Respond to the Growing Demand for Policy Development?

Arizona Law Journal of Emerging Technologies
Volume 4 Article 4, 01-2021



Claire Bosarge[1]*

I. Introduction

“Imagine a government tracking everywhere you walked over the past month without your permission or knowledge . . . [or] a database of everyone who attended a political rally that constitutes the very essence of free speech . . . This has long been the stuff of science fiction and popular movies – like . . . ‘1984’ – but now it’s on the verge of becoming possible.” – Brad Smith, President of Microsoft[2]

This excerpt offers a glimpse of a future America in which the commercial and governmental use of facial recognition technology (FRT) persists without federal regulation.[3] FRT is a rapidly advancing biometric authentication method that identifies or verifies the identity of a person by comparing specific facial features detected in an image or video to faces stored within a database.[4]

Although the most well-known use of FRT is by law enforcement agencies, there are numerous other entities that utilize FRT, such as cell phone manufacturers, universities, social media companies, and retailers.[5] For instance, during American pop star Taylor Swift’s Reputation tour, a mesmerizing screen displaying rehearsal footage was secretly used to scan and compare fans’ faces to images of hundreds of the star’s known stalkers.[6] Information on the use, collection, and storage of facial recognition data is scarce, but a 2016 study, released by the Center on Privacy & Technology at Georgetown Law, reported that the face of one in two American adults is in a facial recognition database.[7]

One false match in a facial recognition system can result in missed flights, police interrogations, or even a false arrest.[8] Nevertheless, the global market for FRT is projected to grow to $7 billion in 2024, from $3.2 billion in 2019.[9] The drastic expected growth is due to the escalating use of FRT in commercial applications.[10] Fittingly, it is expected that by 2023, U.S. Customs and Border Protection will have the ability to scan the faces of 97% of commercial airline passengers departing the U.S.[11]

Although FRT has become mainstream,[12] widespread, unregulated use of FRT creates serious privacy concerns.[13] While some state and local governments have placed restrictions on the use of FRT, the federal government has struggled to gain traction in limiting the use of FRT by federal agencies.[14] Moreover, the U.S. Supreme Court has yet to hear a case regarding the use of FRT.[15] However, in Carpenter v. United States, the Supreme Court held that the government must obtain a warrant to acquire cell-site location information (CSLI) from a cellular provider.[16] The Court declared that an individual maintains a legitimate expectation of privacy, for Fourth Amendment purposes, “in the record of his physical movements as captured through [CSLI].”[17] Given the similarities between CSLI and facial recognition data, the holding in Carpenter may be extended to a case challenging large-scale surveillance through the use of FRT.[18]

This Article addresses the privacy concerns presented by the widespread, unregulated implementation of FRT and explores possible responses by the legislature and the courts. Part II details the mechanics of FRT and recent technological developments. Part III addresses why FRT may be characterized as a double-edged sword by touching on (1) the various applications of FRT in different industries, (2) the fallibility of FRT, and (3) the criticisms of FRT that drive the need for policy development. Part IV provides a detailed analysis of Carpenter and predicts how the courts may interpret its holdings when faced with a case challenging FRT. Part V demonstrates the need for federal regulations restricting FRT, surveys the pending and enacted state and local legislation imposing restrictions on the use of FRT, and proposes model legislation for regulating FRT. Part VI discusses the pervasiveness of surveillance cameras equipped with FRT in China to demonstrate the dystopian society that can result from the widespread, unregulated use of FRT.

II. How Does Facial Recognition Technology Work?

Human beings have an innate ability to recognize and distinguish human faces, but only within the past fifty years have computers been programmed to exercise the same ability.[19] Using computer algorithms, facial recognition systems measure and analyze distinguishable landmarks, or nodal points, that exist on the human face.[20] Specific measurements–such as the distance between the eyes, width of the nose, depth of the eye sockets, and length of the jaw line–may be extracted from either a two-dimensional (2-D) face image or a three-dimensional (3-D) face model.[21] The 3-D face recognition software captures the distinct geometry of a face from multiple angles.[22] By using depth and an axis of measurement that is unaffected by lighting, 3-D face recognition systems can detect a face in darkness and have the ability to recognize “a face in profile” if the head is positioned perpendicular to the camera’s line of view.[23]
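The nodal-point measurements described above can be sketched in a few lines of code. In this illustrative sketch, the landmark coordinates and the choice of features are hypothetical; it merely shows how a 2-D system might reduce detected landmarks to a set of geometric measurements:

```python
import math

def distance(p, q):
    """Euclidean distance between two 2-D landmark coordinates (in pixels)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical landmark coordinates detected on a single face image.
landmarks = {
    "left_eye":  (120, 150),
    "right_eye": (180, 150),
    "nose_tip":  (150, 195),
    "chin":      (150, 260),
}

# A few of the nodal-point measurements a system might extract.
features = {
    "eye_distance": distance(landmarks["left_eye"], landmarks["right_eye"]),
    "eye_to_nose":  distance(landmarks["left_eye"], landmarks["nose_tip"]),
    "nose_to_chin": distance(landmarks["nose_tip"], landmarks["chin"]),
}
print(features)
```

In practice, such raw pixel distances would be normalized (for example, by the inter-eye distance) so that the measurements do not depend on how far the subject stands from the camera.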

Despite advances in FRT that allow for 3-D facial imaging, many facial recognition systems still rely on 2-D imaging for the sake of convenience, as most images in facial recognition databases are stored in 2-D format.[24] The first systems were developed in the 1960s and operated by comparing 2-D images.[25] The difficulty posed by 2-D systems is that the newly captured image and the image within the database must be an equal distance from the camera and have similar lighting, similar facial expressions, and similar facial alignment.[26] A deviation in light or orientation reduces the system’s ability to correctly match the newly captured image with a stored image, which in turn reduces recognition accuracy.[27] Partly for these reasons, 3-D systems have emerged and have proven more accurate than their 2-D counterparts.[28]

The process of identifying and verifying the identity of an individual through a facial recognition system involves several steps.[29] First, the system receives either a still photo, like those taken upon arrival at a U.S. airport, or a frame from a video of a person in motion.[30] Once the system detects a face within the 2-D image or video, it scales, rotates, and aligns the face “so that every face that the algorithm processes is in the same position.”[31] When the front of the head is facing the camera, the face is in the best position for detection.[32] However, the system is capable of recognizing a face so long as the head is not rotated more than thirty-five degrees away from the camera in a 2-D system, or more than ninety degrees in a 3-D system.[33]

Once the face is aligned, the facial recognition system extracts certain facial features and measures the curves of the subject’s face on a sub-millimeter scale to create a template.[34] The template is converted into a unique, numerical code called a faceprint.[35] The faceprint can be compared to other faceprints within the database to find a potential match.[36] If FRT is used for verification purposes, or to confirm a subject’s identity, the image is matched to only one other image in the database.[37] If FRT is used for identification purposes, the algorithm will compare the image to other existing images in the database and generate a “numerical score reflecting the similarity of their features.”[38]
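The 1:1 verification and 1:N identification modes just described can be illustrated with a toy sketch. The four-dimensional faceprints, the 0.95 threshold, and the use of cosine similarity as the scoring function are all assumptions made for illustration; commercial systems use proprietary metrics over far higher-dimensional templates:

```python
import math

def similarity(a, b):
    """Cosine similarity between two faceprint vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def verify(probe, enrolled, threshold=0.95):
    """Verification (1:1): compare the probe against one enrolled faceprint."""
    return similarity(probe, enrolled) >= threshold

def identify(probe, database):
    """Identification (1:N): score the probe against every faceprint on file,
    returning (name, similarity) pairs ordered from best to worst match."""
    scores = [(name, similarity(probe, fp)) for name, fp in database.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

# Hypothetical enrolled faceprints.
database = {
    "alice": [0.9, 0.1, 0.4, 0.3],
    "bob":   [0.2, 0.8, 0.5, 0.1],
}
probe = [0.88, 0.12, 0.41, 0.29]  # newly captured faceprint

print(verify(probe, database["alice"]))
print(identify(probe, database))
```

Note that identification only ever yields a ranked list of candidates with similarity scores; whether the top score counts as a “match” is a policy choice, which is one reason false matches occur.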

A recent development in FRT, known as Surface Texture Analysis (STA), analyzes skin biometrics or the uniqueness of skin texture to produce even more accurate results.[39] With a picture of a patch of skin, called a skinpatch, STA uses “algorithms to turn the patch into a mathematical, measurable space.”[40] The software is then able to distinguish “the actual skin texture” as well as any lines or pores within the skinpatch.[41] STA is so advanced that it can “identify differences between identical twins, which is not yet possible using facial recognition software alone.”[42] STA software may be used separately or in conjunction with other methods of FRT to increase accuracy.[43] According to one biometrics company, combining FRT with STA increases identification accuracy by 20 to 25%.[44]
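The vendor claim above is that combining FRT with STA improves identification accuracy. One common way a system might combine two matchers is weighted score fusion, sketched below; the weight, the scores, and the twin scenario are hypothetical, chosen only to show how a skin-texture score can break a tie that face geometry alone cannot:

```python
def fuse(face_score, skin_score, face_weight=0.7):
    """Weighted-sum fusion of a face-geometry score and an STA skin-texture score.
    Both scores are assumed normalized to [0, 1]; the weight is hypothetical."""
    return face_weight * face_score + (1 - face_weight) * skin_score

# Two candidates whose face geometry is nearly identical (e.g., identical twins);
# the skin-texture score separates them.
candidates = {
    "twin_a": fuse(face_score=0.97, skin_score=0.92),
    "twin_b": fuse(face_score=0.96, skin_score=0.35),
}
best = max(candidates, key=candidates.get)
print(best, candidates)
```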

Facial recognition systems are constantly advancing, as evidenced by the development of STA and even more recent developments like “real-time emotion recognition,” which maps a subject’s facial expressions to detect emotions such as anger, fear, and surprise.[45] However, this may create a false sense of progress, given that each of these advancements improves only the accuracy of 3-D face recognition systems and cannot be applied to systems that rely on 2-D images. Because most facial recognition systems in use today rely on 2-D camera technology,[46] the inaccuracy of facial recognition systems remains unresolved.[47]

III. Facial Recognition Technology: A Double-Edged Sword

a. The Many Uses of FRT in Law Enforcement and Beyond

Law enforcement agencies can benefit from FRT in several different contexts.[48] An officer on duty who encounters someone who is unable to identify themselves can take a photo of the individual and use facial recognition software to see if the photo matches any of the photos in the officer’s database, which may include “mug shots, driver’s license photos, or face images from unsolved crimes.”[49] If there is video or photographic evidence of a suspect’s face, then FRT also may be used to search an image against a database during an investigation.[50] Another common use of FRT is during “Real-time Video Surveillance,” when police officers possess images of specific individuals they are trying to locate.[51] Once these images are uploaded to a database known as a “hot list,” FRT is used to extract facial images from a live video surveillance feed and to compare them to the images on the “hot list.”[52] Each individual who walks within the video camera’s range of detection may be subject to this process.[53] The same FRT method can be applied to compare archived video images to a “hot list” database.[54]

As FRT becomes less expensive, more and more industries will begin to use it.[55] FRT is used in airports to verify that a foreign traveler in a database is the same person who seeks entry into the United States.[56] Some banks use FRT at ATMs and check-cashing kiosks in order to allow their customers to verify their identity using their faceprint in place of a personal identification number or card swipe.[57] There are healthcare mobile phone applications that use FRT to detect rare genetic disorders such as Cornelia de Lange syndrome and Angelman syndrome.[58] FRT is used in retail stores to identify known shoplifters who walk into the store.[59] An individual who is caught shoplifting in one store may have a digital record of their face shared with other store owners across the country who use the same FRT company.[60] One FRT provider stated that the police are automatically alerted any time the retail store’s facial recognition system detects a known shoplifter’s face, even if that person is not shoplifting at the time.[61] Whether users of Apple’s iPhone X, iPhone 11, or iPhone 12 realize it or not, each time they gain access to their cellular device using “Face ID,” they are using a form of FRT, as their “faceprint [is] mapped by the phone’s front-facing camera.”[62] Social media platforms, such as Facebook, utilize facial recognition software to “identify human faces in pictures uploaded to the [app] with up to 97% accuracy.”[63]

FRT simultaneously serves the public welfare and raises serious privacy concerns.[64] For example, FRT is capable of tracking an individual’s movements for purposes of long-term surveillance of their daily life.[65] But it is also capable of identifying a missing or lost child wandering the street.[66] FRT may be used to identify every attendee at a political rally without their consent.[67] But it may also allow law enforcement officials to identify a suspected terrorist who is present at that rally and intends to harm those in attendance.[68] FRT is a double-edged sword, and these conflicting uses illustrate the need “for thoughtful government regulation and for the development of norms around acceptable uses.”[69]

b. The Fallibility of Facial Recognition Technology

Though FRT companies are steadily improving their facial recognition systems to overcome certain technical challenges, the technology remains far from perfect.[70] Unlike fingerprints or DNA, faces inevitably change over time.[71] For example, a subject’s face can change over time due to fluctuation in body weight, change in hairstyle, growth or removal of facial hair, and the effects of aging.[72] Other factors that may interfere with an algorithm’s ability to detect a subject’s faceprint include the wearing of eyeglasses or sunglasses, and hair that obscures distinguishing nodal points.[73]

A more recent challenge to FRT is the growing popularity of facial plastic surgery, which can dramatically change the relationship between certain nodal points.[74] One study found “that appearance, feature-, and texture-based [facial recognition] algorithms are unable to effectively mitigate the variations caused by plastic surgery procedures.”[75] Similarly, the popularity of “beautification apps,” or mobile phone applications that allow users to retouch and reshape the face in an image, poses a challenge to FRT.[76] Finally, the greater the number of faces stored in a facial recognition system’s database, the less effective the system, because more faces look similar to one another.[77] Although manufacturers of FRT are constantly improving their products to address these limitations, erroneous facial recognition results can have disturbing effects, the most devastating of which is putting innocent subjects behind bars.[78]

c. Criticisms of Facial Recognition Technology: Why People are Begging for Policy Development

Erroneous FRT evidence can produce wrongful convictions, enhance racial discrimination, and may be used by the police to punish individuals for political expression.[79] Researchers from MIT and Stanford University analyzed the accuracy of facial recognition software in identifying skin type and gender.[80] The researchers compiled a database of over 1,200 images in which women and dark-skinned individuals were better represented than individuals who fell into neither category.[81] Each image was assigned a score–I, II, III, IV, V, or VI–based on the Fitzpatrick skin tone scale.[82] The researchers found that such systems had an error rate of no more than 0.8% when identifying white males, but error rates of 20.8%, 34.5%, and 34.7% when identifying darker-skinned women assigned scores of IV, V, and VI, respectively.[83] A test conducted by the American Civil Liberties Union (ACLU) revealed that Amazon’s facial recognition software “incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime.”[84] Almost 40% of the incorrect matches were of people of color, even though they make up only 20% of Congress.[85] These studies demonstrate why adversaries of FRT fear that it may “exacerbate the disproportionate surveillance of minority communities, particularly people of color.”[86]

A further criticism raised by opponents of FRT is that it endangers Americans’ right to anonymity when participating in certain activities protected by the First Amendment, such as protests and political rallies.[87] An investigation conducted by the ACLU revealed that during the protests that erupted after Freddie Gray’s death, the Baltimore Police Department used FRT in conjunction with a social media monitoring service to arrest protesters in the crowd who were identified as having outstanding warrants.[88] The ease with which the government may remove the anonymity of a group’s members, without consent, could chill First Amendment protected activities.[89]

A 2019 study conducted by the Pew Research Center revealed “that a majority of Americans (56%) trust law enforcement agencies to use [FRT] responsibly.”[90] However, several groups–particularly African Americans, younger Americans, and Democrats–expressed low levels of such trust in law enforcement agencies.[91] The same study revealed that between a third and over half of Americans find it unacceptable for FRT to be used by landlords to track who enters or leaves their apartment buildings, by advertising companies to gauge how people respond to public advertisements, or by employers to monitor their employees’ attendance.[92] As this study shows, the fear that organizations other than law enforcement agencies will misuse this data is very real.[93]

IV. How Will the Courts Interpret Privacy Interests in Light of Facial Recognition Technology?

The U.S. Supreme Court has yet to hear a case regarding the use of FRT, and it is unclear how the Court will apply the Fourth Amendment and the ruling in Carpenter v. United States to a case challenging FRT.[94] The Fourth Amendment protects “[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”[95] The majority opinion in Carpenter, written by Chief Justice Roberts, discusses the evolution of Fourth Amendment jurisprudence, beginning with the early cases in which courts took an exclusively property-based approach and leading to the modern understanding that the Fourth Amendment “protects people, not places.”[96] Roberts emphasizes that an individual does not relinquish all constitutional protection by stepping out in public.[97]

Although the Supreme Court has not directly ruled upon the constitutionality of law enforcement’s use of FRT,[98] the Supreme Court’s ruling in Carpenter may offer some guidance as to how the Court will handle other forms of surveillance technology.[99] The technology at issue in Carpenter is cell-site location information (CSLI).[100] CSLI refers to a time-stamped record that is created each time a cell phone connects to a nearby cell site.[101] For example, CSLI is created each time a phone is turned on, sends or receives a text message, or receives a phone call.[102] Most smart phones connect to a cell site several times a minute whenever their signal is on, regardless of whether the phone is being used.[103] The more cell sites within a geographic area, the more precise the CSLI, and the easier it is to pin down a cell phone user’s location.[104]

Before Carpenter, under the Stored Communications Act, law enforcement could order wireless carriers to produce the CSLI associated with the suspect of a crime by merely showing “that the cell-site evidence might be pertinent to an ongoing investigation.”[105] This standard is significantly lower than the probable cause required for a typical warrant.[106] In Carpenter, the FBI obtained 12,898 location points cataloging a robbery suspect’s movements over 127 days.[107] The suspect moved to suppress the data, arguing that the Government’s acquisition of the data was unconstitutional because it was a search within the meaning of the Fourth Amendment, which requires a warrant supported by probable cause.[108]

In Carpenter, the Supreme Court held that an individual maintains a legitimate expectation of privacy, for Fourth Amendment purposes, “in the record of his physical movements as captured through CSLI.”[109] The ruling in Carpenter requires that police obtain a warrant supported by probable cause to access CSLI from a wireless carrier, unless a case-specific exception to the warrant requirement applies, such as exigent circumstances.[110] Roberts reasoned that “a cell phone—almost a ‘feature of human anatomy’—tracks nearly exactly the movements of its owner,” and therefore presents heightened risks to privacy.[111] A person’s face is most certainly a feature of human anatomy, and unlike cell phones, a face cannot be powered off or left behind. If the Supreme Court found that police acquisition of cell phone location data from third parties constituted a search, it seems plausible that the Court would find that police acquisition of facial recognition data from third parties would likewise constitute a search.[112]

Supporters of FRT may argue that the acquisition of facial recognition data from third parties is not a violation of privacy because under the third-party doctrine, the government is free to access, without a warrant, information that an individual voluntarily provided to a third party.[113] In Carpenter, the Court declined to extend the third-party doctrine to cell site records, on the grounds that in 1979, when the doctrine was established, “few could have imagined a society in which a phone goes wherever its owner goes, conveying to the wireless carrier not just dialed digits, but a detailed and comprehensive record of the person’s movements.”[114] The Court also refused to apply the voluntary exposure rationale of the third-party doctrine, reasoning that CSLI is not truly “shared” because “carrying [a cell phone] is indispensable to participation in modern society” and CSLI is generated by virtually every action carried out on a cell phone.[115]

CSLI and FRT are both valuable investigatory tools, but both technologies have the potential to invade the privacy of Americans, and therefore must be regulated. Just as law enforcement is required to obtain a warrant before collecting cell phone location information, law enforcement should be required to obtain a warrant before using FRT to conduct public surveillance of an individual.

Quoting the Supreme Court’s ruling in Camara v. Municipal Court of City and County of San Francisco, 387 U.S. 523 (1967), Chief Justice Roberts reminds the reader that the purpose of the Fourth Amendment “is to safeguard the privacy and security of individuals against arbitrary invasions by governmental officials.”[116] Unfortunately, the holding in Carpenter is a narrow one.[117] As Chief Justice Roberts explains, the decision does not apply to real-time location tracking, nor does it “call into question conventional surveillance techniques and tools, such as security cameras,” or “business records that might incidentally reveal location information.”[118] This narrow ruling is unfortunate because it limits the extension of Carpenter to other activities that raise privacy concerns, namely the government’s ability to obtain video footage from a camera mounted by a third party, such as a retail store.[119] Nevertheless, despite the Court’s apprehension, Chief Justice Roberts notes that as technological innovations enhance the government’s ability to intrude on constitutionally protected areas, the courts must interpret the Fourth Amendment in a more nuanced, as opposed to mechanical, fashion when deciding what constitutes a search for Fourth Amendment purposes.[120]

Given the rapid pace at which FRT is evolving, the courts, alone, cannot be relied on to address the privacy concerns posed by FRT.[121] Federal, state, and local policymakers must take steps to constrain the use of FRT.[122]

V. Legislation Regulating Facial Recognition Technology: The Enacted, Pending, and Absent

The growing concern surrounding the privacy implications of FRT in certain contexts has driven some cities and states to place limits on its use.[123] For example, “San Francisco and Oakland, California, Brookline, Cambridge, Northampton and Somerville, Massachusetts have all banned the use of [FRT] by city agencies.”[124] However, these prohibitions do not speak for all jurisdictions, as law enforcement agencies in several North Texas cities have increased their use of FRT despite the growing recognition of the need for privacy protections against unregulated FRT use.[125] Some cities, like Detroit, fall somewhere in the middle, permitting the use of FRT only in certain circumstances, such as in connection with the investigation of violent crimes.[126] At the state level, California, New Hampshire, and Oregon have banned the use of FRT and other biometric tracking technology in body cameras worn by law enforcement.[127] As of January 17, 2020, ten states had introduced bills to regulate, ban, or study FRT.[128] Although several state and local governments have placed restrictions on the use of FRT, there remains no unified federal law, regulation, or oversight.[129]

With no federal regulations currently in place, commercial and governmental entities are essentially free to use FRT as they please.[130] It is not expected that the federal government will enact any such regulation any time soon, as “Congress has so far been unable to pass even a basic federal online privacy law.”[131] However, members of the House Committee on Oversight and Reform have been working since at least the beginning of 2019 to enact legislation that will “pause” the advancement of FRT, to give “Congress and federal regulators [time to understand] how the technology is being used now and put guardrails in place for its use in the future.”[132]

According to Susan Crawford, a professor at Harvard Law School, action at the federal level is unlikely to happen any time soon, and if hundreds of cities across the country enact their own unique restrictions, tech companies will struggle to remain compliant.[133] Crawford speculates that as the patchwork of local laws grows, compliance will become too onerous and push “both companies and [the] government to reach a much-needed, national consensus on the use of biometric data.”[134]

When the time comes to create unified restrictions on the use of FRT, lawmakers and the courts should look to the ruling in Carpenter and the Illinois Biometric Information Privacy Act of 2008 (BIPA) for guidance. BIPA provides a framework for regulating the use of FRT in the private sector, and the ruling in Carpenter provides a framework for regulating the government’s use of FRT. Under BIPA, private entities who wish to collect or store facial recognition data must (1) provide written notice to individuals that the collection will occur; (2) indicate the purpose of the collection; (3) describe the length of time the data is to be collected, stored, and used; and (4) receive informed written consent prior to collecting the data or sharing it with third parties.[135] Although the BIPA requirements may be too burdensome in the federal context, imposing a general notice-and-consent requirement will force private entities to collect, use, and store facial recognition data responsibly.

Understanding that the BIPA requirements may curtail certain governmental applications of FRT that are beneficial to society, lawmakers should look to Carpenter when determining how to regulate the government’s use of FRT. The proposed legislation should require government entities to obtain a probable cause warrant prior to using FRT for ongoing surveillance of an individual or for some other authorized investigative use. In addition, the warrant should specify the date on which the court order expires. These proposed guidelines not only protect fundamental Fourth Amendment privacy rights but also the government’s right to use FRT for public safety reasons.

VI. Is America Unknowingly Following in China’s Footsteps?

Unregulated use of FRT seems incompatible with American values, yet many cities and states have not created any serious restrictions on facial recognition systems.[136] As FRT creeps into more and more law enforcement agencies with little notice or oversight, America grows closer to possessing a pervasive surveillance system similar to that deployed in China.[137] In China, cameras equipped with FRT are ubiquitous.[138] A report released by industry researcher IHS Markit states that by the end of 2021, over one billion cameras around the world will be used for surveillance, and over half will be located in China.[139] Surveillance cameras in China are able to track and quickly identify individuals over an enormous geographic area.[140] The use of FRT has become so extensive in China that “[r]estrooms at some tourist attractions even require a facial scan in order to receive toilet paper to curb over-consumption.”[141] Moreover, one Chinese company is reported to have developed a system for identifying individuals wearing a surgical mask, which includes most Chinese citizens in the wake of COVID-19, the disease caused by the novel coronavirus.[142] From an outsider’s perspective, China appears to have become a dystopia, constantly monitoring its citizens’ moral behavior in a fashion strikingly similar to that seen in George Orwell’s Nineteen Eighty-Four.[143] However, the focused attention on China’s use of FRT may be masking the pervasive use of FRT in the United States.[144] According to IHS Markit analyst Oliver Philippou, “the US [is] nearly on par with China in terms of camera penetration, [and] future debate over mass surveillance is likely to concern America as much as China.”[145]

VII. Conclusion

The characterization of FRT as a double-edged sword explains in part why the technology remains largely unregulated. Lawmakers and the courts face the difficult task of balancing Fourth Amendment privacy rights with the government’s need to detect and prevent criminal activity. Despite the difficulties that lie ahead, lawmakers and the courts must act soon, because although Carpenter imposed a warrant requirement for cell phone tracking, no such limitation exists for FRT.

Since the rules for electronic location tracking established by the Court in Carpenter do not apply to FRT, law enforcement will opt to use FRT, instead of CSLI, to bypass the warrant requirement.[146] Accordingly, the only thing seriously preventing the American government’s location tracking from reaching the level of that employed by China is the “relatively lower number of cameras continuously recording the public.”[147] Therefore, it is critical to recognize that newer technologies, like FRT, provide the same capacity for monitoring location that cell phones do, and that legal standards restricting electronic location tracking should be preserved.[148]

Moreover, walking in public spaces is an indispensable part of modern life; therefore, in order to participate in normal daily life, people are left with no choice but to risk subjecting themselves to the “inescapable and automatic” collection of facial recognition data.[149] For that reason, the holding in Carpenter should be extended to the use of FRT for the purpose of large-scale surveillance. Unfortunately, until such a case is decided, or federal legislation is passed, the virtually unrestricted use of FRT will persist.


  1. * J.D. Candidate, Tulane University School of Law, 2021.
  2. Brad Smith, Facial Recognition Technology: The Need for Public Regulation and Corporate Responsibility, Microsoft (July 13, 2018), http://blogs.microsoft.com/on-the-issues/2018/07/13/facial-recognition-technology-the-need-for-public-regulation-and-corporate-responsibility/.
  3. See id.
  4. Steve Symanovich, How Does Facial Recognition Work?, Norton http://us.norton.com/internetsecurity-iot-how-facial-recognition-software-works.html (last visited Mar. 16, 2020).
  5. Symanovich, supra note 4.
  6. Steve Knopper, Why Taylor Swift Is Using Facial Recognition at Concerts, Rolling Stone (Dec. 13, 2018, 11:24 AM), http://www.rollingstone.com/music/music-news/taylor-swift-facial-recognition-concerts-768741.
  7. Clare Garvie et al., The Perpetual Line-Up: Unregulated Facial Recognition in America, Geo. L. Ctr. On Privacy & Tech. (Oct. 18, 2016), http://www.perpetuallineup.org.
  8. Abdullah Hasan, 2019 Proved We Can Stop Face Recognition Surveillance, ACLU (Jan. 17, 2020), http://www.aclu.org/news/privacy-technology/2019-was-the-year-we-proved-face-recognition-surveillance-isnt-inevitable/.
  9. Facial Recognition Market Worth $7.0 Billion by 2024 – Exclusive Report by MarketsandMarkets™, Cision: PR Newswire (June 27, 2019), http://www.prnewswire.com/news-releases/facial-recognition-market-worth-7-0-billion-by-2024–exclusive-report-by-marketsandmarkets-300876154.html.
  10. Symanovich, supra note 4.
  11. Allie Funk, I Opted Out of Facial Recognition at the Airport—It Wasn’t Easy, Wired (July 2, 2019, 9:00 AM), http://www.wired.com/story/opt-out-of-facial-recognition-at-the-airport/.
  12. Sharon Nakar & Dov Greenbaum, Now You See Me. Now You Still Do: Facial Recognition Technology and the Growing Lack of Privacy, 23 B.U. J. Sci. & Tech. L. 88, 91 (2017).
  13. Symanovich, supra note 4.
  14. Facial Recognition Technology Warrant Act of 2019, S.2878, 116th Cong. (2019) (“Currently, government agencies can use facial recognition technology to surveil a person without any unified federal law, regulation, or oversight.”).
  15. See Clare Garvie et al., The Perpetual Line-Up: Unregulated Facial Recognition in America-Risk Framework, Geo. L. Ctr. on Privacy & Tech. (Oct. 18, 2016), http://www.perpetuallineup.org/risk-framework#footnote29_xbi6f92.
  16. Carpenter v. United States, 138 S. Ct. 2206, 2222 (2018).
  17. Id. at 2217.
  18. See Memorandum from Majority Staff on Hearing on “Facial Recognition Technology (Part 1): Its Impact on our Civil Rights and Liberties” to be heard before the H. Comm. on Oversight and Reform, 116th Cong. (2019).
  19. Kevin Bonsor & Ryan Johnson, How Facial Recognition Systems Work, How Stuff Works, http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/facial-recognition.html (last visited Feb. 16, 2020).
  20. See id.; Brian Newlin, A Closer Look at Facial Recognition Technology, ClickOnDetroit (Oct. 4, 2019), http://www.clickondetroit.com/2019/10/04/a-closer-look-at-facial-recognition-technology-2/.
  21. Bonsor & Johnson, supra note 19.
  22. Space and Naval Warfare Systems Center, System Assessment and Validation for Emergency Responders Tech Note: Three-Dimensional Facial Recognition, U.S. Dep’t of Homeland Security (May 2008), http://www.dhs.gov/sites/default/files/FacialRecognition-TN_0508-508.pdf.
  23. Bonsor & Johnson, supra note 19.
  24. Andrew Heinzman, How Does Facial Recognition Work?, How-To Geek (July 11, 2019, 6:40 AM), http://www.howtogeek.com/427897/how-does-facial-recognition-work/; The Complete Guide to Facial Recognition Technology, Panda Security (Oct. 11, 2019), http://www.pandasecurity.com/mediacenter/panda-security/facial-recognition-technology/.
  25. Bonsor & Johnson, supra note 19; Jesse Davis West, A Brief History of Facial Recognition, FaceFirst (Aug. 1, 2017), http://www.facefirst.com/blog/brief-history-of-face-recognition-software/.
  26. Space and Naval Warfare Systems Center, supra note 22.
  27. Id.; Bonsor & Johnson, supra note 19.
  28. Song Zhou & Sheng Xiao, 3D Face Recognition: A Survey, 8:35 Human-centric Computing & Info. Sci. 1, 6 (2018).
  29. Bonsor & Johnson, supra note 19.
  30. Bill Mann, How Does Facial Recognition Technology Work? – 5 Real World Use Cases, Blokt (Aug. 19, 2019), http://blokt.com/guides/facial-recognition.
  31. Clare Garvie et al., The Perpetual Line-Up: Unregulated Facial Recognition in America – Background, Geo. L. Ctr. on Privacy & Tech. (Oct. 18, 2016), http://www.perpetuallineup.org/background#footnoteref17_re3c47a.
  32. Panda Security, supra note 24.
  33. Bonsor & Johnson, supra note 19.
  34. Id.
  35. Id.; Panda Security, supra note 24.
  36. Bonsor & Johnson, supra note 19.
  37. See Bonsor & Johnson, supra note 19; Garvie et al., supra note 31.
  38. Garvie et al., supra note 7; see also Bonsor & Johnson, supra note 19.
  39. Bonsor & Johnson, supra note 19.
  40. Id.
  41. Id.
  42. Id.; see also Mara Calvello, Facing the Reality of Facial Recognition: The Good and the Bad, G2 (Oct. 15, 2019), http://learn.g2.com/facial-recognition.
  43. U.S. Gov’t Accountability Office, GAO-15-621, Facial Recognition Technology: Commercial Uses, Privacy Issues, and Applicable Federal Law (2015), http://www.gao.gov/assets/680/671764.pdf.
  44. Bonsor & Johnson, supra note 19.
  45. Bill Siuru, Is Facial Recognition Technology Ready for Prime Time?, Police & Sec. News (Sept. 18, 2019), http://policeandsecuritynews.com/2019/09/18/is-facial-recognition-technology-ready-for-prime-time/.
  46. Identity Matters: Facial Recognition in 2019, Gemalto, http://www.gemalto.com/review/facialrecognition/index.aspx (last visited Mar. 22, 2020).
  47. See Garvie et al., supra note 31.
  48. See Calvello, supra note 42.
  49. Garvie et al., supra note 7.
  50. The photo or video still of a suspect’s face may be obtained from a security camera, smartphone, social media post, or even from an officer who clandestinely photographed the suspect. Garvie et al., supra note 7.
  51. Id.
  52. Id.
  53. Id.
  54. Id.
  55. Bonsor & Johnson, supra note 19.
  56. Id.
  57. Id.
  58. James Vincent, Facial Recognition and AI Could Be Used to Identify Rare Genetic Disorders, Verge (Jan. 15, 2019, 2:11 PM), http://www.theverge.com/2019/1/15/18183779/facial-recognition-ai-algorithms-detect-rare-genetic-disorder-fdna.
  59. See Alfred Ng, With Facial Recognition, Shoplifting May Get You Banned in Places You’ve Never Been, CNET (Mar. 20, 2019, 8:11 AM), http://www.cnet.com/news/with-facial-recognition-shoplifting-may-get-you-banned-in-places-youve-never-been/.
  60. Id.
  61. Id.
  62. Calvello, supra note 42; Brandon Vigliarolo, Apple’s Face ID: Cheat Sheet, TechRepublic (June 11, 2020, 7:43 AM), http://www.techrepublic.com/article/apples-face-id-everything-iphone-x-users-need-to-know/.
  63. Id.
  64. See Smith, supra note 2.
  65. Id.
  66. Id.
  67. Id.
  68. Id.
  69. Id.
  70. See Siuru, supra note 45.
  71. Id.
  72. Id.
  73. Bonsor & Johnson, supra note 19.
  74. B.S. Sruthy & M. Jayasree, Recognizing Surgically Altered Face Images and 3D Facial Expression Recognition, 24 Procedia Tech. 1300, 1301 (2016).
  75. Richa Singh et al., Plastic Surgery: A New Dimension to Face Recognition, 5:3 IEEE Transactions on Information Forensics and Security 441-48 (2010).
  76. Christian Rathgeb et al., Impact and Detection of Facial Beautification in Face Recognition: An Overview, 7 IEEE Access, 152667 (2019).
  77. Siuru, supra note 45.
  78. See id.
  79. See Nicole Martin, The Major Concerns Around Facial Recognition Technology, Forbes (Sept. 25, 2019, 3:15 PM), http://www.forbes.com/sites/nicolemartin1/2019/09/25/the-major-concerns-around-facial-recognition-technology/#47fd01914fe3; Panda Security, supra note 24.
  80. Larry Hardesty, Study Finds Gender and Skin-Type Bias in Commercial Artificial-Intelligence Systems, MIT News (Feb. 11, 2018), http://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212.
  81. Id.
  82. Id.
  83. Id.
  84. Jacob Snow, Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots, ACLU (July 26, 2018, 8:00 AM), http://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28.
  85. Id.
  86. The Constitution Project’s Task Force on Facial Recognition Surveillance & Jake Laperruque, Facing the Future of Surveillance, POGO (Mar. 4, 2019), http://www.pogo.org/report/2019/03/facing-the-future-of-surveillance/.
  87. Id.
  88. Id.
  89. Id.
  90. Aaron Smith, More Than Half of U.S. Adults Trust Law Enforcement to Use Facial Recognition Responsibly, Pew Res. Ctr. (Sept. 5, 2019), http://www.pewresearch.org/internet/2019/09/05/more-than-half-of-u-s-adults-trust-law-enforcement-to-use-facial-recognition-responsibly/.
  91. Id.
  92. Id.
  93. See id.
  94. See Garvie et al., supra note 15.
  95. U.S. Const. amend. IV.
  96. Carpenter v. United States, 138 S. Ct. 2206, 2237 (2018) (quoting Katz v. United States, 389 U.S. 347, 351 (1967)).
  97. Id. at 2217 (quoting Katz v. United States, 389 U.S. 347, 351-352 (1967)).
  98. Facial Recognition Technology (Part 1): Its Impact on our Civil Rights and Liberties, Before the H. Comm. on Oversight and Reform, 116th Cong. (2019), https://docs.house.gov/Committee/Calendar/ByEvent.aspx?EventID=109521.
  99. Robyn Greene & Michael Pizzi, The Supreme Court Made a Sweeping Decision About Privacy Rights, New Am. (July 26, 2018), http://www.newamerica.org/weekly/edition-213/supreme-court-made-sweeping-decision-about-privacy-rights/.
  100. Carpenter, 138 S. Ct. at 2211.
  101. Id.
  102. Sabrina McCubbin, Summary: The Supreme Court Rules in Carpenter v. United States, Lawfare (June 22, 2018, 2:05 PM), http://www.lawfareblog.com/summary-supreme-court-rules-carpenter-v-united-states.
  103. Carpenter, 138 S. Ct. at 2211.
  104. See id.; McCubbin, supra note 102.
  105. Carpenter, 138 S. Ct. at 2221.
  106. Id.
  107. Id. at 2212.
  108. Id. at 2213.
  109. Id. at 2217.
  110. Id. at 2222.
  111. Id. at 2218 (citation omitted).
  112. Memorandum from Majority Staff on Hearing on “Facial Recognition Technology (Part 1): Its Impact on our Civil Rights and Liberties” to be heard before the H. Comm. on Oversight and Reform, 116th Cong. (2019), http://docs.house.gov/meetings/GO/GO00/20190522/109521/HHRG-116-GO00-20190522-SD002.pdf.
  113. See Smith v. Maryland, 442 U.S. 735, 742-44 (1979) (holding that “a person has no legitimate expectation of privacy in information he voluntarily turns over to third parties;” therefore, the government’s acquisition of such information does not constitute a search under the Fourth Amendment).
  114. Carpenter, 138 S. Ct. at 2217.
  115. Id. at 2220.
  116. Id. at 2213 (quoting Camara v. Mun. Court of City & Cty. of San Francisco, 387 U.S. 523, 528 (1967)).
  117. See id. at 2221.
  118. Id. at 2220.
  119. See Shea Denning, Pole Cameras After Carpenter, UNC Sch. of Govt. (July 31, 2019, 6:05 PM), http://nccriminallaw.sog.unc.edu/pole-cameras-after-carpenter/; see, e.g., United States v. Kay, No. 17-CR-16, 2018 WL 3995902, *1 (E.D. Wis. Aug. 21, 2018) (holding that the investigators’ warrantless use of pole camera footage was not a violation of Fourth Amendment rights).
  120. See Carpenter, 138 S. Ct. at 2214.
  121. See Greene & Pizzi, supra note 99.
  122. See Susan Crawford, Facial Recognition Laws Are (Literally) All Over the Map, Wired (Dec. 16, 2019, 8:00 AM), http://www.wired.com/story/facial-recognition-laws-are-literally-all-over-the-map/.
  123. Benjamin Hodges & Kelly Mennemeier, The Varying Laws Governing Facial Recognition Technology, IP Watchdog (Jan. 28, 2020), http://www.ipwatchdog.com/2020/01/28/varying-laws-governing-facial-recognition-technology/id=118240/.
  124. Id.
  125. Brian New, Facial Recognition Use by North Texas Police Grows Along with Privacy Concerns, CBS DFW (Feb. 4, 2019, 6:30 PM) http://dfw.cbslocal.com/2019/02/04/facial-recognition-texas-police-grows-privacy-concerns/.
  126. Hodges & Mennemeier, supra note 123.
  127. Crawford, supra note 122.
  128. Georgetown Law Center on Privacy and Technology (@GeorgetownCPT), Twitter (Jan. 17, 2020, 11:25 AM), http://twitter.com/GeorgetownCPT/status/1218222879097049088.
  129. Facial Recognition Technology Warrant Act of 2019, S.2878, 116th Cong. (2019).
  130. Memorandum from Majority Staff on Hearing on “Facial Recognition Technology (Part 1): Its Impact on our Civil Rights and Liberties” to be heard before the H. Comm. on Oversight and Reform, 116th Cong. (2019).
  131. Crawford, supra note 122.
  132. Aaron Boyd, Lawmakers Working on Legislation to ‘Pause’ Use of Facial Recognition Technology, Nextgov (Jan. 15, 2020), http://www.nextgov.com/emerging-tech/2020/01/lawmakers-working-legislation-pause-use-facial-recognition-technology/162470/.
  133. Crawford, supra note 122.
  134. Id.
  135. 740 Ill. Comp. Stat. 14/1 (2008).
  136. See Crawford, supra note 122.
  137. See The Constitution Project’s Task Force on Facial Recognition Surveillance & Laperruque, supra note 86.
  138. Charlie Campbell, ‘The Entire System Is Designed to Suppress Us.’ What the Chinese Surveillance State Means for the Rest of the World, TIME (Nov. 21, 2019), http://time.com/5735411/china-surveillance-privacy-issues/.
  139. Liza Lin & Newley Purnell, A World With a Billion Cameras Watching You Is Just Around the Corner, Wall St. J. (Dec. 6, 2019, 1:00 AM), http://www.wsj.com/articles/a-billion-surveillance-cameras-forecast-to-be-watching-within-two-years-11575565402?mod=hp_listb_pos1.
  140. The Constitution Project’s Task Force on Facial Recognition Surveillance & Laperruque, supra note 86.
  141. Kelly Wang, China Facial-Recognition Case Puts Big Brother on Trial, Tech Xplore (Jan. 8, 2020), http://techxplore.com/news/2020-01-china-facial-recognition-case-big-brother.html.
  142. Martin Pollard, Even Mask-Wearers Can Be ID’d, China Facial Recognition Firm Says, Reuters (Mar. 9, 2020, 3:40 AM), http://www.reuters.com/article/us-health-coronavirus-facial-recognition/even-mask-wearers-can-be-idd-china-facial-recognition-firm-says-idUSKBN20W0WL.
  143. Ryan Smith, Destination Dystopia: Facial Recognition Payments Already a Thing in China, CCN (June 30, 2019, 1:35 PM), http://www.ccn.com/destination-dystopia-facial-recognition-payments-Already-a-thing-in-china/.
  144. See Thomas Ricker, The US, Like China, Has About One Surveillance Camera for Every Four People, Says Report, Verge (Dec. 9, 2019, 10:48 AM), http://www.theverge.com/2019/12/9/21002515/surveillance-cameras-globally-us-china-amount-citizens.
  145. Id.
  146. The Constitution Project’s Task Force on Facial Recognition Surveillance & Laperruque, supra note 86.
  147. Id.
  148. Id.
  149. Carpenter, 138 S. Ct. at 2223.