
Understanding the Basics of Facial Recognition Tech: How Does it Affect Us?

Updated: Dec 30, 2020

Akansh Garg,

Editorial Intern,

Indian Society of Artificial Intelligence and Law.


 

The law must keep pace with new and emerging technologies. Facial recognition technology allows remote, contactless data processing, even without a person's knowledge. In the current digital environment, where people's faces are available across multiple databases and captured by numerous cameras, facial recognition has the potential to become a particularly ubiquitous and intrusive tool. The increased surveillance it enables may ultimately reduce the level of anonymity afforded to citizens in the public space.[1]

What is Facial Recognition Technology?

Facial Recognition Technology (FRT) identifies an individual by interpreting his or her geometric facial characteristics and algorithmically comparing the characteristics derived from a captured image with those already stored. Identification/recognition is only one stage of the process: images (or recordings) must first be captured as data and processed in the computer system, and the data must ultimately be deleted.

It is documented that face recognition was used as far back as the 1960s, with the United States Defense Advanced Research Projects Agency developing a simple face database in the early 1990s. In 2010, Facebook introduced an automated 'tagging' feature on its network, which suggested names for the faces it detected in photographs. In 2017, Apple's iPhone X became the first iPhone that could be unlocked using 'Face ID'.

Let us now look at the sequence of technical operations an FRT system performs in order to recognise an individual face or identity:[2]

· Collection/acquisition of images,

· Face detection,

· Normalisation,

· Feature extraction,

· Storage of raw data and features (face templates),

· Comparison,

· Use for primary purpose (e.g., identification of a wanted person),

· Potential reuse for other purposes,

· Potential disclosure,

· Deletion of raw data and/or features (face templates).

The software takes digital images (for instance, images captured by a camera or stored in an image database) and performs mathematical operations to detect the faces of individuals. The detected face data are then normalised (scaled, rotated, aligned and so on) so that facial features can be identified consistently. The FRT algorithm extracts, from the normalised face pictures, features that identify a particular person. These features are stored and compared (or matched) against features previously collected in the system's database. What happens next depends on the use scenario: for instance, if a match is identified, the system may signal the match to the operator or carry out other (or additional) automated tasks. Critical issues for further legal review can lie beyond the comparison (or recognition) operation itself: where raw data are obtained, what happens to those data, whether they are preserved or deleted, and how they may be used and possibly reused, can all be important. A minimal code sketch of this pipeline follows.
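Purely by way of illustration, here is a minimal sketch of these stages in Python, using the open-source face_recognition library (a wrapper around dlib). The file names and the tolerance value are illustrative assumptions, not references to any real deployment:

```python
import face_recognition

# 1. Collection/acquisition: load a captured image as raw pixel data.
image = face_recognition.load_image_file("captured_frame.jpg")  # hypothetical file

# 2. Face detection: find the bounding boxes of faces in the image.
locations = face_recognition.face_locations(image)

# 3-4. Normalisation and feature extraction: each detected face is
# aligned and encoded as a 128-dimensional vector (the "face template").
templates = face_recognition.face_encodings(image, locations)

# 5. Storage: a real system would persist these templates (and perhaps
# the raw image) in a database, subject to retention rules. Here we
# simply load one previously enrolled template for comparison.
enrolled = face_recognition.face_encodings(
    face_recognition.load_image_file("enrolled_photo.jpg")  # hypothetical file
)[0]

# 6. Comparison: the tolerance (0.6 is the library default) controls
# how strict a "match" is; lower values mean fewer false matches.
for template in templates:
    is_match = face_recognition.compare_faces([enrolled], template, tolerance=0.6)[0]
    print("match" if is_match else "no match")
```

The later stages in the list above (use, reuse, disclosure and deletion) are deliberately absent: they are policy decisions, not library calls, and nothing in code like this determines what happens to the raw images or templates afterwards.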

After understanding how FRT software works, let us shed some light on the various uses of this technology.

USES OF FRT

FRT is used in three principal categories (a short code sketch after this list illustrates the first two):

  1. Verification [one-to-one comparison] - This involves the comparison of two biometric templates to verify a person's identity. The SmartGate system used at airports is a good example of this use. Related border-control applications include identifying and checking people under investigation at the border, and recording the identity of deportees so that they cannot re-enter the country under another identity.

  2. Identification [one-to-many comparison] - This involves the comparison of an individual's biometric template against the many templates stored in a database. An example of this is the automated FRT system used by police forces, which can extract facial images from video footage and compare them against a 'watchlist'.

  3. Categorisation - FRT may also be used to extract information about particular characteristics of a person, such as race, sex and ethnicity. This is also known as 'face analysis'. Such analysis could predict or profile a person based on their facial image. It does not specifically identify a person, but if characteristics are inferred from a facial image and potentially linked to other data (e.g., location data), it could de facto enable the identification of an individual.
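As a rough illustration of the difference between verification and identification, the sketch below assumes that face templates have already been extracted as numeric feature vectors; the Euclidean distance measure and the threshold value are simplifying assumptions, not any particular vendor's method:

```python
import numpy as np

THRESHOLD = 0.6  # illustrative cut-off for declaring a "match"

def verify(probe: np.ndarray, claimed: np.ndarray) -> bool:
    """Verification (one-to-one): does the probe match the claimed identity?"""
    return float(np.linalg.norm(probe - claimed)) < THRESHOLD

def identify(probe: np.ndarray, watchlist: dict) -> str | None:
    """Identification (one-to-many): who on the watchlist, if anyone, matches?"""
    distances = {name: float(np.linalg.norm(probe - t)) for name, t in watchlist.items()}
    best = min(distances, key=distances.get)
    return best if distances[best] < THRESHOLD else None
```

Verification answers a yes/no question about a single claimed identity, whereas identification searches an entire database; other things being equal, the larger the database, the greater the chance of a false match at any given threshold.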

Some of the other uses of FRT are:

  • It is used by police forces in different countries to identify and apprehend wrongdoers.

  • Facebook uses FRT to suggest ‘tags’ of people in photos.

  • FRT may be used to find missing children. Nearly 3,000 missing children were identified during a trial of such an app in New Delhi.[3]

  • Use of FRT to speed up border control procedures has been recommended by the Tourism Export Council.

  • FRT may serve a number of functions in the banking, finance and anti-money-laundering sector, mainly in identity verification.

  • FRT is also used in anti-money-laundering efforts in the UK. Smart Search, a company that provides anti-money-laundering services in the UK, introduced FRT in 2020 to help customers provide visual confirmation of their ID.[4] This is thought to be particularly useful during the Covid-19 pandemic, as the process can be carried out remotely.

  • Businesses are also employing FRT for security and surveillance purposes. In May 2018, a man was taken aside by staff at a New World supermarket in New Zealand after he was mistakenly identified as a shoplifter.

  • FRT can be used in several contexts in customer loyalty and tracking in the retail environment. In the United States, fast food chains have self-service ordering kiosks: a customer can register with a loyalty programme, and when they next enter the store and walk towards a kiosk they are recognised using FRT: “food orders from previous visits are remembered and easily selected again or quickly modified”.

  • Churches in various countries around the world are using FRT to track the attendance of their members. Educational institutions, such as schools and universities, also utilise FRT to track attendance and monitor students. In China, the technology has been used to catch students cheating in high school exams.

  • FRT may be used for authentication or verification purposes, such as entry to secured places (e.g., military bases, border crossings, nuclear power plants) or access to restricted resources, including medical records.

  • FRT might be used in back-end verification systems to uncover duplicate applications for things such as benefits that require other forms of identification.

  • FRT is used to combat crimes against children in several ways. In North America, a non-profit organisation uses FRT to identify and prevent child pornography and sex trafficking.[5] The technology can compare images of missing children with advertisements for sexual services, identifying any matches and alerting authorities.

  • Casinos were one of the earliest adopters and most widespread users of FRT. Casinos can use FRT for security purposes, identifying cheaters or advantage players when they arrive on the premises and alerting casino staff.[6]

  • FRT may be used to monitor the movement in public of a known set of individuals (such as positive cases subject to a quarantine order in the COVID-19 context) by matching unknown individuals to a 'watchlist'.

Let us now analyse the threats FRT poses to human rights and consequently what the appropriate parameters of its use may be.

What Human Rights may be Impacted by FRT?

Human rights are the basic rights and freedoms that all people are entitled to. A person’s human rights arise from a mixture of international and national sources. The impact of technology, artificial intelligence and data-driven decision-making is a fast-evolving area of human rights analysis.

Some of the principal areas of human rights that may be affected by the use of FRT are as follows:

  1. Freedom of thought, conscience and religion (e.g., where facial recognition systems are used to monitor protests);

  2. Freedom of expression (e.g., where facial recognition systems are used to monitor protests);

  3. Freedom of assembly and association (e.g., where facial recognition systems are used to monitor protests);

  4. Freedom of movement (e.g., where facial recognition systems are used in border control);

  5. Freedom from discrimination (e.g., where facial recognition systems run on biased algorithms);

  6. Privacy/respect for private life (e.g., where facial recognition equipped cameras are used in public spaces);

  7. Protection of personal information/data (e.g., where facial images are stored by the state);

  8. Right to be free from unreasonable search and seizure (e.g., where facial recognition is used in surveillance by the police);

  9. Minimum standards of criminal procedure (e.g., where a facial recognition match is sought to be introduced as evidence of identity at trial).

After looking at the principal areas of human rights that may be affected by the use of FRT, let us map out some of the threats that FRT might pose to societal interests and the rights of individuals.

This discussion considers issues in the development and deployment of the technology from a fundamental human rights perspective. As has been discussed, the technology is on the rise, and new uses continue to be found for FRT. These developments raise pressing questions concerning the accuracy of the technology, the level of public support it enjoys, and the impact the technology has on individual rights and on society more broadly. These issues form the basis of the discussion of how this technology can, and indeed should, be regulated.

Facial images have been used as identification evidence by police and at trial for many years. These practices span a spectrum, from longstanding investigative and evidential techniques such as showing witnesses 'mugshots' of suspects or defendants, through technological advances such as expert opinion based on image-comparison techniques and 'facial mapping', to today's automated FRT.[7]

Inaccurate FRT matching could have particularly serious repercussions in the context of criminal proceedings. In the course of a criminal investigation, the police may seek to identify individuals in a ‘probe image’.

  • Example 1: Using FRT to verify the identity of an arrestee - A suspect is arrested but refuses to provide his name to police. Police could take a 'probe image' of the individual's face. Facial recognition software could then be used to verify the individual's identity by comparing the probe image against a database of images that the police control, or to which the police have access.

  • Example 2: Using FRT to identify a suspect - CCTV footage shows a suspected burglar leaving a property. A still of the suspect's face is used as a probe image and compared with a database of custody images (commonly known as 'mugshots'). The facial recognition software generates a shortlist of possible matches, and police arrest a suspect based on his place of residence being close to the crime scene and the strength of the FRT 'match' (a sketch of this shortlisting step follows the examples).

  • Example 3: Using FRT as evidence of identity - Following on from Example 2, the suspect is charged but contests that he is not the person in the probe image. At trial, the prosecution presents evidence that the suspect was identified through facial recognition software, which suggested that his stored custody image was a 'likely match' to the probe image taken from a CCTV feed.
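To make the shortlisting step in Example 2 concrete, here is a minimal sketch under assumed conditions: templates are plain NumPy vectors, and the threshold and candidate limit are arbitrary illustrative values rather than the parameters of any real police system:

```python
import numpy as np

def shortlist(probe: np.ndarray, custody_db: dict, threshold: float = 0.6, k: int = 5):
    """Rank custody-image templates by distance to the probe; return up to k candidates."""
    scored = sorted(
        ((name, float(np.linalg.norm(probe - template)))
         for name, template in custody_db.items()),
        key=lambda pair: pair[1],  # smaller distance = stronger "match"
    )
    return [(name, dist) for name, dist in scored[:k] if dist < threshold]
```

The point to note is that such a system returns a ranked list of candidates with scores, not a definitive identification; treating the top candidate as a positive match is a human decision, and it is that decision which carries the evidential risk described above.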

Privacy and Information Rights

Like fingerprint scanning and DNA profiling, FRT involves the processing of biometric information about the individual. The technology allows the police to go further in monitoring and tracing individuals than ordinary observation or CCTV monitoring would. The FRT process 'involves the creation of informational equivalents of body parts that exist outside their owner and are used and controlled by others'. Through this process, the individual loses full ownership of the geometric features of his or her face, as these features acquire new meanings that the individual does not understand and new uses are realised outside his or her own body.

RECOMMENDATION

In this section, we provide general recommendations about regulation and oversight of FRT which are applicable to a range of uses of the technology.

Recommendation 1: Create a new category of personal information for biometric information,

Recommendation 2: Provide individuals with additional control over personal information,

Recommendation 3: Establish a Biometrics Commissioner or other oversight mechanism,

Recommendation 4: Implement high-quality Privacy Impact Assessments,

Recommendation 5: Add enforceability and oversight to the Algorithm Charter,

Recommendation 6: Transparency in use of FRT,

Recommendation 7: Implement a code of practice for biometric information,

Recommendation 8: Information sharing agreements for facial images must be appropriate and transparent,

Recommendation 9: A moratorium on the use of live automated facial recognition (AFR) by Police,

Recommendation 10: Consultation and consideration of legislation,

Recommendation 11: Review of collection and retention of facial images by Police,

Recommendation 12: Threshold before comparison can be made in the Police's image system,

Recommendation 13: Oversight of the Police’s image database,

Recommendation 14: Oversight of emerging technology such as FRT,

Recommendation 15: Regulate surveillance using FRT in public places.

CONCLUSION

The risks of using FRT need to be properly managed. We recommend a set of general and particular requirements that aim to address those risks through necessary regulation and oversight mechanisms. Those mechanisms should also increase public trust.

Public trust is essential for state services, particularly in policing. Our overarching recommendation is for transparency and consultation. Extensive media reporting has shown the level of public concern about the use of such technology. Minority groups and those affected disproportionately must be consulted on potential use and given opportunities to be involved in oversight.

We place the burden firmly on those who want to use FRT, particularly live FRT, to demonstrate not only its utility as a surveillance tool but also a due appreciation of its broader social impact, factoring this into any assessment of use.


References

[1] Commission Nationale de l'Informatique et des Libertés, "Facial Recognition: For a Debate Living Up to the Challenges" (November 2019) www.cnil.fr/en/facial-recognition-debate-living-challenges.
[2] Luana Pascu, "Apple patents potential new Face ID biometrics system, to launch face recognition to iMac" (17 June 2020) Biometric Update www.biometricupdate.com.
[3] Anuradha Nagaraj, "Indian police use facial recognition app to reunite families with lost children" Reuters (online ed, United States, 15 February 2020).
[4] Rozi Jones, "Smart Search launches facial recognition feature" Financial Reporter (online ed, United Kingdom, 5 May 2020).
[5] Tom Simonite, "How Facial Recognition Is Fighting Child Sex Trafficking" Wired (online ed, United States, 19 June 2019).
[6] Sam Kljajic, "Ask the Expert: Casinos, Face Recognition, and COVID-19" (15 April 2020) SAFR www.safr.com.
[7] Ioana Macoveciuc, Carolyn J Rando and Hervé Borrion, "Forensic Gait Analysis and Recognition: Standards of Evidence Admissibility" (2019) 64 J Forensic Sci 1294.


The Indian Learning, e-ISSN: 2582-5631, Volume 1, Issue 2, January 31, 2021.

The Indian Society of Artificial Intelligence and Law is a technology law think tank founded by Abhivardhan in 2018. Our mission as a non-profit industry body for the analytics & AI industry in India is to promote responsible development of artificial intelligence and its standardisation in India.

 

