Technology and our rights

Live facial recognition (LFR) interferes with the right to a private life, but the impact of this technology extends far beyond that right. LFR technology identifies a person in real time using biometric processing, and combining it with other data sources can reveal a great deal about a person’s professional and private life. The public need to know about this technology and their rights: an engaged public is paramount in debates about its use.

In this blog, Dr Daragh Murray highlights some of the human rights concerns associated with the use of LFR technology, both as it is currently used and as it is likely to be used in the future.


The use of live facial recognition (LFR) technology in the UK has been the subject of significant public engagement in recent months. High-profile trials conducted by South Wales Police have been legally challenged, and a colleague, Professor Pete Fussey, and I have produced an independent report on similar trials run by the Metropolitan Police Service. More recently, revelations about the deployment of LFR by private companies in King’s Cross, and elsewhere, have been met with public shock, prompting an investigation by the Information Commissioner’s Office.

LFR technology involves the real time identification of all individuals passing through a camera’s field of vision, by means of biometric processing. The nature of this technology gives rise to an interference with the right to private life, as protected under Article 8 of the Human Rights Act. This is because biometric information is inherently personal, and so is treated as similar to a fingerprint or DNA sample. The right to private life of every individual passing through a camera’s field of vision is therefore engaged, irrespective of whether they are actively sought by the police or are merely an uninvolved passer-by: they are all subject to biometric processing.

The right to private life

The fact that LFR technology gives rise to an interference with the right to private life does not necessarily mean that it violates that right. Interferences with a right can be justified on a number of bases, such as the protection of public order or the protection of the rights of others. Police use of live facial recognition technology clearly has the potential to be useful. For example, the ability to identify a known terrorist travelling on false papers has clear benefits in terms of protecting the public. The question is therefore not ‘is facial recognition technology useful?’ but rather ‘can facial recognition technology be deployed in a human rights compliant manner?’

The legitimacy of any rights interference is determined by a human rights law test: whether the measure (in this case the LFR deployment) is in accordance with the law, pursues a legitimate aim, and is necessary in a democratic society. Broken down to its simplest form, this test is intended to achieve two aims.

  • To protect against arbitrariness by ensuring, for example, that the circumstances surrounding a facial recognition deployment are foreseeable, thereby allowing people to regulate their behaviour and ensuring that the police do not overstep their authority.
  • To ensure that in pursuing a legitimate aim, other rights are not inappropriately interfered with. This balances the competing human rights interests raised by the aims an LFR deployment is intended to achieve, and the broader impacts of that deployment. For instance, the Surveillance Camera Commissioner concluded that police use of facial recognition technology in a Manchester shopping centre was disproportionate, in light of the limited utility of the deployment compared to the very large number of individuals whose right to private life was interfered with.


To date, attention has focused primarily on the impact of biometric processing on the right to private life. This may be a result of the relatively limited nature of current LFR deployments, which are typically limited in duration and focused on identifying individuals already known to the police. The potential inherent in this technology is, however, significantly greater. For example, when connected to other data sources, such as passport or drivers’ licence databases, LFR technology can be used not only to look for specific individuals, but also to identify anyone in an image and match them to their government ID. Integrating LFR technology into a CCTV system also enables individuals’ movements to be tracked across city-wide areas. In turn, this information can be subject to machine learning/artificial intelligence analysis in order to develop a detailed understanding, and record, of a person’s day-to-day activities. This can reveal remarkably detailed information about an individual’s personal and professional life.

A step-change in surveillance

These more advanced uses of LFR technology give rise to more extensive human rights concerns. The right to private life remains relevant, not least because these capabilities undermine the ability to act anonymously in public. In essence, anonymity allows for a process of experimental learning and development, whereby individuals can engage with different ideas, divergent political discourses, or aspects of their sexuality without fear that this information will become public or that consequences will follow. Anonymity is recognised as essential to the development of our personality and identity. Yet LFR technology makes it possible to monitor, track, and analyse individual behaviour, directly threatening that anonymity. Given the potential impact on individual and societal behaviour, this gives rise to a serious interference with the right to private life.

However, the human rights impacts extend far beyond the right to private life. Two examples, both related to increased surveillance and analytical capabilities, illustrate this point.

  1. LFR technology may affect how individuals interact with each other, receive information, and engage with different schools of thought or ways of life. This is intrinsically related to individual development, but it also brings into play rights such as freedom of expression, association, assembly, and religion. It may affect the functioning of a participatory democracy, as a ‘chilling effect’ brought about by surveillance can dissuade individuals from seeking out new ideas or challenging the status quo. What would happen, for example, if LFR were used at protests?

  2. Detailed individual profiles made possible by advanced facial recognition may be used to inform a wide range of decisions relating, for example, to the rights to work, to health, or to social welfare. What will it mean for how people engage with those around them if all of their activities are recorded and used to inform potentially life-changing decisions about them?

A real concern is that people will be afraid of engaging at the fringes of society, and that they will modulate their behaviour towards the mainstream, potentially causing individual and societal development to stagnate.

LFR represents a step-change in surveillance and analytical capabilities. There are clear benefits associated with this technology, but the potential human rights harm is significant, particularly the possibility that participatory democracy may be (inadvertently) undermined. This demands both further research, to understand more fully the implications of using this technology, and genuine public debate. An informed public discussion on LFR is essential. It is only right that the public should decide whether they want to see LFR deployed and, if so, under what circumstances.