By Agnes Leung
Facial recognition has emerged as one of the most widely used artificial intelligence technologies in law enforcement, deployed in the name of national security, public safety and crime prevention. Facial recognition involves the ‘automated extraction, digitisation and comparison of the spatial and geometric distribution of facial features’ – a form of biometric data – which are then compared against facial images held in the user’s databases. If there is a match, the user is alerted.
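In engineering terms, the ‘comparison’ step typically reduces to measuring the similarity between fixed-length numerical vectors (‘embeddings’) extracted from face images. The Python sketch below is purely illustrative – the 128-dimensional random vectors and the 0.6 threshold are assumptions standing in for a real trained model and a tuned operating point – but it shows the basic match-and-alert logic described above.

```python
import numpy as np

# Illustrative placeholders: a real system derives embeddings from a
# trained face-recognition model, not from random vectors.
rng = np.random.default_rng(seed=0)
enrolled = rng.normal(size=128)   # embedding stored in the user's database
captured = rng.normal(size=128)   # embedding extracted from a camera image

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = same direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

THRESHOLD = 0.6  # assumed operating point; real systems tune this on test data
if cosine_similarity(enrolled, captured) >= THRESHOLD:
    print("Match found: alert the user")
```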
Due to its fast-paced, cost-efficient nature, law enforcement agencies around the world have adopted it in a wide range of circumstances. While the adoption of facial recognition might be justified in situations such as border control, that is less clearly so where it is deployed as a tool of government surveillance. For instance, it is reported that the People’s Republic of China uses facial recognition technology to collect evidence of citizens’ behaviour, such as jaywalking, to support a social credit system. In Hong Kong, ‘smart lampposts’ have reportedly been installed to monitor protestors’ actions. This raises concerns about the appropriate balance between the protection of privacy and the public interest, and about the ethics of such applications. As discussed below, other fundamental human rights – the freedoms of speech and assembly, the right to a fair trial and freedom from discrimination – are also highly relevant to the adoption of facial recognition technology.
The right to respect for private and family life (the ‘Right to Privacy’) is guaranteed under Article 8 of the European Convention on Human Rights (the ‘ECHR’). Article 8(2) sets out the conditions under which interference with that right may be lawful, by reference to legitimate aims which include, but are not limited to, ‘national security’, ‘public safety’, the ‘prevention of disorder’, and even the ‘protection of health or morals’ and the ‘economic well-being of the country’. The Court of Appeal in the UK recently decided in R (on the application of Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058 (the ‘Bridges Decision’) that: (i) the authorities’ use of automated facial recognition technology was not ‘in accordance with the law’ for the purposes of Article 8 of the ECHR; (ii) they did not conduct a proper Data Protection Impact Assessment (‘DPIA’); and (iii) they failed to comply with the public sector equality duty (‘PSED’).
This article is structured as follows: (i) an overview of the facts and the decision in Bridges from the perspective of compliance with the ECHR, the Data Protection Act 2018 and the Equality Act; (ii) UK precedents in this area of law; (iii) the impact of the decision; and (iv) recommendations. Although the Bridges decision calls on local authorities to implement clearer regulations on the use of facial recognition technology in the short term, it may well lead to fragmentation of the legal framework. Looking forward, it is crucial to establish nationwide legislation applicable to the various kinds of facial recognition technology, in order to achieve predictability in the protection of the Right to Privacy. Systemic inequality in the criminal justice system must also be eradicated if the Right to Privacy of vulnerable groups is not to be infringed in the long run.
- AFR Locate, Case Summary and Judgment
The present case concerns Mr. Edward Bridges as the claimant and appellant in the Court of Appeal (the ‘Appellant’) and the Chief Constable of South Wales Police as the respondent (‘SWP’). On appeal, the Court considered SWP’s use of a live automated facial recognition technology named ‘AFR Locate’ (‘AFR Locate’) on two occasions on which the Appellant was captured.
AFR Locate deploys CCTV cameras to capture digital images of individuals, processes them and compares them, in real time, with images of people contained in a database. That database is, in effect, a watchlist compiled by SWP (the ‘Watchlist’). If there is a match, SWP is alerted, and police officers then decide whether human intervention – such as a search, warrant or arrest – is necessary. If there is no match, AFR Locate deletes the facial biometrics and images automatically and instantly.
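The judgment’s description of this workflow can be captured in a short schematic. The sketch below is not SWP’s actual software – the embedding format, similarity measure and threshold are all assumed for illustration – but it mirrors the loop described above: compare each detection against the Watchlist, alert on a match, and otherwise discard the biometrics immediately.

```python
import math
from dataclasses import dataclass

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

@dataclass
class Detection:
    embedding: list[float]  # facial biometrics extracted from one CCTV frame
    frame: bytes            # the captured image itself

def process_detection(det: Detection, watchlist: dict[str, list[float]],
                      threshold: float = 0.6) -> str | None:
    """Schematic AFR-Locate-style loop (names and threshold are assumed)."""
    for identity, enrolled in watchlist.items():
        if cosine(det.embedding, enrolled) >= threshold:
            return identity  # alert: officers decide on any intervention
    # No match: the biometrics and image are deleted automatically and instantly.
    det.embedding, det.frame = [], b""
    return None
```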
The Watchlist consisted primarily of individuals who were: (i) ‘wanted on warrants’; (ii) ‘unlawfully at large’; (iii) ‘suspected of having committed crimes’; (iv) ‘in need of protection’; (v) persons whose presence at a ‘particular event causes particular concern’; (vi) ‘simply of possible interest to SWP for intelligence purposes’; and (vii) ‘vulnerable persons’, such as missing persons.
AFR Locate was used by SWP against the Appellant on two occasions. In December 2017, the Appellant was captured in a shopping area in Cardiff. In March 2018, AFR Locate took images of the Appellant while he was participating in a protest at the Defence Exhibition. On neither occasion was he on the Watchlist. Nor did he observe, or was he aware of, any notice or warning by SWP about the deployment of AFR Locate at those scenes. The Appellant filed judicial review proceedings against SWP in October 2018 on the following grounds: (i) breach of Article 8 of the ECHR, namely the Right to Privacy; (ii) breach of Articles 10 and 11 of the ECHR, the freedoms of expression and of assembly and association respectively; (iii) breach of the Data Protection Act 2018 (the ‘DPA 2018’); and (iv) breach of the Equality Act 2010 (the ‘Equality Act’).
At the Divisional Court, the Appellant’s claim was dismissed. It was held that the use of AFR technology was ‘in accordance with the law’, and that its application was both necessary and proportionate in the circumstances. On data protection, the Court accepted that SWP complied with the requirements of lawfulness, strict necessity and proportionality. The Court further rejected the allegation of gender or racial bias on the basis that there was insufficient expert evidence of such technical deficiencies (the ‘High Court Decision’). The Court of Appeal overruled the High Court Decision.
- Compliance with Article 8 of the ECHR
Article 8(2) of the ECHR provides that there shall be no interference with one’s Right to Privacy except such as is ‘in accordance with the law’ and is ‘necessary in a democratic society’ in pursuit of a legitimate aim set out therein. The test thus consists of: (i) a legality test; (ii) a necessity test; and (iii) a proportionality test.
Before analysing whether SWP’s actions were lawful, one has to consider whether the law was ‘clear and sufficient’, taking into account the cumulative effect of primary legislation, secondary legislation and SWP’s own local policies.
The Divisional Court left SWP substantial discretion, reasoning that these matters fell within SWP’s policing powers and concerned a novel technology that required frequent review. The Court of Appeal, on the contrary, recognised ‘fundamental deficiencies’ in the legal framework and held that SWP failed the lawfulness requirement because two areas of discretion were ‘impermissibly wide’: where AFR technology could be deployed, and the selection of individuals for the Watchlist – especially those ‘simply of possible interest to SWP for intelligence purposes’ (emphasis added). The appeal was allowed on the basis that SWP’s use of AFR technology against the Appellant was unlawful.
With regard to proportionality (which strictly did not fall to be decided, since the use of AFR technology was not ‘in accordance with the law’), the Court of Appeal rejected the Appellant’s proposition that it was necessary to consider the interests not only of the Appellant, but of all the other members of the public who would have been at the two venues when AFR Locate was deployed. Since the impact on the Appellant was ‘negligible’, this ground of appeal would have been rejected even had it been considered.
- Compliance with the DPA 2018
Under Section 64 of the DPA 2018, SWP was required to undertake a DPIA pursuant to Section 64(3) where the processing of personal data was likely to result in a high risk to the rights and freedoms of individuals. The Court held that, as SWP had not acted ‘in accordance with the law’ under Article 8 of the ECHR, there had accordingly been no proper assessment of the risks, or of the measures needed to safeguard the rights and freedoms of individuals.
Section 42 of the DPA 2018 requires SWP to have ‘an appropriate policy document in place when carrying out sensitive processing in reliance on the consent of the data subject’. The Court decided that, since this section had only recently been enacted at the time of the proceedings and the Information Commissioner had published no guidance (guidance was issued subsequently, in 2019), it was ‘entirely appropriate for the Divisional Court to make no judgment on the point and leave to the SWP to make such revisions as might be appropriate’. The Court accordingly rejected this ground of appeal.
- Fulfilment of the Public Sector Equality Duty
Under Section 149(1) of the Equality Act, a public authority such as SWP must have due regard to the need to eliminate discrimination, both direct and indirect – including race and gender bias. The Appellant submitted that SWP had failed to recognise the risk of indirect discrimination arising from the use of AFR Locate, which was said to carry a higher risk of false identification for women and people of colour. The Court held that ‘SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race and sex’. As SWP was unable to discharge this positive duty, the appeal was allowed on this ground.
- Precedents
Prior to Bridges, the United Kingdom relied mostly on the decisions in S and Marper v United Kingdom [2008] ECHR 1581 and Catt v United Kingdom [2019] ECHR 76 in this area of law.
In Marper, the applicants’ DNA samples and fingerprints were retained after their criminal proceedings had been concluded or discontinued, despite the fact that neither of them had been convicted. The European Court of Human Rights (the ‘ECtHR’) held that, although there was a legitimate interest in the prevention of crime, the UK authorities had infringed Article 8 of the ECHR through ‘the blanket and indiscriminate nature’ of their powers of retention of suspects’ biometric data. This constituted a failure to ‘strike a fair balance between the competing public and private interests’, exceeded any ‘acceptable margin of appreciation’, and could not be regarded as ‘necessary in a democratic society’.
In Catt, the applicant, aged 94, was a peace activist in an organisation whose protests often turned violent. Although the applicant had been arrested several times, he had never been convicted. Upon his making a ‘subject access request’, the police disclosed that they kept 66 records of his appearances at peaceful protests between 2005 and 2009. The ECtHR considered whether the authorities were justified in retaining his personal data, which included his political stances, on the ‘extremism database’. The ECtHR did not rule on the legality of the acts complained of, apart from noting the ambiguity of the term ‘domestic extremism’ and of the criteria governing the collection of the relevant data. It found that, although there was a ‘pressing social need’ to collect the personal data, there was no such need to retain it. As there were no scheduled reviews and no time limits, the retention was disproportionate. Accordingly, there was a violation of Article 8 of the ECHR.
- Impacts of the Bridges Judgment
- Classification of the Nature of Biometric Data
- Extent of Intrusion
The Court of Appeal in Bridges distinguished facial recognition technology from similar means of collecting personal data, such as DNA profiling and fingerprinting, on the basis that, although it collects sensitive data (as defined under the DPA 2018), it is not as intrusive: it does not, for instance, require the subject to provide a blood sample or undergo a fingerprint scan. However, the Court failed to take into account the nature of the data collected – DNA, fingerprints and the data derived via AFR technology are all primary forms of biometric information unique to every person. Whether the technology is intrusive should not be determined solely by the manner in which the data is collected, but also by whether the data is so distinctive in nature that it requires additional privacy protection.
Further, the Court’s decision creates a paradox: if AFR Locate is a non-intrusive means of collecting sensitive data, should it not be classified as covert surveillance, rather than overt as defined in the decision? After all, AFR Locate is often deployed without the subject’s consent or awareness.
The Court adopted a ‘relativist approach’ in determining the lawfulness of AFR technology: ‘the more intrusive the act complained of, the more precise and specific must be the law said to justify it’. The classification of the level of intrusion is of utmost importance, as it directly influences whether the technology’s use passes the initial hurdle of lawfulness. This article argues that, since facial recognition is (i) intrusive in nature and (ii) amounts to covert surveillance, a stricter legal framework should be required to justify its lawfulness.
- ‘One-to-one’ verification versus ‘one-to-many’ identification
The Bridges decision concerned the comparison of digital images captured in real time against those stored in a police database – a form of ‘one-to-many’ searching for identification purposes. Identification should be subject to a higher level of restriction and regulation than verification of the same individual. Biometric passports used at border controls, or facial recognition used to unlock a smartphone, may be justified because only the subject’s own personal data is used in the matching process, and the subject will almost certainly have consented to the storage and use of that data. Where facial recognition is employed in ‘one-to-many’ searching, by contrast, the deployment is often unknown to the subject. As the Court did not explicitly rule on this issue, individuals remain at risk of unauthorised and unknown use of their biometric information for identification purposes. The sketch below illustrates the distinction.
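In this hypothetical Python sketch, verify() answers the one-to-one question (‘is this person who they claim to be?’), while identify() answers the one-to-many question (‘who, of everyone in the gallery, is this?’). The function names, gallery structure and threshold are assumptions for illustration; the key point is that each additional gallery entry is an additional opportunity for a false positive.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray,
           threshold: float = 0.6) -> bool:
    """One-to-one: e.g. a biometric passport gate or a phone unlock.
    Compares the probe against a single template the subject enrolled."""
    return cosine(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict[str, np.ndarray],
             threshold: float = 0.6) -> str | None:
    """One-to-many: e.g. AFR Locate searching a watchlist. The probe is
    compared against everyone in the gallery, usually without consent."""
    scores = {name: cosine(probe, emb) for name, emb in gallery.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```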
- Protection of Right to Privacy under Article 8 of ECHR
- Legality
The Court of Appeal in effect held that, although some safeguards were in place, too much discretion was left to SWP in respect of the two ‘fundamental deficiencies’ of the AFR framework, namely: (i) who should be placed on the watchlists; and (ii) where AFR Locate should be deployed.
It is controversial to include in watchlists individuals ‘whose presence at a particular event causes particular concern’ or who are ‘simply of possible interest’. Arguably, such people should not be entered onto a watchlist at all in most circumstances, as these categories are open to manipulation by police to suppress opposition. The most difficult question concerns individuals who have frequently participated in large-scale demonstrations but have never been prosecuted, or have been acquitted. Should they be on the watchlist? Furthermore, should a target be removed from the watchlist after a certain time limit? If local police forces could establish internal policies that eliminate such ambiguity, the question of who should be on the watchlists could be resolved.
It is less straightforward, however, to identify the factors that should determine where facial recognition technologies are deployed. Human judgment is required on questions such as when the target is likely to appear and in which events the target is likely to participate. For instance, may an AFR Locate camera be placed close to the target’s residence? If so, should there be any time constraints on such surveillance? In an extreme case, an AFR Locate camera placed in a suspect’s neighbourhood could amount to a form of house arrest.
Since it is impossible to draw a clear line as to where facial recognition should be deployed, it is inevitable that law enforcement agencies will expand the scope of their surveillance as far as possible. As the Court correctly pointed out: ‘it will often, perhaps always, be the case that the location will be determined by whether the police have reason to believe that people on the watchlist are going to be at that location’ (emphasis added).
It is doubtful whether the Bridges decision can realistically limit police use of AFR techniques in practice. On the one hand, the ‘who’ and ‘where’ questions cannot be adequately answered, and police will continue to infringe individuals’ Right to Privacy. On the other hand, the general public will be subject to a ‘chilling effect’ and will self-regulate their behaviour, at the expense of their privacy and personal lives, under an unforeseeable and unclear legal framework surrounding facial recognition.
The Court of Appeal focused on the technicalities of facial recognition technology and departed from the precedents, which will likely result in fragmentation of the legal framework. In Bridges, the Court distinguished Marper on the basis that it primarily concerned the retention of fingerprints and DNA samples, while the present case concerned the collection of data. The Court further distinguished Catt, noting that the ECtHR there did not rule on the legality of the authorities’ alleged wrongful acts but focused instead on proportionality. The Bridges decision has been criticised for dwelling on the specifics of the facts rather than laying down a general guiding principle applicable to surveillance by law enforcement authorities. Together with the ever-changing nature of biometric technology, this is likely to create a fragmented legal regime and unpredictability in the law at large.
- Necessity and Proportionality
The Court held that the proportionality assessment should weigh ‘not only the actual results of an operation but its anticipated benefits’, and rejected the proposition that the impact on all the other members of the public present when AFR Locate was deployed must also be considered: since the impact on the Appellant was ‘negligible’, ‘an impact that has very little weight cannot become weightier simply because other people were also affected’. This position has been challenged: the proper comparison is between the significance of achieving the legitimate aims and the possible violation of human rights. Facial recognition surveillance potentially involves everyone who passes a camera placed in a public area, and the Court failed to take into account the probable harm to the vast majority of the public who have committed no wrongful act. Mass surveillance, by its very nature, concerns how the state can potentially control the thoughts and actions of a substantial number of individuals. If the cumulative impact on the crowd is not taken into account, an absurd scenario arises in which human rights infringements by facial recognition technology would almost always be justified.
Furthermore, the erosion of citizens’ Right to Privacy creates a ‘chilling effect’ on other rights guaranteed under the ECHR, including the rights to a fair trial (Article 6), freedom of expression (Article 10) and freedom of assembly and association (Article 11), as citizens fear being held liable for participating in demonstrations and protests.
- Data Protection under DPA 2018
The Court decided that, since SWP did not satisfy the ‘lawfulness’ requirement under Article 8(2) of the ECHR, it followed that SWP did not comply with Section 35 of the DPA 2018. It is worth noting that the processing of biometric data requires authorisation by either EU Member State law or UK law, and it has been argued that Section 35 of the DPA 2018 imposes a higher threshold than the ‘lawfulness’ standard under the ECHR. The judgment thus appears to blur the boundaries between the DPA 2018 and the ECHR. Such unpredictability leaves the public confused as to how to protect their Right to Privacy against future infringements of data protection domestically.
- Protection offered to vulnerable groups in relation to Right to Privacy
Research has demonstrated that AFR technology has a higher error rate, producing more ‘false positives’, for individuals who are women, have darker skin tones, or are non-binary. The Court of Appeal’s decision imposes a positive obligation on law enforcement bodies to satisfy themselves that there is neither direct nor indirect discrimination in the deployment of a technology that can be discriminatory by design. On the surface, the decision appears to provide improved protection of the Right to Privacy for these groups.
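At the statistical level, the ‘independent verification’ the Court found lacking is straightforward to describe. The following hedged sketch assumes a labelled test set of impostor (non-match) comparisons tagged by demographic group, and computes false positive rates per group; the group names, scores and threshold are entirely hypothetical.

```python
from collections import defaultdict

def false_positive_rate_by_group(trials, threshold=0.6):
    """trials: iterable of (group, score, is_same_person) tuples from a
    labelled test set. Returns the false positive rate per group - the
    kind of bias check the Court said SWP never sought."""
    alarms = defaultdict(int)     # impostor comparisons wrongly flagged
    impostors = defaultdict(int)  # all impostor comparisons per group
    for group, score, same_person in trials:
        if not same_person:
            impostors[group] += 1
            if score >= threshold:
                alarms[group] += 1
    return {g: alarms[g] / impostors[g] for g in impostors}

# Hypothetical scores purely for illustration:
trials = [("group_a", 0.70, False), ("group_a", 0.30, False),
          ("group_b", 0.65, False), ("group_b", 0.62, False)]
print(false_positive_rate_by_group(trials))  # {'group_a': 0.5, 'group_b': 1.0}
```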
Nevertheless, facial recognition systems rely on machine learning software trained on pre-existing datasets, which often consist of images of people who have been arrested. These training datasets tend to over-represent ethnic minorities, especially those with darker skin tones. In the US, ethnic minorities face a higher risk of being pulled over, searched, arrested or wrongly arrested, and it is these groups that populate the watchlists – embedding systemic inequality through racial discrimination from the outset. The recent ‘Black Lives Matter’ movement demonstrates the deeply rooted systemic racism in policing and the criminal justice system. Unless revolutionary change redresses this structural inequality in the justice system, it is doubtful whether infringements of the Right to Privacy that fall disproportionately on ethnic minorities can be effectively rectified.
- Contribution to Big Data and Information Sharing Arrangements
Since AFR Locate deletes information automatically and instantly where there is no match, the Court did not comment on how its reasoning might apply to facial recognition technologies under which data can be saved or even shared between entities. Unlike DNA profiling or fingerprint collection, facial biometric data can be captured remotely, without any physical contact with the subject, making it more vulnerable to privacy infringement: it can be collected at scale and contributed to ‘big data’. As mentioned earlier, the Court of Appeal did not lay down a coherent decision for nationwide application, leaving it to local authorities to determine their own policies. Local law enforcement departments can thus set their own regulations, producing a fragmented legal framework that reduces transparency around sharing between government divisions and encourages the transfer of biometric data between public authorities, or even among private institutions. In the US, Clearview AI is believed to hold a database allegedly containing over three billion facial images derived from social media platforms, integrated with a facial recognition algorithm that identifies unknown individuals from videos or photographs. It was reported that its software was used by over 600 law enforcement agencies between 2019 and 2020.
Even if one has consented to a law enforcement agency processing one’s biometric data, there is no guarantee that the data will not be used for secondary or even tertiary purposes – a phenomenon known as ‘function creep’. Individuals’ Right to Privacy is further eroded if their private lives, as displayed on social media platforms, are handed to the authorities. Individuals may then suppress their dissent and political opinions and self-censor for fear of government surveillance.
- Recommendations
For now, the Bridges decision may encourage local law enforcement agencies to establish clearer policies on who should be on the watchlists and where facial recognition should be deployed. In the longer term, although this is achievable in theory, it may prove far from straightforward, given the systemic inequality and racism in the criminal justice system and the technology’s reliance on human intervention. A fragmented legal framework is also likely: the Court of Appeal’s decision placed too much emphasis on technical specifics, and the precedent may not remain applicable given the evolving nature of facial recognition technology. It remains questionable whether the decision will clarify this area of law and provide heightened privacy protection to individuals.
Looking forward, legislators should implement nationwide legislation to regulate the use of facial recognition by all law enforcement agencies, so as to prevent this double-edged technology from being turned against our privacy.
Bibliography
Data Protection Act 2018. Retrieved from https://www.legislation.gov.uk/ukpga/2018/12/contents/enacted
Catt v United Kingdom [2019] ECHR 76, 43514/15 (European Court of Human Rights January 24, 2019).
Columbia University, Global Freedom of Expression. (n.d.). Catt v. the United Kingdom. Retrieved from https://globalfreedomofexpression.columbia.edu/cases/catt-v-the-united-kingdom/
European Convention on Human Rights. (1950).
Information Commissioner’s Office. (2019, January). An overview of the Data Protection Act 2018.
JUSTICE. (n.d.). S and Marper v UK [2008]. Retrieved from https://justice.org.uk/s-marper-v-uk-2008/
Mann, M., & Smith, M. (2017). Automated facial recognition technology: Recent developments and approaches to oversight. University of New South Wales Law Journal, 121-145.
Smith, M., & Miller, S. (2021, April 13). The ethical application of biometric facial recognition technology.
O’Flaherty, M. (2020). Facial Recognition Technology and Fundamental Rights. European Data Protection Law Review (EDPL), 170-173.
R (on the application of Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058, C1/2019/2670 (Court of Appeal (Civil Division) August 11, 2020).
S and Marper v United Kingdom [2008] ECHR 1581 (European Court of Human Rights December 4, 2008).
Woods, L. (2020). Automated Facial Recognition in the UK: The Bridges Case and beyond. European Data Protection Law Review (EDPL), 3, 455-463.
Zalnieriute, M. (2021). Burning Bridges: The Automated Facial Recognition Technology and Public Space Surveillance in the Modern State. Columbia Science and Technology Law Review.