“The boom in facial recognition technology in the UK, operating in a largely lawless space, is a mortal threat to privacy as we know it. Walking down a street anonymously could soon be a thing of the past if the spread of facial recognition is not resisted” – Big Brother Watch.

What is Facial Recognition Technology [FRT]?

Facial recognition technology seeks to leverage the uniqueness of human facial features either for surveillance or to match and identify individual faces. It works through software which measures and analyses a face’s particular characteristics to create a unique biometric template, or map, of that face, which is then converted to a string of numbers [a code]. Algorithms then compare these codes to a database or watchlist, made up of other face maps, to search for potential matches. Unlike the facial recognition tools seen in film and television, real FRT does not return a definitive identification: it produces a match score, usually a percentage, that rates the similarity of the facial image to others on the database. Depending on the tool and its confidence settings, the system may generate one or a handful of potential matches.
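To make the map-to-code-to-comparison pipeline concrete, here is a minimal illustrative sketch in Python. The 128-number templates, the cosine-similarity scoring and the 0.6 threshold are all assumptions made for the toy; real systems derive templates from images using proprietary deep-learning models and apply their own scoring and confidence settings.

```python
import numpy as np

def match_scores(probe, watchlist, threshold=0.6):
    """Compare one face template [a string of numbers] against a
    watchlist of templates; return candidates scoring above a threshold."""
    candidates = []
    for name, template in watchlist.items():
        # Cosine similarity stands in for the system's match score
        score = float(np.dot(probe, template) /
                      (np.linalg.norm(probe) * np.linalg.norm(template)))
        if score >= threshold:
            # Report the score as a percentage, as deployed systems do
            candidates.append((name, round(score * 100, 1)))
    # Depending on the confidence setting, zero, one or a handful of
    # potential matches come back for a human operator to review
    return sorted(candidates, key=lambda c: c[1], reverse=True)

rng = np.random.default_rng(0)
watchlist = {f"person_{i}": rng.normal(size=128) for i in range(500)}
probe = rng.normal(size=128)
print(match_scores(probe, watchlist))  # usually [] for random templates
```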

Live Facial Recognition

Live Facial Recognition [LFR] is a form of automated facial recognition which operates near-instantaneously. The software analyses live video feeds to recognise and match faces against a watchlist. Both the Metropolitan Police in London and South Wales Police have used LFR over the past five years, moving from so-called “trials” to active deployments. LFR is perhaps the most intrusive form of automated facial recognition technology as it is used to indiscriminately scan anyone passing by the camera in a public space, often without their knowledge. The streets where LFR is deployed become de facto police lineups, where everyone is a potential suspect. In a policing context it is used in combination with a deployment of officers who seek to intervene, at that moment, with any potential matches in the crowd.
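A sketch of how that live pipeline fits together, reusing the same kind of template comparison as above. The next_frame and detect_faces stubs are hypothetical stand-ins for a real camera feed and a vendor’s face detector, not any actual police system.

```python
import numpy as np

rng = np.random.default_rng(1)

def next_frame():
    """Stub standing in for one frame of a live video feed."""
    return rng.normal(size=(480, 640))

def detect_faces(frame):
    """Stub face detector: pretends each frame contains one face
    and returns its 128-number template."""
    return [rng.normal(size=128)]

def similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

watchlist = {f"listed_{i}": rng.normal(size=128) for i in range(500)}

# Everyone passing the camera is scanned; any score over the
# threshold raises an alert for nearby officers to act on.
for _ in range(100):  # 100 frames of "live" video
    for face in detect_faces(next_frame()):
        for name, listed in watchlist.items():
            if similarity(face, listed) >= 0.6:
                print(f"ALERT: possible match for {name}")
```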

In 2016 and 2017 the Met Police used LFR at Notting Hill Carnival, the largest African Caribbean event in the country. More than 500 people were on the watchlist, but 2016 produced one false match and no true ones, while 2017 produced 95 false matches and just a single true one. Targeting community-based events with mass biometric surveillance, especially when the same communities are harmed by the Met’s institutional racism, underlines how this technology could be used to the detriment of particular groups. The potential for certain demographics or groups to be targeted is an ongoing concern with facial recognition.

An independent review of the Metropolitan Police’s use of facial recognition technology was damning. It found that 81% of the people flagged by live facial recognition were in fact innocent people who had been misidentified, and that it was “highly possible” that the Metropolitan Police’s use of the surveillance technology would be found unlawful if it were challenged in court.

Despite the serious accuracy and legal issues, both the Metropolitan Police and South Wales Police have continued to deploy live facial recognition in London, Cardiff and Swansea. The deployments turn our city streets into mass-scale police line-ups, with hundreds of thousands of innocent people subjected to biometric identity checks. Despite scanning more than 560,000 people over the past five years, the equivalent of the population of Belfast, only 57 people were correctly identified while the technology got it wrong 90 times.
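Taking those published figures at face value, the scale of error is straightforward to compute:

```python
scanned, correct, wrong = 560_000, 57, 90

alerts = correct + wrong
print(f"{wrong / alerts:.1%} of alerts were false matches")      # 61.2%
print(f"{correct / scanned:.4%} of people scanned were "
      f"correctly identified")                                   # 0.0102%
print(f"about {scanned // correct:,} people scanned per "
      f"correct identification")                                 # 9,824
```

In other words, roughly three in five alerts were wrong, and it took nearly 10,000 biometric scans of the public to produce a single correct identification.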

Seven years after UK police first rolled out this invasive technology, there has still been no democratic consent to live facial recognition biometric surveillance in Britain. No legislation has been passed, or even seriously proposed, to approve or ban the use of live facial recognition technology in the UK. Instead, police exploit this democratic deficit to operate in a legal grey area, using rights-invading technology with minimal oversight.

The increasing use of retrospective facial recognition by police forces presents major risks to privacy and civil liberties, and could see innocent people having to prove they are not who the technology claims they are, while operator-initiated [mobile phone-based] facial recognition threatens to equip police with invasive biometric scans on demand.

Other parts of the public sector have also tried to introduce biometric face scanning, with schools using coronavirus as an excuse to replace lunch cards with face-scanning tills for children, and the Home Office making facial recognition the key to its scheme to process millions of claims for residency in the UK following Brexit.

The private sector has also capitalised on the growth of cheap and easy-to-use facial recognition algorithms, building and selling intrusive surveillance tools that put everyone’s privacy at risk. Retailers can now pay a small fee to have facial recognition cameras fitted on their doors to alert staff to “undesirables”, giving corner shops access to national facial recognition networks that even the police would be envious of.

The central government is working to create a mega-database of biometrics that could be instrumentalised against migrant communities and used to discriminate against ethnic minorities.

In 2019, a Big Brother Watch investigation found several more instances of facial recognition being used at privately owned sites in England. These included Meadowhall shopping centre in Sheffield, where the owner, British Land, told us that LFR was trialled for two days and then for one month in early 2018; we estimate around 2 million people may have been scanned by LFR during the trial. Other sites were Millennium Point, a mixed-use development in Birmingham containing a conference centre, entertainment venues and educational institutions, and Liverpool’s World Museum. Both were identified as police-linked LFR deployments. There was added irony in the World Museum using LFR at an exhibition of the Terracotta Warriors, on loan from China, the biggest user of facial recognition surveillance in the world.

The Right to Refuse

In general, the public has the right to opt out of having their face scanned by the police’s live facial recognition cameras, and in theory no negative inference should be drawn from an individual choosing to do so. Avoiding the LFR cameras is not, by itself, justification for further police action. However, Big Brother Watch has in the past seen plain-clothes police officers monitoring the edges of deployments and questioning people who took alternative routes after seeing Big Brother Watch’s placards informing the public that LFR was operating in the area, or after taking one of our leaflets. Some of those officers have confirmed to us off the record that they viewed such behaviour as justifying a police intervention.

Live facial recognition is a significant threat

Live facial recognition technology poses a significant threat to rights and freedoms in Britain and stands to fundamentally unbalance the relationship between police forces and citizens. Used widely in more authoritarian states like China and Russia, live facial recognition has no place in a purportedly rights-respecting, democratic nation. Police use of LFR has continued to advance despite growing public concern, a court ruling that South Wales Police’s use of LFR was unlawful, a lack of Government strategy and the absence of parliamentary consent. The technology has been deployed at shopping centres, festivals, sports events, concerts, community events and even a peaceful demonstration. One force used the technology to keep innocent people with potential mental health issues away from a Remembrance Sunday event. Facial recognition poses perhaps an even greater threat to our privacy and civil liberties as technology further outpaces legislative scrutiny and democratic accountability.

LFR technology indiscriminately scans the faces of everyone who passes in front of the camera, with members of the public treated as potential suspects until a biometric identity check proves otherwise. In policing, suspicion has traditionally preceded surveillance, and individuals are considered innocent until proven guilty. LFR reverses these important principles and in doing so normalises blanket, suspicionless surveillance. It cannot be considered proportionate. Police forces have also failed to make the case that LFR is strictly necessary, or that other, less intrusive means of locating and identifying suspects were pursued first.

For such a powerful and controversial technology, the lack of democratic mandate for the use of live facial recognition is deeply problematic. There is no legislation that directly addresses the use of LFR in public spaces, and the words “facial recognition” do not appear in any law in the UK. The “chilling effect” of intrusive new forms of surveillance on freedom of expression has been well documented and recognised by the UN Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association, the European Union Agency for Fundamental Rights and rights groups across the globe.

“The negative effects of the use of facial recognition technology on the right of peaceful assembly can be far-reaching (…) Many people feel discouraged from demonstrating in public places and freely expressing their views when they fear that they could be identified and suffer negative consequences.” – UN High Commissioner for Human Rights.

Surveillance technology with poor accuracy poses a risk to everyone but is particularly disturbing in light of research showing that many facial recognition algorithms disproportionately misidentify black people and women.

“Because of how the technology works, facial recognition is more intrusive than traditional CCTV surveillance; it’s more akin to having your fingerprints taken, and like many other people, I felt that was an invasion of my privacy.” – Dr Ed Bridges, civil rights campaigner.

According to the Data Protection Impact Assessment published by Cheshire Constabulary, probe images [photos of the unknown subject] can be taken from anywhere as long as officers establish the source of the photograph, so they can be sure there is a legal basis to use it before facial matching takes place. Sources may include:

• CCTV

• Body-worn camera images taken by officers when dealing with incidents or crime

• Social media

• E-fit images

• Other photos taken by officers on mobile phones/other devices

• Surveillance images

• Any other digital images, e.g. from dash-cams or doorbells

This illustrates the potential breadth of how police may employ facial recognition as the technology develops. Almost any image can be used for facial recognition identification, including computer-generated images, as evidenced by the suggestion that an e-fit could be used in a facial search.
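A sketch of what such a retrospective, one-to-many search could look like under the assumptions above: the probe image’s template, whatever its source, is ranked against every stored template and the top few candidates returned for an officer to review. The record IDs, database size and similarity scoring here are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

def top_candidates(probe, database, k=5):
    """Rank every stored template against the probe template and
    return the k most similar records [a one-to-many search]."""
    scores = []
    for record_id, template in database.items():
        sim = float(np.dot(probe, template) /
                    (np.linalg.norm(probe) * np.linalg.norm(template)))
        scores.append((record_id, sim))
    return sorted(scores, key=lambda s: s[1], reverse=True)[:k]

# Toy database standing in for millions of retained images
database = {f"image_{i}": rng.normal(size=128) for i in range(10_000)}
# The probe could come from CCTV, social media, a doorbell camera,
# or even an e-fit, i.e. not a photograph at all
probe = rng.normal(size=128)
for record_id, sim in top_candidates(probe, database):
    print(record_id, f"{sim:.2f}")
```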

In its documents on facial recognition technology, Cheshire Constabulary admits that its facial matching tool could be used to identify:

• People deemed to be at risk or interacting with officers who are using their statutory powers

• Vulnerable people

• Victims of crime

• Anyone driving a vehicle

• Children

• People who are subject to police powers in the street*

*No detail is given on what this means but it may include people subject to stop and search.

Far from being a “boost to victims of crime”, as Cheshire Constabulary claimed when trumpeting its facial recognition rollout, the technology could be used to surveil those very victims.

Police National Database Face Search

The PND is a nationwide system that acts as a repository for a huge quantity of police photos, information and intelligence uploaded by individual forces, including information on events, organisations and people. Under Section 64A of the Police and Criminal Evidence Act 1984, police have the power to photograph people held after arrest. Forces can upload these custody images from their own systems to the PND, and it is these custody images that make up most of the reference database for the PND Face Search tool.

How Many Images Are On It?

As of January 2023, there were 16,102,341 images held on the Police National Database, a decrease of around three million from five years earlier. However, the 2023 figure comes after the deletion of almost six million images, many of them low quality or duplicates, as part of a system upgrade in 2021. According to Home Office data, a huge number of new images are added to the PND every year: 1.9 million in 2020, 1.1 million in 2021 and 982,000 in 2022, totalling around four million extra photos in three years. That equates to around 3,600 photos being added a day, or 2.5 per minute. Multiple images may be held of the same person, and whilst most photos are of faces, some may be of other identifiers such as tattoos and scars.

One does not have to have ever been convicted of, or even charged with, an offence to feature on the PND. As custody images are taken on arrest, there are likely millions of innocent people whose photos have been uploaded to and retained in the PND by a police force, without their knowledge or any justification. The Home Office still does not know how many innocent people’s photographs are held on the PND. A significant number of innocent people’s custody images are uploaded and retained each year, despite the High Court ruling the retention policy unlawful and a breach of individuals’ right to privacy. Unless the conviction rate for people arrested is more than 99.9 per cent, more innocent people are being arrested and photographed annually than are having their photos deleted.
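The per-day and per-minute rates follow directly from the Home Office’s own annual figures; a quick check:

```python
added = {2020: 1_900_000, 2021: 1_100_000, 2022: 982_000}

total = sum(added.values())            # 3,982,000 in three years
per_day = total / (3 * 365)
print(f"{total:,} images added: about {per_day:,.0f} a day, "
      f"or {per_day / (24 * 60):.1f} a minute")
```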

Despite the poor accuracy of the PND’s facial searching algorithm, the sheer number of images, lawfully and unlawfully held on it, underlines its potential to transform into a mass facial recognition search engine for the police. The Home Office is now funding a new biometrics search programme, including facial recognition, which promises to give law enforcement in the UK the ability to search faces with much greater accuracy than the PND allows.

Risk to privacy

The ability of police forces to biometrically process and potentially identify anyone they hold facial images of [whether photographs or videos] is a serious risk to our privacy and to our ability to move through public space with anonymity. The processing of sensitive, personal biometric data must be strictly regulated to be compliant with human rights law.

Biometrics in Schools

The use of biometrics in schools to perform relatively straightforward tasks has become a significant threat to children’s data rights and privacy. Fingerprint-based systems are used in a large proportion of secondary schools, mostly to identify pupils in the school canteen but also for other purposes such as library access. A number of schools are now introducing facial recognition systems, typically for lunch payments, to replace fingerprints or swipe cards.

Cashless payment systems have become the dominant payment method in school canteens. Parents top up their child’s account online, and the account is charged for the meals the child purchases in the canteen. Swipe cards, fingerprints and, increasingly, facial recognition are used to identify the correct account on the school system so the right person is charged for the meal.

These facial recognition systems operate by capturing a reference image of a child and associating it with their account. A camera in the canteen then takes an image of the child as they purchase their food, and the software matches the biometric face print of this image against the school database to identify the child present. A cashier then charges the account.
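A toy sketch of that enrol, match and charge flow. The pupil names, balances and matching scheme are invented, and real vendors’ systems differ; the point is the structure, a one-to-many biometric search performed just to debit a lunch account.

```python
import numpy as np

rng = np.random.default_rng(3)

accounts = {}   # pupil name -> balance in pence
templates = {}  # pupil name -> reference face template

def enrol(name, reference_template, balance):
    """Capture a reference image once and link it to the account."""
    templates[name] = reference_template
    accounts[name] = balance

def charge_by_face(till_template, price):
    """Match the till camera's template against the whole school
    database [a one-to-many search] and charge the best-scoring account."""
    best, best_score = None, -1.0
    for name, ref in templates.items():
        score = float(np.dot(till_template, ref) /
                      (np.linalg.norm(till_template) * np.linalg.norm(ref)))
        if score > best_score:
            best, best_score = name, score
    accounts[best] -= price
    return best, best_score

enrol("pupil_a", rng.normal(size=128), balance=500)
enrol("pupil_b", rng.normal(size=128), balance=500)
# At the till, the template would come from a live camera image;
# here we reuse pupil_a's reference so the toy match succeeds.
print(charge_by_face(templates["pupil_a"], price=230))
```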

Schools that have adopted facial recognition systems cite several different reasons for processing bio metric data for the simple task of facilitating lunch payments.

Big Brother Watch’s work on facial recognition for lunch payments found that the information provided to parents on the use of facial biometrics was often incomplete, lacking key details about how the data would be processed. It also uncovered that some schools did not have effective consent procedures and made facial recognition quasi-mandatory for children’s full participation in school life.

There is no guarantee that schools will be able to keep children’s biometric data secure, and that data could form part of a future breach.

“While other countries around the world have banned children’s biometrics in education settings as high risk, the UK was an early testbed for widespread use.” – Jen Persson, Defend Digital Me.

Conclusion

People are concerned about the growing normalisation of biometric identity checks in Britain, which is not limited to police use but is now expanding into a variety of inappropriate settings. An increasing number of schoolchildren are being confronted by a face-scanning camera to pay for their lunch, while the Home Office seeks to use facial recognition with increasing regularity across immigration and visas.

In the private sector, we have seen the rise of companies seeking to be a “Google for faces”, scraping the internet en masse to biometrically analyse every photograph on the web, and others who want to equip businesses across the country with custom facial recognition networks. The threat from facial recognition is not limited to state use of the technology but risks reaching into various areas of public life, with the potential for serious harms and rights impacts in people’s private lives.

Orwellian, authoritarian surveillance tools must not be normalised as an aspect of daily life in Britain. On the contrary, public institutions should protect and uphold our data rights, and teach our young people in particular the importance of consent and control over their body data.

New technologies put us at a crossroads for the future. If the UK is to positively embrace technology whilst protecting rights and democracy, parliamentarians must take action and legislate to prevent the serious harms we face and safeguard our rights.
