South Wales Police’s use of facial recognition was ruled unlawful. So what?

Earlier this month, police use of facial recognition was ruled unlawful in South Wales.
How much do you know about the technology? And what does the judgment truly mean?

Image: Jon Candy

Outside a stadium, thousands of football fans were hurtling towards the gates, looking forward to an upcoming Cardiff v Swansea match. The atmosphere was electric, cheerful chatter rippling through the crowd.

But there stood Vince Alm, who was certainly not in the same mood. In fact, he was outraged.

Next to a police van, he had put on a costume mask, strange enough to catch everyone’s attention, and was holding up a placard that read, in bold, clear letters: “No Facial Recognition!”

Looking around, Vince noticed two surveillance vans patrolling the main route most fans would take to the event. Walking past the police vehicles, people seemed entirely unbothered by their presence.

They barely realised their faces were being filmed and scanned by computer algorithms. This was automated facial recognition (AFR), a system South Wales Police had recently introduced to check attendees’ identities at mass gatherings.

“I was angry because we were told only a few days before the fixture and not consulted,” said Vince, Contact Manager of Cardiff City supporters’ club, whose members were standing beside him, wearing masks to support the protest.

“The use of facial recognition was not proportionate to the threat [the game poses],” Vince added, regarding the technology as intrusive and unnecessary.

People were wearing masks to protest against the trial of facial recognition. Image: Vince Alm’s Twitter.

Also at the scene were members of the civil rights group Big Brother Watch, who were handing out pamphlets to warn passers-by about the vans and explain the police use of AFR.

Following this backlash from football fans and campaigners in January this year, police insisted it was in the public interest to carry out the trial. “It has often been difficult for parents to consider taking their sons and daughters to high-profile football matches because of fears of violence caused by a minority of fans,” said South Wales Police and Crime Commissioner Alun Michael.

He then noted that the technology could work as an effective tool for police to filter out ex-offenders and those who had been issued banning orders, prevent them from attending the event, and “enable genuine fans to enjoy watching football in safety and confidence”.

But talking on behalf of the football fan club, Vince expressed his disapproval of the claim. “Nobody was arrested or prevented from attending the fixture at both fixtures it was used at. Potential offenders would not attend the fixture anyway due to hundreds of police on duty including trained spotters.”

They already have all the powers to maintain safety. They do not need facial recognition.

Vince Alm, Manager of Cardiff City supporters’ club

He added, “They [South Wales Police] already have all the powers to maintain safety. Police spotters, surveillance, special powers, support from the courts, banning orders, strict sentencing. They do not need facial recognition.”

The same objection was shared by campaign groups such as Liberty, which accused the police of using the technology to collect citizens’ information without their knowledge or consent.

In response, Mr Michael said, “It’s important to note that CCTV surveillance does involve retaining recorded images for up to 30 days but that is not the case with facial recognition technology.” He explained that AFR was simply used to detect people on a list; for those who were not on it, their data and images would be removed from the system immediately after the operation.

These back-and-forth statements collide over whether AFR should be used in public, leaving most citizens bewildered in the middle of the dispute.

A survey conducted in 2019 showed that up to 90% of Britons said they were aware of the use of facial recognition, but nearly half admitted they knew little about it. In the same report, more than 60% of respondents agreed that the police should use face detection in criminal investigations, but only 17% and 7% respectively approved of its use in supermarkets to verify customers’ age and to track shoppers’ behaviour.

These figures indicate that most Britons accept the police using biometric technologies to solve cases, but when it comes to scanning ordinary individuals in public spaces, far fewer agree with the practice. The use of AFR falls into the latter category.

In other words, given the lack of basic knowledge about how facial recognition works and how the authorities are using it, people’s high acceptance generally rests on either trust or obliviousness.

Without that fundamental understanding, it is barely possible to form an objective judgment on whether the police’s use of AFR is more of a benefit or a drawback. People tend to assume police officers will deploy AFR ethically, even as current legislation has failed to keep up with rapid technological advances.

According to Liberty, facial recognition is “more like fingerprints than photographs”, working by “scanning the distinct points of our faces and creating uniquely identifiable biometric maps”. It means there will be hundreds of data points on each map formed by the calculated distance between a person’s discrete facial features.
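Liberty’s “biometric map” description can be sketched in a few lines of Python. The landmark names and coordinates below are invented for illustration only; real systems detect many more points automatically, which is what produces the hundreds of data points per face.

```python
from itertools import combinations
from math import dist

# Hypothetical landmark coordinates (pixel positions) for one face.
# Real systems automatically detect dozens of such points.
landmarks = {
    "left_eye": (112, 84),
    "right_eye": (168, 86),
    "nose_tip": (140, 120),
    "mouth_left": (118, 150),
    "mouth_right": (162, 152),
}

def biometric_map(points):
    """Return the pairwise distances between facial landmarks.

    With n landmarks this yields n*(n-1)/2 distances; with the
    many points real systems detect, that becomes hundreds of values.
    """
    names = sorted(points)
    return [dist(points[a], points[b]) for a, b in combinations(names, 2)]

vector = biometric_map(landmarks)
print(len(vector))  # 5 landmarks -> 10 pairwise distances
```

The resulting list of distances, not the photograph itself, is what gets stored and compared, which is why Liberty likens it to a fingerprint.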

South Wales Police were driving facial recognition vans with live cameras to conduct the detection. Image: Griff Ferris’s Twitter.

In the case of AFR, facial identification algorithms, integrated with surveillance technology, operate on a vast network of personal data. And data collection, thanks to social media and other digital platforms, is no longer a difficult task for tech companies; users’ images and information have never been easier to access and compile.

“There will be a database somewhere storing a link between the person’s information (eg name) and a biometric fingerprint which represents their face,” said Oli Bartlett, product manager at DataSparQ, a British artificial intelligence (AI) firm. “The face detection calculates a bunch of parameters about a face and then checks to see if any of the biometric fingerprints match.”
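Mr Bartlett’s database lookup can be illustrated with a toy sketch. The names, fingerprint values and cosine similarity measure below are assumptions for illustration, not DataSparQ’s or South Wales Police’s actual method.

```python
import math

# Hypothetical database linking identities to stored biometric
# fingerprints (here, short lists of numbers standing in for the
# hundreds of values a real system would hold).
database = {
    "alice": [56.0, 45.9, 66.3, 68.1],
    "bob":   [61.2, 40.1, 70.5, 62.8],
}

def similarity(a, b):
    """Cosine similarity of two fingerprint vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(query):
    """Check the query fingerprint against every stored one and
    return the closest identity with its similarity score."""
    return max((similarity(query, fp), name) for name, fp in database.items())

score, who = best_match([56.4, 45.0, 66.0, 68.5])
print(who)  # the closest stored fingerprint wins
```

Note that the system only ever finds the *closest* stored fingerprint; whether that counts as a match is a separate decision, as discussed below.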

Based on this concept, AFR was developed as “real-time” facial recognition surveillance, conducted with live-monitoring cameras and carried out by South Wales Police on a trial basis since 2015, in two primary ways.

One of them is used to search for suspects or persons of interest in a custody database of hundreds of thousands of individuals: the system analyses a still image of the wanted person and looks for a match.

The police claimed that this method has already assisted in the identification of suspects and the vulnerable across South Wales.

Watch the video to understand what AFR is and how the police are using it.

But what triggered a towering rage among football fans is the second approach: instantaneous surveillance. Here, AFR is used with CCTV to film passers-by and compare their faces with those on a watchlist created by the police in advance for each occasion. If a match is made, the system sends an alert to the operating officers.

However, as for the matching result, Mr Bartlett said that no two biometric maps can ever be a 100% match, so “the system will give a coincidence score and it’s up to whoever’s running the system to decide what level of confidence is appropriate for a match in their situation”.
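The role of that operator-chosen threshold can be shown in a minimal sketch. The scores and threshold values below are illustrative assumptions, not figures used by South Wales Police.

```python
# A match is not a fact the system reports but a decision made by
# comparing a raw coincidence score against a chosen threshold.

def review_alert(score, threshold):
    """Raise an alert only when the coincidence score clears the
    operator's chosen confidence threshold."""
    return score >= threshold

scores = [0.52, 0.81, 0.95]  # illustrative coincidence scores

# A cautious operator (high threshold) raises fewer alerts...
strict = [s for s in scores if review_alert(s, 0.90)]
# ...while a permissive one (low threshold) raises more.
lenient = [s for s in scores if review_alert(s, 0.60)]

print(len(strict), len(lenient))  # 1 3 -> strict flags 1 face, lenient flags 2
```

The same crowd footage can therefore yield very different numbers of “matches” depending on where the operator sets the bar.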

In this sense, the match threshold is flexible rather than fixed, and the results are not completely reliable; whichever authority controls the system has the final say on the identity-matching outcome. Thus, during AFR trials, it is police forces who decide whether a person flagged by the algorithm is a suspect or an offender who should be investigated or arrested in the field.

Though South Wales Police proudly announced that the AFR technology was accurate in 76% of cases, which means citizens can still be misidentified, Mr Bartlett pointed out that face detection is more likely to be inaccurate in real-time operation than in an ideal setting. “For a well-trained algorithm in good lighting, with the right angle of the picture, it [the accuracy rate] will be very high. But in many cases, that’s not the situation,” he said.

On top of this issue, civil rights organisations also note that facial recognition is notorious for its even lower accuracy when recognising the faces of people of colour, particularly women. This is due to the under-representation of black and brown people in training data. Trained mostly on images of white people, the software is better at identifying individuals with pale skin, while people from BAME communities are more likely to experience wrong matches.

As the Black Lives Matter movement has spread globally and to the UK, including South Wales, where several protests took place in the capital, this flaw in facial recognition has raised wider concern that the technology could exacerbate racism.

In June, hundreds of people gathered in Bute Park, Cardiff, to stand up for Black Lives Matter.

“The police are supposed to protect us and make us feel safe, but I think the technology is intimidating and intrusive,” Ed Bridges told Liberty after he launched an appeal against the roll-out of AFR in 2019, claiming that South Wales Police had twice used the technology to obtain his biometrics data without his consent.

The High Court, however, dismissed the case, ruling that the police’s use of AFR was lawful and did not violate privacy, so the trials were allowed to continue. Unsatisfied with the result, Liberty and Bridges took the case to the Court of Appeal, arguing the technology breaches people’s privacy, invades data protection rights, and discriminates against people of colour.

Following a long legal journey, on 11 August 2020, the Court handed down a groundbreaking judgment, declaring the use of AFR unlawful and bringing all forthcoming trials to a halt.

“I’m delighted that the Court has agreed that facial recognition clearly threatens our rights,” said Bridges. “For three years now South Wales Police has been using it against hundreds of thousands of us. We should all be able to use our public spaces without being subjected to oppressive surveillance.”

Although the verdict seemed a huge victory for civil rights campaigners, it did not amount to a ban on the technology, as the judges also acknowledged the benefits of using facial recognition to fight crime.

“There is nothing in the Court of Appeal judgment that fundamentally undermines the use of facial recognition to protect the public,” said Deputy Chief Constable Jeremy Vaughan. “This judgement will only strengthen the work which is already underway.”

South Wales Police overall welcomed the judgement, which gave them a clear direction to work on, stating that they would continue to develop AFR with the Home Office and conduct it in a more responsible way in the foreseeable future.

Beyond the police, supermarkets in Cardiff such as Tesco and Sainsbury’s have started installing new CCTV cameras at self-service checkouts, one for each till. It is an ongoing trial that has been running since 2018, first launched in England. Shoppers can see themselves filmed by a live camera overhead when checking out, which many have found “spooky”.

“That’s not facial recognition technology. It’s just live cameras… If there’s any issue, we can report to the police more effectively,” said Adam, the manager of a supermarket branch, who asked for his surname to be withheld.

However, most of the UK’s biggest grocery stores are reportedly keen to go a step further and introduce facial recognition in the coming years, enabling self-checkout registers to verify customers’ identity and age automatically.

In the meantime, some shops in England are already using face detection to catch shoplifters. Most of them use Facewatch, the UK’s leading facial recognition system, made by the tech firm of the same name. Installed in a store, it compares a customer’s face with a database of people with shoplifting or other criminal records, and sends a warning notification to staff when a match occurs.

“If someone triggers the alert, they’re approached by a member of management and asked to leave, and most of the time they duly do,” Paul Wilks, owner of a Budgens supermarket in Buckinghamshire, told The Observer.

It seems that sooner or later, the same surveillance system might come to South Wales, upgrading the existing CCTV cameras in stores and shopping malls with AFR. “I don’t agree with using it,” Adam said, “I think it’s intrusive. It intrudes people’s privacy.”

Asked whether he thinks using facial recognition in the supermarket can enhance security, Adam responded, “No, it won’t help. It’s not useful. As far as I’m aware, we won’t use that in the store.” But he added that he could only speak for his own branch, not for the company, which is in favour of developing facial recognition.

A supermarket branch in Cardiff has installed CCTV densely all over the premises.

Overall, despite the judgment and the controversy, the resolve of the police and private businesses to proceed with AFR trials shows no sign of abating. For preventing crime by identifying suspects and repeat offenders, the technology seems a big help to them.

“There has not been one single wrongful arrest as a result of the use of facial recognition by South Wales Police,” Mr Michael said, “There have been numerous successes in apprehending criminals who were wanted for some very serious offences including sexual offences, serious assaults and burglary.”

However, many underlying problems remain unsolved. For example, no constructive solution to AFR’s discrimination problem has been put forward. People of colour, who already suffer from unequal arrest rates, face a higher chance of being misidentified; real cases have already occurred in the UK.

Additionally, although the police said that data of citizens who are not on a list will be deleted at once after the trial, many don’t even want their biometrics information collected in the first place.

“The system is open to abuse,” said Vince. “Who monitors the watchlist, what is the criteria to be put on a watchlist, and why should thousands of law-abiding citizens have their biometric data checked?” Like Bridges and Liberty, who believe a total ban is the best antidote to this problematic technology, he seriously questions the use of AFR.

But on current trends, their demand seems doomed to fail. Given the government’s determination, the surveillance system that some citizens have found intrusive will likely resume before long, backed by more intelligent algorithms.

Ultimately, whether the use of AFR should be banned remains a fundamental question, and the answer perhaps depends on how the majority of citizens come to see it: as intrusive or as a safeguard, with no grey area in between.

Voices from residents in Cardiff

Passers-by in the city centre were asked the question below: do you think facial recognition surveillance can make Cardiff a safer place?

Abbas Khaal, 29, Grocery Store Assistant

“Yes, I think it definitely would make [the place] a lot safer because I work in this shop and I’ve got police daily coming in and they check CCTV, just to see who are criminals, who are doing some dodgy stuff.

“So, they should definitely keep on doing what they’re doing. I’m sure that’s one of the reasons that criminal activities are fewer [here], compared with some other countries where they don’t have all the cameras.”

Judith Dray, 30, Librarian

“I think it’s one of the things that might look tempting but it might lead to more problems like people kind of being picked up, and having quite bad racist problems come from it, black people being disproportionately picked up by it. I think it might give a full solution to some people but less safe.”

Alexander Pike, 26, Stylist / Fashion Designer

“Yes or no, it depends on who’s willing to have it done because I know there’s an invasion of privacy as well, isn’t it? If it’s just cameras around. That would be a help.

“I don’t know a lot about it [FR] but I know there are cameras everywhere, especially in Saint David [shopping mall], but we don’t know where they are because they’re kind of a hidden system but I think more could be enforced as well.”

Waseela Abbas, 24, Desk Operator

“Yes, it would be good because they’re stopping some form of harm that could happen to people when going to stadiums and other places, like a safeguard for everyone else.

“I wouldn’t mind if my face was detected because if it’s to make sure everyone is safe. It’s not like something to put on the Internet. If you are not the criminal, you have nothing to hide.”

Simon, 26, Volunteer of Tŷ Krishna Cymru

“I don’t think so. There will be more people arrested but it wouldn’t be safer. Because prison is not a way to stop crime. The right way is to educate people rather than suppression like you treat a symptom but you didn’t treat the real disease.

“I know we’re constantly being filmed [in public spaces]. I don’t really like it but at the same time, I understand the purpose of it. But they can be used for good and for bad purposes.”

Timeline of Ed Bridges’ legal challenge against AFR