Face the future

Big Brother is watching – and can now recognise you in a crowd. Security cameras in public places have been quietly trialling facial recognition technology


There is nothing eye-catching about these cameras. CCTV equipment is so common that few people notice it as they go about their daily lives. But the new cameras are a tad smaller, and they are fixed quite low down compared to the more familiar ones high on walls and roofs. That’s because they need to get a good look at your face.

The Metropolitan Police’s trial of AFR found it was inaccurate and flagged up innocent people

This is to check your facial biometrics – a range of measurements between features like eyes, nose, lips, chin and forehead. When put together they draw a digital map that is as unique to you as your fingerprints. Instantly converted into computer code, the map of your face is compared with others which have been harvested from millions of scanned faces. If your face fits the image of someone who is of interest to the police, you may get a tap on the shoulder.
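
How that comparison works in practice varies between vendors, but the basic idea can be sketched in a few lines of code: the face map becomes a list of numbers, and a "match" is simply a stored list of numbers that lies close enough to it. The templates, watch list and threshold below are invented purely for illustration; real systems use learned templates with hundreds of dimensions and their own tuned thresholds.

```python
# Minimal sketch of watch-list matching, with made-up numbers.
import math

def distance(a, b):
    """Euclidean distance between two face templates (lists of numbers)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical watch list: name -> stored face template
watch_list = {
    "person_of_interest_1": [0.12, 0.80, 0.33, 0.57],
    "person_of_interest_2": [0.91, 0.05, 0.46, 0.22],
}

def check_face(live_template, threshold=0.25):
    """Return the closest watch-list name if it falls within the threshold."""
    name, best = min(
        ((n, distance(live_template, t)) for n, t in watch_list.items()),
        key=lambda item: item[1],
    )
    return name if best < threshold else None

# A passer-by whose face map happens to sit close to a stored template
print(check_face([0.14, 0.79, 0.30, 0.60]))  # -> person_of_interest_1
```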

There have been several secret trials of automatic facial recognition (AFR) technology in the UK, most of them in the north of England. One was run last year by South Yorkshire Police at the Meadowhall shopping centre in Sheffield, where two million people are thought to have been scanned over the course of four weeks.

British Land, which owns Meadowhall, says that once the trial was completed there was no further facial recognition scanning at the site and all data gathered during the trial was deleted. South Yorkshire Police stated that the decision was taken to use facial recognition cameras at Meadowhall “in order to develop a better understanding of opportunities associated with the use of this technology”.

A much larger trial was run in Manchester’s huge Trafford Centre, where every visitor had their face scanned between April and September last year and their facial map compared against a Greater Manchester Police watch list of wanted criminals and missing people. Despite scanning an estimated 15 million faces, however, the AFR technology succeeded in making just one positive identification – a man who was wanted for recall to prison.

The trial was halted when the government’s own Surveillance Camera Commissioner, Tony Porter, who acts as a sort of watchdog over the watchdogs, raised serious concerns. He later wrote in his blog: “Compared to the size and scale of the processing of all people passing a camera, the group they might hope to identify was minuscule.”

Porter says he expects the integration of CCTV and facial recognition technology to continue to grow and predicts that more police and public partnerships in AFR usage like those at Meadowhall and the Trafford Centre are likely. But he points out: “There is a world of difference between an enhanced shopping experience and focused state surveillance, particularly when conducted on a mass scale.”

Care is required to make sure there is a clear dividing line, he warns, adding that the government must ensure there is legislation to provide safeguards against AFR technology misuse.

Another trial took place at Liverpool’s World Museum, opposite the city’s St John’s Gardens, during its highly popular Terracotta Warriors exhibition between February and October last year. The organisers claim that 600,000 visitors were drawn to the display of life-sized statues uncovered at a 2,200-year-old Chinese imperial tomb.

The museum insisted it had used the technology only after discussions with Merseyside Police because of a heightened risk of terrorism at the time. It added that signs were in place around the venue informing visitors of AFR technology in operation.

Silkie Carlo, who heads the civil liberties group Big Brother Watch, points to a “dark irony” in the technology scanning faces of people visiting an exhibition on ancient China. “This authoritarian surveillance tool is rarely seen outside of China,” she says. China has about 200 million surveillance cameras and uses them as part of a programme called Xue Liang – Sharp Eyes – which awards citizens “trustworthiness” points and monitors those considered untrustworthy.

Big Brother Watch is campaigning for live facial recognition to be banned in public spaces. Without control, Carlo says, people may have their identities checked whenever they leave home.

“These cameras can record who you are, where you go, who you go there with. A lot of people are angry and upset about this, and so they should be. In a democracy citizens don’t expect to go to a shopping centre or a museum and be under covert surveillance. The potential for what can be done with the technology to expand the surveillance state is really staggering.”

“It is time the government recognised the danger this dystopian technology presents.”

One of the main concerns of civil liberties groups like Big Brother Watch is that AFR works like a catch-net. The concept behind it is to video everyone who walks past and check whether they are “good” people or “bad” ones. Even if the technology were 99.9 per cent accurate, scanning millions of people would still leave many at risk of being wrongly identified as “bad”. An independent review into the Metropolitan Police’s four-year trial of AFR found that it was 81 per cent inaccurate and had flagged up perfectly innocent people.
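
To see why scale matters, consider some back-of-envelope arithmetic. The figures below are assumptions chosen for illustration only, not numbers from any police trial: a crowd of 15 million scanned faces, 100 genuinely wanted people among them, and a system that is “99.9 per cent accurate” in the sense of wrongly flagging 0.1 per cent of innocent passers-by.

```python
# Illustrative base-rate arithmetic; all figures are assumptions.
scanned = 15_000_000          # faces scanned
wanted_in_crowd = 100         # genuinely wanted people in that crowd
false_match_rate = 0.001      # flags 0.1% of innocent passers-by
true_match_rate = 0.999       # correctly flags 99.9% of wanted people

innocent = scanned - wanted_in_crowd
false_alarms = innocent * false_match_rate      # innocent people flagged
true_hits = wanted_in_crowd * true_match_rate   # wanted people flagged

print(f"False alarms: {false_alarms:,.0f}")     # ~15,000 innocent people
print(f"True hits: {true_hits:,.0f}")           # ~100 wanted people
print(f"Share of flags that are wrong: "
      f"{false_alarms / (false_alarms + true_hits):.1%}")
```

On those assumptions, roughly 15,000 innocent people would be flagged for every 100 wanted people found, so about 99 per cent of the “matches” would be wrong even with a very accurate system.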

“We saw this happen when we were allowed to be observers at trials,” says Carlo. “One 14-year-old boy on his way home from school was swooped on by four plain clothes police officers. He didn’t know what was going on. He was a young black boy, and typically the misidentifications are often young black men, a group that appears to be disproportionately represented on facial recognition systems.”

Some people are kicking back against what they see as an infringement of their human rights. In September, Ed Bridges, a 36-year-old office worker and former Lib Dem councillor, crowdfunded a legal challenge to South Wales Police’s use of AFR cameras in Cardiff city centre. He was represented by the human rights campaigning organisation Liberty, which, like Big Brother Watch, wants AFR trials to end.

After his case was dismissed Megan Goulding, one of Liberty’s lawyers, said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms. It is a highly intrusive surveillance technology that allows the police to monitor and track us all. It is time the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”

Bridges has launched an appeal against the court’s decision, seeking a ruling that South Wales Police’s use of AFR is unlawful. He says: “This sinister technology undermines our privacy and I will continue to fight against it.”

In May, a man was fined £90 for refusing to show his face to police who were trialling the technology in east London, where clear signs explained that AFR cameras were in use. He had pulled his jumper up over his mouth.

Without any controls on their operation, AFR systems have been introduced to a small number of bars and shops, and private use seems set to grow. A startup tech company called Facewatch claims it has sold systems to at least 15 “household name” retailers. Facial scans of customers are matched with a rogues’ gallery of people such as suspected shoplifters or credit card fraudsters.

Facewatch says the primary aim is for AFR to act as a deterrent, and last month the company admitted it had used a career criminal in Leeds as an adviser. It claims there will be at least 5,000 AFR cameras in private use across the UK by 2022.

With an unnamed UK supermarket chain reportedly planning to install them at entrances, get ready for your close-up.


Interact: Responses to Face the future

  • Carl Gohrigner
    03 Dec 2019 09:43
    Big Brother Watch like saying that the system is 81% inaccurate, even though they know full well it is not true. Consider this. There are 1,000 people in front of you, 6 of whom are criminals:
    - You deploy an automated system to ASSIST.
    - The system instantly assesses all 1,000 people and selects 10 for you to assess.
    - 5 of the selected 10 people are from the 6 criminals. The other 5 are not criminals.
    - YOU (NOT the system) assess these 10 people to make a determination.
    - There are 994/1,000 non-criminals. You deem 995 are non-criminals; 1 slipped through the net.
    - The system’s false reject rate was 1/1,000 (the missed criminal) = 0.1%.
    - The system’s false accept rate was 5/1,000 (the non-criminals it selected for you to assess) = 0.5%.
    - The accuracy of the system was NOT 50% (5/10).
    - You only needed to assess 10 people, not 1,000.
    - You caught 5 of the 6 criminals, which you otherwise would not have.
    If the system was 81% inaccurate, as reported, it would have selected over 800 people, not 10, from the 1,000 for you to manually assess. Also, the police are not tracking you unless "you" are a criminal. Only the people in their database are tracked. I fully agree. A debate is necessary. A collective decision on the appropriate use of this technology is required. It should not be blanket deployed. Some uses are intrusive. But let's base the debate on fact and not misinformation please.
