Why you should care

The British capital’s experiments with this technology could set the global standard.

On the last day of January, few of the shoppers and office workers who hurried through Romford town center in east London, scarves pulled tight against the chill, realized they were guinea pigs in a police experiment.

Officers sitting inside a parked van nearby were watching them on screens, using a new technology that the police hope will radically reduce crime in London — live facial recognition. Cameras stationed near Romford train station picked up every face that walked past and checked it against a police watch list of wanted criminals. A successful match would trigger an immediate arrest.

But the trial quickly stumbled into thorny issues. A bearded man in a blue baseball cap approached the surveilled area with his gray jumper pulled up to cover his face. He had just been informed by a bystander that the police were testing facial recognition in the area and did not want to participate.


The police demanded that he comply and scanned his face with a facial-recognition tool on a mobile phone. Although his face did not match that of any known criminal, a verbal altercation ensued, and the man was fined 90 pounds. The incident — caught on camera by journalists — was one of four similar stops in Romford that day, all of which have fueled acute concern among observers and civil-rights activists about facial recognition. Given that the technology represents overt surveillance, many believe that the explicit consent of citizens is fundamental — something the Romford man never gave.

“When people get stopped and searched in the street or fined for avoiding cameras when they don’t consent to being observed by cameras, that is a problem,” says Peter Fussey, a criminologist at the University of Essex who was present in Romford as an independent police-appointed monitor.

Because of its large number of operational CCTV cameras, London is now at the forefront of a battle over the use of facial recognition by the authorities that is escalating across many democratic countries. The Romford operation was one of 10 such trials around London carried out by the Met Police over a period of three years, including twice at Notting Hill Carnival. Over the course of the trials, police planned to gather evidence about the accuracy and bias of the system and to assess whether the use of the dragnet technology could be justified by its potential benefits — preventing or solving major acts of violent crime.

As the technology has become commercially available in recent years, the biggest uptick in use has been in countries with authoritarian political systems — most notably in China, which uses facial recognition as part of its extensive and highly intrusive surveillance of Muslim Uighurs in the Xinjiang region, a practice denounced by human-rights groups. But the use of facial recognition by two British forces — London’s Metropolitan Police and the South Wales Police — has also sparked a national debate about where people will draw the line to protect their right to privacy. The discussion centers on whether there is any legal basis for using live facial recognition on the general population and whether blanket use of the technology fundamentally undermines the rights of citizens. It comes on the heels of U.S. cities such as San Francisco and Oakland choosing to temporarily ban facial-recognition use by public bodies until regulations are in place.

Many of these issues could come to a head in a legal case in the United Kingdom. In May, Ed Bridges, who lives in Cardiff, the Welsh capital, brought one of the first legal challenges to police use of facial recognition on the grounds that it is a breach of the Human Rights Act 1998. The outcome of the case could set a precedent around the world, from the U.S. to India and Australia, where facial recognition is being quietly tested.

“We are not aware of anywhere live facial recognition is being used for general public surveillance, except in China,” says Silkie Carlo, executive director of Big Brother Watch, a civil-rights campaign organization that has brought a separate legal challenge against the Met Police’s use of the technology in London. “It’s really alarming for Britain to go down this path and set this precedent not only for other democracies but certainly for less liberal states.”

***

An estimated 420,000 CCTV cameras operate in and around London, making it the second-most monitored city in the world, after Beijing with its 470,000 cameras, according to the Brookings Institution. (Washington, D.C., in third place, has just 30,000.) Many were put in place in the early 1990s in response to IRA bombings in the city, followed by waves of installations after the September 11 attacks and the 2005 London Underground bombings, and in the run-up to the 2012 Olympics.

For years, the cameras dotted around the city were “dumb” devices, peepholes that did not know what they were looking at. However, advances in artificial intelligence, along with the dropping cost of the cameras themselves, have transformed the business of visual surveillance. Machine-learning algorithms trained to recognize specific people, objects or strange behaviors have supercharged these cameras, allowing them to effectively “see.”
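
In principle, the matching step at the heart of such systems is simple: each detected face is reduced to a numerical encoding and compared against the encodings of people on a watch list. The sketch below illustrates that step using the open-source Python library face_recognition, not the proprietary software deployed in London; the file names and the 0.6 distance threshold are illustrative assumptions.

    import face_recognition

    # Encode a reference photo of a person on the watch list.
    # "suspect.jpg" and "street_frame.jpg" are hypothetical file names,
    # and each image is assumed to contain at least one detectable face.
    suspect_image = face_recognition.load_image_file("suspect.jpg")
    watchlist = face_recognition.face_encodings(suspect_image)  # 128-number encodings

    # Encode every face found in one frame from a street camera and
    # compare each against the watch list.
    frame = face_recognition.load_image_file("street_frame.jpg")
    for encoding in face_recognition.face_encodings(frame):
        distances = face_recognition.face_distance(watchlist, encoding)
        # 0.6 is the library's conventional default tolerance, not a
        # police-calibrated threshold; smaller distances mean closer matches.
        if distances.min() < 0.6:
            print(f"Possible match (distance {distances.min():.2f}): flag for review")

Run against a live video stream, the same comparison repeats many times a second, which is why even a small false-match rate can flag large numbers of innocent passers-by. That is the accuracy question at the heart of the London trials.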

Over the next five years, the number of so-called smart cameras in public and private spaces, from schools to public toilets to hospitals, is expected to increase exponentially across London, according to a report by Tony Porter, the U.K.’s surveillance camera commissioner. This will create a smart city that will itself become the eyes and ears of an overburdened law-enforcement system.

Elements of this future are already being used by public bodies in London. Transport for London, which is responsible for the public transit system, has used AI to analyze footage from cameras to spot unusual behaviors such as lingering pedestrians or suspicious baggage. Local councils, such as Newham, have tried smart CCTV that sends automatic alerts to officials about events such as crowd build-up or suspicious objects. In NHS hospitals, millions of patients are exposed to “ever-increasing surveillance technology from drones and body-worn video to automated facial recognition,” Porter said in the January report.

Smart CCTV is also being pioneered by the private sector. Convenience stores like Budgens and supermarkets including Tesco, Sainsbury’s and Marks & Spencer have cameras that are already, or soon will be, capable of facial recognition. They will be used for applications ranging from crime prevention to estimating the age of those buying alcohol or cigarettes.

Yoti, a British technology startup, is rolling out its facial-analysis software in over 25,000 convenience stores in the next four months to estimate the age of customers; another London startup, Facewatch, says its software, which can recognize known criminals, has been tried out by a number of high-street retailers in the past two years and will soon be included in 550 stores across London. Facewatch has been in talks to sign data-sharing deals with the Metropolitan Police and the City of London police.

***

Yet the lines between the use of facial-recognition technology by private firms and by the public sector are blurring. Many surveillance camera systems in public places are operated by private companies, which give law enforcement free access to their footage. The police’s own facial-recognition systems, meanwhile, are built by commercial organizations, which can raise other issues: Japanese technology company NEC provides the cameras used by both the Metropolitan and South Wales Police.

Hannah Couchman at Liberty, a nonprofit organization that is supporting the case against the South Wales Police, says that any examination of the technology requires access to training data and algorithms, but NEC sees that information as a trade secret. “That overlap between government and private companies leads to a lack of transparency that is inevitable,” she says. NEC declined to comment.

Some experts believe the close collaboration between public and private sectors is a growing problem because there is currently no ethical or regulatory framework for the private use of surveillance technologies. “All these companies have customers, but they can’t share who their clients are,” says Stephanie Hare, a campaigner and researcher. “We have no data about how companies are using it. Nobody has oversight. It’s a total free-for-all.”

U.K. laws governing the use of biometrics, including our facial data, have not been updated since 2012 and focus primarily on DNA and fingerprints. That needs to change urgently, says Paul Wiles, the U.K.’s independently appointed biometrics commissioner. “Biometrics is emerging in both the public and private sector, and the important question is who will share what with whom, and who decides,” he says.

Fussey, the University of Essex criminologist, agrees, saying the current laws around facial recognition are “completely inadequate.” He adds, “Parliament is so choked up with Brexit there is no appetite for law changes, but there isn’t a legal basis. It is unfair on the public and on the police.”

MPs on the Science and Technology Committee last week urged the Home Office to impose a moratorium on all facial-recognition trials until regulations can be established. The call has been backed by independent experts including AI researchers at the Ada Lovelace Institute, the biometrics commissioner and the U.K.’s data protection authority, which said the use of facial recognition in public spaces represented “a real step-change in the way law-abiding people are monitored as they go about their daily lives.”

However, the police believe the potential benefits are significant. “Live facial-recognition technology has the potential to help our officers locate criminals who are wanted for serious and violent offenses, such as knife and gun crime, and the sexual exploitation of children,” says Deputy Assistant Commissioner Duncan Ball of the Met Police. “The public would expect the Met Police to use all available and proportionate means to catch violent offenders, and it is right that we trial emerging technology that could help us to do so.”

Before the end of the year, the Welsh courts will deliver a judgment on the case brought by Ed Bridges against the South Wales Police. “The inevitable result is that people will change their behavior and feel scared to protest or express themselves freely,” Bridges said in his crowdfunding appeal for the case. “In short, we’ll be less free.”

By Madhumita Murgia

OZY partners with the U.K.'s Financial Times to bring you premium analysis and features. © The Financial Times Limited 2019.
