Why you should care
China is turning sci-fi policing films into reality.
In the 2011 CBS show Person of Interest, reclusive computer scientist Harold Finch builds an artificial intelligence system called “the Machine” that compiles and analyzes troves of data to predict murders. Finch and his associate, John, then chase down the perpetrator and prevent the crime. Public security officials in the Chinese autonomous region of Xinjiang, it turns out, are turning that fiction into fact.
A recently revealed data-driven program in China’s politically sensitive northwest pulls from health records, financial data, vehicle checkpoints and police reports to flag individuals deemed likely to commit crimes. Launched in 2016, the “big data” policing initiative in Xinjiang is one of many law enforcement programs the country is rolling out, drawing on its growing capabilities in the fields of AI and robotics.
While a number of countries … boast the technological capabilities China is integrating into law enforcement, political considerations make it difficult for these countries to follow suit.
Kam C. Wong, former Hong Kong police officer and professor emeritus at Xavier University
At Beijing’s Tiananmen Square, stun gun-wielding robots patrol crowds of tourists. While the robots negotiate their own path along designated routes, the stun guns are activated by an officer controlling the bot remotely. In Zhengzhou, the capital of China’s central Henan Province, similar police robots that look like armless Daleks roam the high-speed train station. They use facial-recognition software to help officers identify suspects, and they interact with passengers and answer their questions. Police officers at the station wear facial-recognition sunglasses, developed by Beijing-based tech company LLVision, which pick out fake IDs and identify wanted criminals. And in the central metropolis of Wuhan, the Ministry of Public Security has teamed up with tech giant Tencent to develop a fully automated police station driven by the latter’s AI technology.
These technologies, which could help fix weaknesses in China’s public security infrastructure, have also sparked concerns about misuse. But they highlight key factors helping China pull ahead in the use of AI in policing. The country, which recently declared its goal of emerging as a global leader in AI and robotics, is channeling heavy investments into these fields. And as a single-party state, it can marry that technology with policing while facing fewer questions than its democratic counterparts.
“While a number of countries, including the U.S., boast the technological capabilities China is integrating into law enforcement, political considerations make it difficult for these countries to follow suit,” says Kam C. Wong, a former Hong Kong police officer and professor emeritus at Xavier University. The use of AI in policing does happen in the U.S., but it usually requires exceptional circumstances, he says.
For the moment, most initiatives are in a pilot phase and, therefore, small in scale. But the attraction of AI and robotics in policing is immense in a country with a fluid and diverse population, some 1.3 billion strong, and a relatively underskilled and undermanned police force. When public security is the top priority — and human rights sit somewhere near the bottom — as in China, that attraction can be almost irresistible.
According to Eda Erbeyli, project manager at Shanghai-based Daxue Consulting, technology is helping close the gap between urban and rural policing. In Shenzhen, the Municipal Public Security Bureau is developing a three-dimensional intelligence cooperation system where city, town and rural police can share content from a huge data cloud.
Despite such initiatives, rural police lack the basic infrastructure to develop the levels of control implemented in China’s cities. The vast majority of the country’s 170 million closed-circuit TV cameras are located in urban areas, according to Erbeyli, and those in the countryside are usually not connected to big-data platforms.
Globally, the use of automation in policing is raising far greater concerns than just effectiveness and infrastructure inadequacies. To strike a balance between privacy, human rights and the huge potential of these controversial technologies, full transparency throughout the transition is vitally important, according to Simon Clifford, who sits on a number of police technology bodies, including serving as digital adviser to the Police ICT board for England and Wales.
It’s unclear if this is happening in China. A directive bars police in Beijing from discussing police policy with anyone outside the force, particularly the media. Debate about the use of big data and predictive policing has attracted the attention of China’s online censors.
“They can do these things in countries where there is no democratic oversight,” says Gloria Laycock, founding director of the Jill Dando Institute of Security and Crime Science at University College London. Police successfully reviewed CCTV footage and identified suspects in the wake of the 2011 London riots, she acknowledges. But automated tracking through AI would make people uncomfortable, she says. “People in democratic societies may trust the government, but they remain nervous of ‘the State,’ without really knowing what that is,” she says.
The response from Chinese netizens has largely been derision rather than worry, with the appearance of police robots drawing the bulk of online attention. One Weibo user described the robots in Zhengzhou as “too cute,” while another imagined one of the machines mediating an argument between two angry citizens.
But concerns have also emerged about China’s end goal, and whether the use of advanced tech in policing is about more than just crime prevention. “The big-data program and the expanding use of CCTV and facial recognition for tracking citizens is actually about governance and controlling the public,” says Wong.
Human Rights Watch has voiced concerns over the use of data in policing areas of Xinjiang where ethnic minorities are frequently targeted by public security initiatives. But while these concerns are warranted, especially when it comes to detaining people for crimes they haven’t yet committed, Laycock cautions against seeing all technology as inherently threatening. In many cases, she suggests, technology can improve efficiency and offset cuts in police force numbers. Questions remain as to whether China will find this balance, but it is charging ahead regardless.