China’s New National Laboratory For Advanced Policing Heralds New Era In Surveillance Technology


The world over, there is friction between the desire of citizens for governments, local authorities and police forces to make use of the latest technology to prevent and prosecute crime, and the often equally strongly voiced demand for personal privacy. The reality is that these two imperatives often come into conflict.

In China, that conflict is neatly resolved for its citizens and visitors. They don’t face the cognitive dissonance of wanting the kind of surveillance technology that can prevent or help prosecute crime while knowing it inherently infringes on personal privacy, because they have no choice in the matter. The state decides what’s in the best interests of its citizens and, if they disagree, the authorities can’t be voted out of power.

And China’s authorities have definitively arrived at the conclusion that ignoring individual sensibilities around personal privacy is a price worth paying to ensure maximum law and order. Surveillance technology is increasingly being utilised by national and local authorities.

The line between whether the technology is used to police or to oppress in China may not always be clearly drawn as far as Western observers are concerned. But, setting aside politics and broader questions around human rights, China clearly feels it has little choice but to leverage the technology available to it to police its huge population and vast expanse.

China, with 1.4 police officers per 1,000 people according to Interpol, is one of the least-policed countries in the world if defined purely on a metric of manpower. As such, it is understandable that its authorities are keen to bolster a thinly stretched human police force with surveillance and other new forms of policing technology.

To that end, China Electronics Group Corporation, the country’s biggest military technology company, has set up a national laboratory that will research new technology for ‘advanced policing’. AI for crime prediction and surveillance cameras able to recognise the emotional state of the individuals in range are just two of the fields of technology the lab has been tasked with developing to the point they can be deployed by China’s police and national security forces.

The lab, named the ‘National Engineering Laboratory (NEL) for Big Data Application on Social Security Risks Sensing, Prevention & Control’, is not a physical facility but rather a network of researchers funded by grants to work on particular projects. According to the Financial Times:

“Its research endeavours include anticipating where crimes might happen based on extrapolating from previous spatial patterns, reconciling surveillance footage from different cameras to piece together a person’s movements and using video to automatically recognise their emotions”.

Documents leaked to local media in China suggest that NEL’s Beijing office has already awarded 16 grants worth around £330,000 each to researchers selected by a panel that included Fan Lixin, Xinjiang’s deputy police chief. One of the grantees, the only overseas recipient, was Professor Tao Cheng of University College London.

Professor Cheng’s lab, the UCL Space Time Lab, also counts the UK’s Metropolitan Police among its clients. The lab analyses large data sets to create hourly or daily crime prediction maps.

This aptly demonstrates that China is far from the only country increasingly leveraging the latest technology to help it police itself more efficiently within tight budget constraints. Other countries, the UK included, may have less politicised definitions of crime than China does and tighter controls on how and when surveillance and other new policing technologies are used.

But the fact that technology that can be, and increasingly is, used in policing already exists, and continues to develop at pace, means that the debate over how the balance between public security and personal privacy should be struck will only grow louder in coming years.

Here are some of the technologies that the UK and the rest of the world will have to make decisions on in coming years, along with others already in use that you probably had no idea about:

Gun Detection Software

Vox reporter Rebecca Heilweil has investigated some of the new surveillance technology the NYPD might be using. As part of the investigation, she filed a request for public records on contracts the NYPD has with different security companies.

She was particularly interested in whether gun detection software is already being utilised in New York. Such software uses artificial intelligence to automatically send an alert when a brandished firearm appears in a video feed, such as footage from CCTV cameras.
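
To make the idea concrete, below is a minimal sketch of what such a detection-and-alert loop might look like, assuming an object-detection model that has already been fine-tuned to recognise firearms. The weights file, stream URL, class name and alert function are illustrative placeholders, not details of any system the NYPD may or may not be using.

```python
# Sketch of a gun-detection alert loop over a CCTV feed.
# Assumes a hypothetical fine-tuned detector ("firearm_detector.pt") whose
# label set includes a "firearm" class; neither is a real product.
import cv2
from ultralytics import YOLO

ALERT_THRESHOLD = 0.80                     # only alert on high-confidence detections
model = YOLO("firearm_detector.pt")        # hypothetical fine-tuned weights

def send_alert(frame, confidence):
    """Placeholder: a real system would notify a control room or dispatcher."""
    print(f"ALERT: possible firearm detected (confidence {confidence:.2f})")

capture = cv2.VideoCapture("rtsp://cctv-feed.example/stream")  # hypothetical CCTV URL
while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    for result in model(frame):            # run detection on each frame
        for box in result.boxes:
            label = model.names[int(box.cls)]
            confidence = float(box.conf)
            if label == "firearm" and confidence >= ALERT_THRESHOLD:
                send_alert(frame, confidence)
capture.release()
```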

On filing her request, she described the response she received:

“But just over 48 hours after filing it, my entire request was rejected because, the NYPD said, those documents would reveal ‘non-routine techniques and procedures’ — with no elaboration.

So that same day, I appealed, and about a week later I heard back from the department. Now I had a new explanation: Answering my requests could reveal these companies’ trade secrets, harm their ‘competitive position,’ or impact ‘imminent contract awards or collective bargaining negotiations.’ If I wanted to fight further, I’d need to go to court”.

Gun detection software that works with surveillance camera footage exists. Whether the NYPD or other police forces around the world have already started testing it is less clear.

Facial Recognition Technology

In January of this year, London’s Metropolitan Police announced that it would begin using facial recognition to spot criminal suspects with video cameras as they walk the streets. The software can detect in real time if a person caught on camera matches one of the faces in a database of suspects.
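
As an illustration of the underlying matching step, here is a minimal sketch using the open-source face_recognition library: enrol a small watchlist of suspect images, then compare any face found in a camera frame against it. The image files, the two-entry watchlist and the distance threshold are assumptions for the example, not details of NEC’s or the Met’s actual system.

```python
# Minimal watchlist-matching sketch with the face_recognition library.
# Filenames and the 0.6 distance cut-off are illustrative assumptions.
import face_recognition
import numpy as np

MATCH_THRESHOLD = 0.6  # the library's commonly used distance cut-off

# Build the "suspect database": one face embedding per enrolled image.
watchlist = {}
for name, path in [("suspect_a", "suspect_a.jpg"), ("suspect_b", "suspect_b.jpg")]:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        watchlist[name] = encodings[0]

def match_frame(frame_path):
    """Compare every face found in a camera frame against the watchlist."""
    frame = face_recognition.load_image_file(frame_path)
    names = list(watchlist)
    for encoding in face_recognition.face_encodings(frame):
        if not names:
            break
        distances = face_recognition.face_distance(
            [watchlist[n] for n in names], encoding)
        best = int(np.argmin(distances))
        if distances[best] <= MATCH_THRESHOLD:
            print(f"Possible match: {names[best]} (distance {distances[best]:.2f})")

match_frame("camera_frame.jpg")
```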

The ‘real time’ element goes a step further than most facial recognition software used elsewhere, which is only used to match images from CCTV cameras on request. The introduction of the technology in London has divided opinion between those who believe it is a necessary step for a city that has suffered a number of terrorist attacks in recent years and others who argue it violates personal privacy.

Other buyers of the technology, which is licensed from Japan’s NEC, include Surat, a city of about five million people in India, and the country of Georgia, according to the company’s website. However, its use is still relatively rare outside of China.

Crime Prediction Algorithms

Around 14 UK police forces are now reportedly making use of AI-powered crime prediction software. Crime prediction software falls into two main categories.

The first, which is less controversial, involves “predictive mapping”. The software parses large databases of historical incidents to map out crime “hotspots”, taking into consideration the time of day as well as particular dates, events and seasonality. Patrols are then ramped up and down based on the predicted likelihood of need.
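
A rough sketch of how predictive mapping can work in practice is below, assuming a simple historical incident log with latitude, longitude and timestamp columns (the file and column names are assumptions for the example): bin incidents into grid cells and hours of the day, then rank the cells to decide where extra patrols might go.

```python
# Sketch of grid-based hotspot mapping from a historical incident log.
# The CSV file and its columns (lat, lon, timestamp) are assumed inputs.
import pandas as pd

CELL_SIZE = 0.005   # grid cell size in degrees (roughly 500 m); illustrative choice

incidents = pd.read_csv("historical_incidents.csv", parse_dates=["timestamp"])
incidents["cell_x"] = (incidents["lon"] // CELL_SIZE).astype(int)
incidents["cell_y"] = (incidents["lat"] // CELL_SIZE).astype(int)
incidents["hour"] = incidents["timestamp"].dt.hour

# Count past incidents per grid cell and hour, then rank: the top rows are
# the "hotspots" a force might choose to patrol more heavily at that hour.
hotspots = (incidents
            .groupby(["cell_x", "cell_y", "hour"])
            .size()
            .rename("incident_count")
            .reset_index()
            .sort_values("incident_count", ascending=False))

print(hotspots.head(10))
```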

The second kind of crime prediction software is called “individual risk assessment”, which attempts to predict how likely an individual is to commit an offence or be the victim of a crime. Civil liberties groups have accused this kind of software of perpetuating inherent biases.

Liberty, one such group, recently produced a report which levelled the accusation:

“These opaque computer programs use algorithms to analyse hordes of biased police data, identifying patterns and embedding an approach to policing which relies on discriminatory profiling”.

A spokesperson for Avon and Somerset Police, one of the forces which uses the ‘Qlik’ system, responded:

“We make every effort to prevent bias in data models. For this reason the data… does not include ethnicity, gender, address location or demographics.”
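
To make the contrast with predictive mapping concrete, here is a heavily simplified sketch of what an individual risk assessment model can look like: a classifier trained on historical case records to output a probability of reoffending. The file and column names are assumptions for illustration, and, as Liberty’s criticism implies, excluding demographic fields (as Avon and Somerset describe) does not by itself remove bias that may already be embedded in the underlying records.

```python
# Sketch of an "individual risk assessment" classifier on hypothetical data.
# The training file and feature names are illustrative assumptions; this is
# not any force's actual model. Bias in the historical records themselves
# can still influence the scores even without demographic columns.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

records = pd.read_csv("historical_case_records.csv")  # hypothetical training data

# Deliberately exclude ethnicity, gender, address and other demographics,
# keeping only case-history style features (assumed column names).
features = ["prior_offence_count", "months_since_last_offence", "age_at_first_offence"]
X = records[features]
y = records["reoffended_within_2_years"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# The model outputs a probability, which a force might bucket into
# low / medium / high "risk" bands for triage.
risk_scores = model.predict_proba(X_test)[:, 1]
print(risk_scores[:5])
```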

Emotion Recognition Technology

At least for now, China appears to be the only country that is officially testing emotion recognition technology as part of its approach to policing. This takes things to a new level, aiming to predict an individual’s behaviour by reading their emotional state from surveillance camera footage.

Li Xiaoyu, a policing expert from Xinjiang, the northwestern region where the technology is being tested, explains that the technology uses cameras to scan people’s faces and ‘read’ their facial expressions in order to “identify criminal suspects by analyzing their mental state… to prevent illegal acts including terrorism and smuggling.”
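
For a sense of the pipeline being described, here is a minimal sketch: detect faces in a frame with OpenCV, then hand each face crop to an expression classifier. The classifier below is a deliberately empty placeholder and the image file is an assumption; none of this reflects the actual system being tested in Xinjiang.

```python
# Sketch of an emotion-recognition pipeline: face detection with OpenCV,
# followed by a placeholder expression classifier for each detected face.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_crop):
    """Placeholder for a trained expression model (angry, calm, fearful...).
    A real pipeline would run a CNN here; this stub just returns a label."""
    return "neutral", 0.0

frame = cv2.imread("camera_frame.jpg")               # hypothetical surveillance still
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    label, score = classify_expression(frame[y:y + h, x:x + w])
    print(f"Face at ({x},{y}): predicted expression '{label}' ({score:.2f})")
```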

The Future Of Crime Prevention and Policing

That technology will play an increasing role in policing and crime prevention around the world, as in almost every other sector imaginable, is inevitable. Its role will range from uncontroversial software that improves efficiency in the administration of resources to surveillance technology that will inevitably provoke fierce debate on the competing interests of communal safety and the right to privacy, with the danger of biased and potentially inaccurate profiling at the centre of that debate.

Emotion recognition sounds like the most extreme extension of surveillance technology to date, but technology that is just as controversial and disconcerting, if not more so, will almost certainly come to light over coming years. Where, when and how these technologies are used will undoubtedly be a major point of policy conflict in the UK and elsewhere around the world in the years and decades ahead.

