In China, AI cameras alert police when a banner is unfurled

The surveillance technology is just one example of the proliferation of ‘predictive policing’ in the country.

A widely used Chinese video surveillance company sanctioned by Western governments incorporates an AI technology that automatically alerts authorities if a person is detected unfurling a banner.

The AI in cameras made by Dahua Technology appears to be explicitly aimed at quelling protests, according to IPVM, a U.S.-based surveillance research company that first reported the technology's existence.

Dahua deleted references to the system, called “Jinn,” after IPVM asked the company for comment, but an archived version of its website discusses its use for the purposes of “social safety” and “social governance” – terms frequently used by Chinese authorities to justify surveillance and arrests.

The detection system is just one example of the AI and government tracking technologies that have proliferated in China over the last several years, accelerated by the COVID-19 pandemic.

A series of mass technology procurements by police forces across China have greatly increased authorities’ abilities to clamp down on social freedoms, control citizens and, critics say, abuse groups targeted by the government.

'An alarm will be generated'

According to Dahua's archived webpage, the AI system was launched in 2021 and was still being offered as of May 2023.

Its debut appears to have coincided with a wave of police investment in geographic information systems across China in 2020.

Dahua surveillance cameras installed on the Dahua Technologies office building in Hangzhou, China, May 29, 2019. Credit: AFP

It is not known what police jurisdictions use this particular Dahua AI, but the company is a major provider of police technology, said Charles Rollet of IPVM.

“With the banner alarm – that's catering to the Chinese enterprise market: the big, usually police, authorities,” he said. “It's intended for police or some form of city authority … there's no reason to track them [banners] automatically unless you want to track protests, basically.”

Perhaps the most recognizable protest in China in recent years – the "White Paper" protests against strict COVID lockdowns – was sparked by a man unfurling a banner on a bridge last year, an indication of the technology's possible relevance for police, though it is not known whether banner-unfurling detection was used in that particular case.

Dahua, which is sanctioned by the U.S., U.K. and Australian governments, provides a number of predictive policing AI technologies that can surveil civilians using biometrics data. Previously, internal documents from the company showed that it provides facial recognition AI to track Uyghurs, which led to the Western sanctions. Dahua denied racial targeting.

A demo of the banner-unfurling AI filmed in 2020 was also posted on Dahua’s website before being deleted. “If a person holding a banner is detected within the camera field and lasts for a certain period of time, an alarm to police will be generated,” the demo explained.

Dahua did not respond to a request for comment from RFA.

Policing tech boom

The banner unfurling technology is a continuation of “the development of AI and how that technology is becoming really available” to Chinese police, said Rollet.

China is known to collect vast troves of data on its residents, and rapidly expanding AI technologies give authorities a new way to gather intel.

A solicitation for proposals for an AI tracking project from Shanghai police, also unearthed by IPVM last month, lays out some of the ambitions authorities harbor for using the vast data they have gathered.

“Traditional police work needs to be transformed into digital, intelligent and convenient simplified online operation,” it said. “The effective management of the model to make it play its biggest role has become an urgent problem in the development of public security technology.”

A man has his face marked for identification by technologies from state-owned surveillance equipment manufacturer Hikvision on a monitor at Security China 2018 in Beijing, China, Oct. 23, 2018. Credit: Ng Han Guan/AP

The project aims to create automatic alerts informing police of the movements of particular populations in the Songjiang district of Shanghai, a populous suburb home to many academics and university students.

The “target populations” the project seeks to automatically track include Uyghurs; foreigners with illegal residence status; faculty and staff members of key universities; foreign journalists stationed in China; foreigners who have visited Xinjiang or other similar areas; individuals with COVID vaccinations; suspected criminals, sex workers, and drug dealers; and families with abnormal electricity consumption.

According to a notice on its website that was later removed, Songjiang police awarded the project to a technology security firm, the Shanghai Juyi Technology Development Company, that appears to specialize in government contract work.

The Shanghai Juyi Technology Development Company did not respond to a request for comment.

As with Dahua, the Songjiang police removed the notice after IPVM publicized it in May, and RFA was unable to reach the project’s manager listed on the document.

The limits of Big Brother

The 26 categories of "target populations" in the Shanghai project correspond to what Chinese authorities consider "focus personnel," according to Maya Wang of Human Rights Watch.

“People who are petitioners, people who have a prior criminal record, people who have psychosocial disabilities and so on, … these groups of people are being monitored by the police” both physically and through technologies, Wang told RFA.

Chinese paramilitary firefighters stand on guard beneath a light pole with security cameras at the Great Hall of the People in Beijing, March 8, 2018. Credit: Mark Schiefelbein/AP

But the way in which AI is used to track people shows both the sophistication and artlessness in how Chinese authorities think about surveillance, said Geoffrey Cain, author of "The Perfect Police State," a book on Chinese surveillance.

The parameters they use – tracking the unfurling of a banner, or flagging jumps in household electricity use in the Shanghai police project – tend to work backward from behaviors that may be only vaguely connected to the censured activities authorities are trying to pre-emptively clamp down on, such as protesting or cryptocurrency mining.

“It reminds me back when this whole surveillance state really got kicking off around 2016 and 2017,” Cain said. “They were going after people who suddenly start smoking or drinking or people who suddenly, you know, purchase the items being used to make a tent.

“And it's not because there's any specific reason, but the reasons they would give is that those types of behaviors are suspicious. It's almost like they've arbitrarily chosen something that would be unusual,” he said.

“It's as if the authorities are moving backwards, putting the cause before the fact.”

Discrimination and danger

But there is real impact for the groups targeted.

Mass surveillance of Uyghurs in particular has been a key factor in enabling their persecution, said HRW’s Wang.

“Wherever they go in China, Uyghurs are essentially being singled out for discriminatory and targeted policing,” she said. “And that means that they often suffer – they often are unable to find a place to stay, a hotel. Typically, when they take the train, they are subjected to investigation and interrogation and so on.”

Visitors take photos near surveillance cameras as a policeman watches on Tiananmen Square in Beijing, July 15, 2021. Credit: Ng Han Guan/AP

According to a May analysis of Chinese police acquisitions of geolocation systems by China Digital Times, a specialist media firm, a wave of police investment in these tracking systems was first seen in 2017, and then again in 2020, increasing throughout the COVID-19 pandemic.

“Some contracts coincided with other government purchases of surveillance systems specifically designed to target Uyghurs,” the report noted. “There are also notable concentrations of procurement in regions with significant Uyghur or other minority populations.”

More broadly, the concern is that “these [AI surveillance] systems are all empowering authorities to violate human rights in different ways, depending on how they are used,” said Wang.

"And when they are so cheap and widely available – and, in the context of the Belt and Road Initiative, given Chinese government financing – they are spreading with detrimental impact on rights globally," she said.

Rollet agreed. “I could see this taking off in other countries,” he said. “I think the bigger risk is that it sets a precedent and gives other countries ideas about what they should do, you know?”

Edited by Boer Deng