
Chengdu, China, 18 October 2021: A citizen uses a facial recognition machine at the entrance of a 24-hour bookstore, which admits readers via the facial recognition system. (Photo by An Yuan/China News Service via Getty Images)

China Announces Facial Recognition Regulations

The Cyberspace Administration of China (CAC) released a draft set of regulations on the use of facial recognition in an attempt to put limits on the increased use of the technology in the country. 

The draft regulations are in an open comment period through 7 September 2023. They place restrictions on the use of facial recognition technology by businesses, but allow for national security use case exceptions, according to the proposed text in the notice on the "Regulations on the Safety Management of Face Recognition Technology Application (Trial Implementation)."

Press reports on the announcement depict regulations focused more on broad directions than explicit requirements. Some of those broad directions, as reported in CNBC and Reuters, include:

  • Nonbiometric means of identification are preferred over biometric means.

  • Facial recognition should only be used for a specific purpose and when other means of identification are not sufficient.

  • If facial recognition is necessary, then using a national system is encouraged.

  • Installations in public places should be for the purpose of maintaining public safety.

  • Businesses should not offer special or better services to people who use facial recognition versus those who choose not to.

Some measures in the proposed regulation are more explicit, including:

  • Organizations holding facial recognition data on 10,000 or more individuals must register with the CAC, and authorization is needed to retain facial images in their original resolution.

  • Signs must accompany any public installations of the technology.

  • Buildings may not require facial recognition as the only method of entering or exiting; they must provide equally convenient alternatives.

The regulations are part of China’s continued efforts to protect the privacy of its citizens from business overreach. Facial recognition in particular has spread rapidly across sectors, from hotel guest check-in to payment systems to apartment building access. The Reuters report even mentioned instances where bathroom toilet paper dispensers were equipped with facial ID systems.

The proposed regulations do have carve-outs for national security issues and do not restrict the government’s collection or use of information. Nor do the regulations appear to be designed to curb the country’s growing reputation as a surveillance state.

Reporting from The Register and TechCrunch noted what they called “credible” reports that the Chinese government routinely uses advanced surveillance systems, including facial recognition, in the widely publicized abuses of the Uyghur populations in the country’s northwest.

Elsewhere in the News...

Another biometrics news story of note was published in yesterday’s New York Times. Sam Altman, the CEO of OpenAI, the company behind ChatGPT, is cofounder of Tools for Humanity and its Worldcoin cryptocurrency project. At the crux of the project is the vision that artificial intelligence (AI) will displace multitudes of workers around the world, and Worldcoin will provide an income to sustain them.

The catch? Well, you have to protect this stream of income from AI itself, and the way to do that is “to record images of a person’s irises” and “convert those scans into bits of numerical code, which are supposed to serve as a new type of digital ID.”


The audacious goal is to scan the irises of all 8 billion inhabitants of Earth. And data collection has started, raising alarms.

“Last month, the authorities in France and Germany said they were investigating Worldcoin’s data collection practices,” the Times reported. “On Wednesday, the government of Kenya ordered Tools for Humanity to stop conducting scans, blaming a ‘lack of clarity’ in its handling of sensitive information.”

Stay tuned.