
Illustration by iStock; Security Management

Settlement Would Restrict Database Sales, But Not Impact Clearview AI’s Overall Business

A settlement filed in the U.S. state of Illinois on 9 May would place restrictions on Clearview AI’s ability to sell or provide free access to its facial vector database for use in the United States, but not limit its ability to do business with most government entities.

Clearview AI is one of the largest companies selling a facial matching product built on a database that contains more than 10 billion facial images from public Web sources, including news media, mugshot websites, social media sites, and more. Law enforcement is a regular Clearview AI customer, including U.S. federal agencies that recently used the company's products to identify individuals involved in the U.S. Capitol attack on 6 January 2021.

The American Civil Liberties Union (ACLU) filed suit against Clearview AI in May 2020, alleging the company “repeatedly violated the Illinois Biometric Information Privacy Act (BIPA), a law adopted in 2008 to ensure that Illinois residents would not have their biometric identifiers—including faceprints—captured and used without their knowledge and permission,” according to an ACLU statement.

The settlement, which must still be approved by the court, would prohibit Clearview AI from selling its database to private entities or private individuals—except financial institutions allowed under BIPA—and would prevent Illinois government agencies from purchasing access to the database or using it for free. The settlement, however, does not restrict Clearview AI from selling its database or products to U.S. federal government agencies, U.S. state government agencies outside of Illinois, or to these entities' contractors. It also does not prohibit Clearview from selling its algorithm to private entities.

Clearview AI admitted no wrongdoing under the proposed settlement and will pay approximately $300,000 to cover advertising of settlement details and attorney fees—an outcome that Lee Wolosky, partner at Jenner & Block representing Clearview, called a "huge win" for the company.

"Clearview AI will make no changes to its current business model," Wolosky said in a statement to Security Management. "It will continue to expand its business offerings in compliance with applicable law. And it will pay a small amount of money to cover advertising and fees, far less money than continued litigation would cost."

Clearview had been exploring the possibility of selling its product to private customers but has not yet begun to engage in that business.

"Clearview AI's posture regarding sales to private entities remains unchanged," said Hoan Ton-That, CEO of Clearview AI, in a statement sent to Security Management. "We would only sell to private entities in a manner that complies with [BIPA]. Our database is only provided to government agencies for the purpose of solving crimes."

Instead, Ton-That said that Clearview has informed the court that it will continue to provide its facial recognition algorithm to commercial customers, without the database, in a consent-based manner.

"Today, facial recognition is used to unlock your phone, verify your identity, board an airplane, access a building, and even for payments," Ton-That added. "This settlement does not preclude Clearview AI selling its bias-free algorithm, without its database, to commercial entities on a consent basis, which is compliant with BIPA."

While narrow, the settlement does raise questions about the relationship between privacy and law enforcement, says Samuel Adams, Westin Fellow at the International Association of Privacy Professionals (IAPP).

“Even though the scope of the settlement is narrow—the agreement does not, for example, restrict Clearview’s sales in 49 other states—companies should be aware of the shifting legal landscape surrounding facial recognition and other biometrics,” he says. “Seven states have introduced biometric laws, which are generally based on Illinois’ BIPA, and two other states—Washington and Texas—have enacted their own biometric laws. It feels like only a matter of time until more states join Illinois, Washington, and Texas by establishing legal standards for using facial recognition by public and private entities.”

BIPA is a unique statute in the United States that places limits on how biometric identifiers for Illinois residents can be used. These identifiers include retina, iris, fingerprint, voiceprint, hand, and face geometry scans, but do not include writing samples, written signatures, photographs, human biological samples used for scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions.

The law requires private entities to have written policies—available to the public—that establish retention schedules for biometric identifiers and information, along with guidelines for destroying those identifiers. Private entities must also obtain consent before collecting biometric identifiers; the law prohibits private entities from selling or sharing biometric identifiers without consent; and private entities must implement measures to protect biometric identifier information that is obtained in compliance with the law.

The law, however, creates exemptions for financial institutions and state and local government agencies, says Don Zoufal, CPP, president and founder of CrowZnest Consulting and legal advisor for the Illinois Association of Chiefs of Police.

What the ACLU has achieved through this proposed settlement, Zoufal says, could not have been achieved by suing the Illinois government directly because the BIPA statute expressly exempts governmental operations. Instead, by suing Clearview AI, the ACLU was able to obtain a settlement under which the company cannot do business with government agencies in Illinois—not a court ruling that would restrict the government's ability to use facial matching or facial recognition tools.

“Essentially, they say this is about privacy, but they’ve done nothing to vindicate the privacy rights of the individuals,” adds Zoufal, who is also a member of the ASIS International Security Applied Sciences Community. “They are just keeping the gallery of photographs out of the hands of law enforcement.”

Zoufal adds that the settlement will not likely have a significant impact on the private sector and that the agreement “does little to advance the true interests of privacy protection, but does everything to advance the interest to make it difficult for government to conduct criminal investigations” by limiting access to the facial vector database Clearview AI has compiled.

BIPA was previously in the spotlight due to a settlement with Facebook, which agreed to pay $650 million to resolve a class-action lawsuit alleging the company violated BIPA by using its facial recognition software to tag and store residents' photos without consent.

The recent settlements may encourage other U.S. states to adopt laws similar to BIPA and to take them a step further by limiting law enforcement use of facial recognition and face matching tools to address privacy concerns.

“Surveillance partnerships between private companies and governments are a far-reaching threat to privacy,” said Adam Schwartz, senior lawyer at the Electronic Frontier Foundation, in a statement shared with Security Management. “A prime example is Clearview, which extracts faceprints from billions of people, and then sells police the service of identifying unknown suspects in probe photos. The settlement announced today in the Illinois lawsuit, ACLU v. Clearview, demonstrates the need for strong data privacy laws, modeled on the Illinois Biometric Information Privacy Act. These laws must also include a ban on government use of face recognition technology, including through private contractors like Clearview.”

For more insights on privacy and biometrics, check out the December 2021 issue of Security Technology.
