Backlash Against Unregulated Use of Facial Recognition
Lawmakers in the United States and Scotland have begun looking into regulating law enforcement's use of facial recognition software after concerns were raised about privacy and human rights violations.
On 12 February, two U.S. senators handed out an early valentine: a bill aiming to ban law enforcement from using facial recognition technology until formal guidelines and limits on government use are established.
Introduced by Senators Jeff Merkley (D-OR) and Cory Booker (D-NJ), the bill, S.3284, also aims to prevent the technology from infringing upon privacy, civil rights, and First Amendment rights. It would, however, allow law enforcement to use facial recognition with a warrant.
Senators introduce facial recognition bill proposing moratorium on federal government use https://t.co/Let6Ahlw2j— Andrew Wyrich (@AndrewWyrich) February 13, 2020
While some law enforcement departments have used forms of facial recognition for decades, recent advances in the technology have drawn fresh attention. In January, reports on Clearview AI's facial recognition app, used by hundreds of U.S. law enforcement agencies and a handful of companies, showed a technology that had not waited for permission and was barreling past privacy concerns. An in-depth article in The New York Times reported that more than 600 law enforcement agencies use the app, which is advanced enough to identify people who appear in the background of a photo or even reflected in a mirror.
Since the Times' look into Clearview, the company and facial recognition in general have received a lot more attention, including cease and desist letters to Clearview from Twitter, YouTube, Facebook, and Venmo; class-action lawsuits filed in Illinois and Virginia, claiming the company's practices violated state privacy laws; and calls for higher learning institutions to commit to not using the technology.
In New Jersey, Attorney General Gurbir S. Grewal banned police from using the Clearview app in January, citing concerns about data privacy, cybersecurity, law enforcement security, and the ethics of the tool and its use in investigations. Grewal also pointed to issues with Clearview potentially sharing information about ongoing prosecutions with prospective customers. His office also sent the company a cease and desist letter over the use of such information in promotional materials.
Some Congressional lawmakers are looking for answers from Clearview. Representative Patrick McHenry (R-NC) has called for a hearing on the company's data collection practices before the House Financial Services Committee. Senator Edward Markey (D-MA) sent Clearview a letter asking for information on the agencies the company has advertised the app to and those that use it, the accuracy of the software, and other internal facets of the technology's development, use, and consequences.
According to CNET, civil rights groups, including the ACLU, the Electronic Frontier Foundation, and the Liberty Coalition, have thrown their support behind student groups concerned about the use of facial recognition on school campuses. A post by privacy rights group Fight for the Future announced a campaign against the use of such technology, including a national day of action, scheduled for 2 March, featuring letter deliveries to campus administrations and other events. The coordinated event is a response to a perceived lack of urgency in Congress to legislate or regulate the use and spread of facial recognition.
We wish you had consulted with privacy and civil liberties groups before releasing this legislation, Senator.— Fight for the Future (@fightfortheftr) February 13, 2020
It's great to see lawmakers engaging on #facialrecognition, but we don't need legislation that speeds us toward weak regulations. We need to #BanFacialRecognition. https://t.co/cZ78O1iFIx
In 2016, Police Scotland announced it wanted to begin using live facial recognition by 2026; however, members of the Scottish Parliament argued against use of the technology, citing a lack of justification. The system can compare faces in a crowd against photographs in police databases.
According to the BBC, the Parliament's justice subcommittee recently investigated the technology and found it would "discriminate against females and those from black, Asian, and ethnic minority communities." The report also noted that use of the technology would diverge sharply from the force's current policy of "policing by consent."
The Justice Committee, @SP_Justice, also has a Sub-Committee on Policing, and it has been looking into live facial recognition technology, and what impact it would have on policing.— Scottish Parliament (@ScotParl) February 13, 2020
Read the report here: https://t.co/YVVbFSz9Ml
Police Scotland said it has placed its roll-out plans on hold and will participate in future debates about the software's use and implications.
Also in January, London's Metropolitan Police said it would begin using facial recognition on the city's streets.