The Virtual Lineup
U.S. state and federal agencies are amassing databases of American citizens’ fingerprints and images. The programs flew largely under the public radar until a government watchdog audited them. The so-called “virtual lineups” include two FBI programs that use facial recognition technology to search a database containing 64 million images and fingerprints.
In May 2016, the U.S. Government Accountability Office (GAO) released Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy, a report on the FBI programs. Since 1999, the FBI has been using the Integrated Automated Fingerprint Identification System (IAFIS), which digitized the fingerprints of arrestees. In 2010, a $1.2 billion project began that would replace IAFIS with Next Generation Identification (NGI), a program that would include both fingerprint data and facial recognition technology using the Interstate Photo System (IPS). The FBI began a pilot version of the NGI-IPS program in 2011, and it became fully operational in April 2015.
The NGI-IPS draws most of its photos from some 18,000 federal, state, and local law enforcement entities, and consists of two categories: criminal and civil identities. More than 80 percent of the photos are criminal—obtained during an arrest—while the rest are civil and include photos from driver’s licenses, security clearances, and other photo-based civil applications. The FBI, which is the only agency able to directly access the NGI-IPS, can use facial recognition technology to support active criminal investigations by searching the database and finding potential matches to the image of a suspected criminal.
Diana Maurer, the director of justice and law enforcement issues on the homeland security and justice team at GAO, explains to Security Management that the FBI can conduct a search for an active investigation based on images from a variety of sources—camera footage of a bank robber, for example. Officials input the image into the NGI-IPS, and the facial recognition software returns as many as 50 possible matches. The results are investigative leads, the report notes, and cannot be used to charge an individual with a crime. A year ago, the FBI began to allow seven states—Arkansas, Florida, Maine, Maryland, Michigan, New Mexico, and Texas—to submit photos to be run through the NGI-IPS. The FBI is working with eight additional states to grant them access, and another 24 states have expressed interest in using the database.
“The fingerprints and images are all one package of information,” Maurer says. “If you’ve been arrested, you can assume that you’re in, at a minimum, the fingerprint database. You may or may not be in the facial recognition database, because different states have different levels of cooperation with the FBI on the facial images.”
The FBI has a second, internal investigative tool called Facial Analysis, Comparison, and Evaluation (FACE) Services. The more extensive program runs similar automated searches using NGI-IPS as well as external partners’ face recognition systems that contain primarily civil photos from state and federal government databases, such as driver’s license photos and visa applicant photos.
“The total number of face photos available in all searchable repositories is over 411 million, and the FBI is interested in adding additional federal and state face recognition systems to their search capabilities,” the GAO report notes.
Maurer, who authored the GAO report, says researchers found a number of privacy, transparency, and accuracy concerns over the two programs. Under federal privacy laws, agencies must publish a System of Records Notice (SORN) or Privacy Impact Assessment (PIA) in the Federal Register identifying the categories of individuals whose information is being collected. Maurer notes that the information on such regulations is “typically very wonky and very detailed” and is “not something the general public is likely aware of, but it’s certainly something that people who are active in the privacy and transparency worlds are aware of.”
GAO found that the FBI did not issue timely or accurate SORNs or PIAs for its two facial recognition programs. In 2008, the FBI published a PIA of its plans for NGI-IPS but didn’t update the assessment after the program underwent significant changes during the pilot phase—including the addition of facial recognition services. Additionally, the FBI did not release a PIA for FACE Services until May 2015—three years after the program began.
“We were very concerned that the Department of Justice didn’t issue the required SORN or PIA until after FBI started using the facial recognition technology for real world work,” Maurer notes.
Maurer says the U.S. Department of Justice (DOJ)—which oversees the FBI—disagreed with the GAO’s concerns over the notifications. Officials say the programs didn’t need PIAs until they became fully operational, but the GAO report noted that the FBI conducted more than 20,000 investigative searches during the three-year pilot phase of the NGI-IPS program.
“The DOJ felt the earlier version of the PIA was sufficient, but we said it didn’t mention facial recognition technology at all,” Maurer notes.
Similarly, the DOJ did not publish a SORN that addressed the collection of citizens’ photos for facial recognition capabilities until GAO completed its review. Even though the facial recognition component of NGI-IPS has been in use since 2011, the DOJ said the existing version of the SORN—the 1999 version that addressed only legacy fingerprint collection activities—was sufficient.
“Throughout this period, the agency collected and maintained personal information for these capabilities without the required explanation of what information it is collecting or how it is used,” the GAO report states.
It wasn’t until May 2016—after the DOJ received the GAO draft report—that an updated SORN was published, Maurer notes. “So they did it very late in the game, and the bottom line for both programs is the same: they did not issue the SORNs until after both of those systems were being used for real world investigations,” Maurer explains.
In the United States, there are no federally mandated repercussions for skirting privacy laws, Maurer says. “The penalty that they will continue to pay is public transparency and scrutiny. The public has very legitimate questions about DOJ and FBI’s commitment to protecting the privacy of people in their use of facial recognition technology.”
Another concern the GAO identified is the lack of oversight or audits for using facial recognition services in active investigations. The FBI has not completed an audit on the effectiveness of the NGI-IPS because it says the program has not been fully operational long enough. As with the PIA and SORN disagreements, the FBI says the NGI-IPS has only been fully operational since it completed pilot testing in April 2015, while the GAO notes that parts of the system have been used in investigations since the pilot program began in 2011.
The FBI faces a different problem when it comes to auditing its FACE Services databases. Since FACE Services uses up to 18 different databases, the FBI does not have the primary authority or obligation to audit the external databases—the responsibility lies with the owners of the databases, DOJ officials stated. “We understand the FBI may not have authority to audit the maintenance or operation of databases owned and managed by other agencies,” the report notes. “However, the FBI does have a responsibility to oversee the use of the information by its employees.”
Audits and operational testing on the face recognition technology are all the more important because the FBI has conducted limited assessments on the accuracy of the searches, Maurer notes. The FBI requires the NGI-IPS to return a correct match of an existing person at least 85 percent of the time, a threshold that was met during initial testing. However, Maurer points out that this detection rate was based on a candidate list of 50 photos returned by the system, while investigators can request fewer results. Additionally, the FBI’s testing database contained 926,000 photos, while NGI-IPS contains about 30 million photos.
“Although the FBI has tested the detection rate for a candidate list of 50 photos, NGI-IPS users are able to request smaller candidate lists—specifically between two and 50 photos,” the report states. “FBI officials stated that they do not know, and have not tested, the detection rate for other candidate list sizes.”
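The GAO’s point can be illustrated with a minimal sketch (hypothetical numbers, not FBI data or methodology): a search “detects” the right person only if the true match appears within the top k candidates returned, so the same system can clear an 85 percent bar at a list size of 50 yet perform far worse at smaller list sizes.

```python
# Illustrative only: why detection rate depends on candidate-list size.
# The ranks below are invented for the example; rank 999 stands in for
# a search where the true match was never surfaced at all.
true_match_ranks = [1, 3, 7, 12, 25, 40, 48, 2, 999, 15]

def detection_rate(ranks, list_size):
    """Fraction of searches whose true match falls within the top list_size candidates."""
    hits = sum(1 for rank in ranks if rank <= list_size)
    return hits / len(ranks)

print(detection_rate(true_match_ranks, 50))  # 0.9 — clears an 85 percent target
print(detection_rate(true_match_ranks, 10))  # 0.4 — far lower with a shorter list
```

In this toy example, matches ranked 40th or 48th count as detections at a list size of 50 but vanish when an investigator requests only 10 candidates—which is why the GAO argued the detection rate must be tested at every list size users can actually request.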
Maurer notes that the GAO recommendation to conduct more extensive operational tests for accuracy in real-world situations was the only recommendation the FBI agreed with fully. “It’s a start,” she says.
The FBI also has not tested the false positive rate—how often NGI-IPS searches erroneously match a person to the database. Because the results are not intended to serve as positive identifications, just investigative leads, the false positive rates are not relevant, FBI officials stated.
“There was one thing they seemed to miss,” Maurer says. “The FBI kept saying, ‘if it’s a false positive, what’s the harm? We’re just investigating someone, they’re cleared right away.’ From our perspective, the FBI shows up at your home or place of business, thinks you’re a terrorist or a bank robber, that could have a really significant impact on people’s lives, and that’s why it’s important to make sure this is accurate.”
The GAO report notes that the collection of Americans’ biometric information combined with facial recognition technology will continue to grow, both at the federal investigative level and in state and local police departments.
“Even though we definitely had some concerns about the accuracy of these systems and the protections they have in place to ensure the privacy of the individuals who are included in these searches, we do recognize that this is an important tool for law enforcement in helping solve cases,” Maurer says. “We just want to make sure it’s done in a way that protects people’s privacy, and that these searches are done accurately.”
This type of technology isn’t just limited to law enforcement, according to Bloomberg’s Hello World video series. A new Russian app, FindFace, by NTechLab allows its users to photograph anyone they come across and learn their identity. Like the FBI databases, the app uses facial recognition technology to search a popular Russian social network and other public sources with a 70 percent accuracy rate—the creators of the app boast a database with 1 billion photographs. Moscow officials are currently working with FindFace to integrate the city’s 150,000 surveillance cameras into the existing database to help solve criminal investigations.
But privacy advocates are raising concerns about other ways the technology could be used. For example, a user could learn the identity of a stranger on the street and later contact that person. And retailers and advertisers have already expressed interest in using FindFace to target shoppers with ads or sales based on their interests.