The Art and Science of Bypassing Biometric Screening
In 1971’s Diamonds Are Forever, Sean Connery’s James Bond adroitly pulls silicone off his fingertips, his false fingerprints having served their purpose of verifying his false identity. In 2001’s Minority Report, a science fiction adventure set in 2054, Tom Cruise’s John Anderton thwarts iris detection protocols by means of eyeball transplants (Note: the clip is not for the faint of heart).
Hollywood has long been ahead of the game, both in depicting futuristic uses of biometric security technology and in imagining the methods that can defeat it. Leaving the realm of fiction, Europol Innovation Lab just released Biometric Vulnerabilities: Ensuring Future Law Enforcement Preparedness, a 60-page report examining how criminals, terrorists, spies, and other bad actors are developing ways to fool biometric screening techniques thought to be extremely hard to fool.
“It should be understood that most of the vulnerabilities reported in academia are still at the laboratory testing stage,” the report explains. “The possibility of the attacks detailed in this report, therefore, should not be taken to mean that the systems are weak, but rather should be seen in the light of a continued effort to pre-empt the possible exploitation of such weaknesses and to raise awareness so that any attempts to exploit them will be caught early on.”
Despite its length, the report has a narrow focus: It examines a specific vulnerability present in four kinds of biometric screening technology—fingerprints, facial recognition, iris recognition, and voice recognition. ISO/IEC 30107-1:2023 identifies the different vectors that could be used to attack biometric systems. The Europol report focuses solely on the vulnerability described as a “presentation attack”—aka, fooling the sensor.
“A biometric Presentation Attack (PA) is a direct attack at the biometric capture device (e.g. fingerprint sensor, camera, microphone, etc.) performed by an attacker using a presentation attack instrument (PAI)—an artefact or a modified biometric characteristic—with the intention to impersonate a bona fide user (enrolled data subject) or to evade biometric recognition (obfuscate their identity),” the report explains.
Fingerprints
When the seemingly impossible spycraft of the 1970s meets 50 years of computer imaging technology and materials science, the James Bond blueprint of sourcing fingerprint data from a database and using it to 3D-print replicas on a thin substrate that can then be applied to fingers is no longer farfetched. One limiting factor in fooling fingerprint readers with this method is that the source material is a 2D representation of an actual three-dimensional fingertip. However, the report notes that new, advanced 3D printing technologies “allow for the production of high-resolution artefacts which present characteristics similar to those of real fingerprints.”
Of course, the report notes, if the person whose fingerprints are being copied is complicit in the scheme, then actual 3D molds of fingertips can lead to even better recreations.
“This may allow illegal immigrants to cross the border or criminals to travel under the biometric characteristics and name of another person,” the report says.
Another form of fingerprint manipulation is to destroy or alter fingerprints in order to avoid identification. “The destruction and alteration of fingerprints can occur in various ways such as cutting, rubbing or stitching the finger, using acids, or even through fingerprint transplantations,” the report explains.
Facial Recognition
The promise of biometrics is that they provide enhanced security. They should not, for example, be fooled by a photo of someone. Yet the report cites a research study in which 96 percent of personal devices requiring facial recognition to unlock were bypassed with a simple photo printout. So-called “replay attacks,” which present videos rather than static photos, push the success rate even higher: up to 98 percent.
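One countermeasure such studies probe is liveness detection. A printed photo is rigid, so consecutive camera frames are nearly identical, while a live face blinks and shifts between frames. The sketch below illustrates that idea with synthetic frames; the feature and threshold are illustrative stand-ins, not anything specified in the Europol report:

```python
import numpy as np

def temporal_liveness_score(frames):
    """Mean absolute frame-to-frame pixel change across a capture.

    A held-up printout is rigid, so successive frames are nearly
    identical; a live face produces blinks and micro-movements.
    """
    frames = np.asarray(frames, dtype=np.float64)
    return np.abs(np.diff(frames, axis=0)).mean()

def is_live(frames, threshold=1.0):
    # Illustrative threshold; a real system would calibrate it on
    # genuine and attack captures.
    return temporal_liveness_score(frames) > threshold

# Simulated captures: 10 frames from a 64x64 grayscale sensor.
rng = np.random.default_rng(0)
still = rng.uniform(0, 255, (1, 64, 64))
photo_attack = np.repeat(still, 10, axis=0)                # static printout
live_face = photo_attack + rng.normal(0, 5, (10, 64, 64))  # natural motion as noise

print(is_live(photo_attack))  # False: no frame-to-frame change
print(is_live(live_face))     # True: motion present
```

A video replay defeats this particular cue, which is why production systems layer additional checks such as depth sensing or challenge-response prompts.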
There are more advanced applications of facial scanning technology than what is present in consumer devices, and even consumer systems are getting more sophisticated. Enter 3D mask printing. Cheap, generic silicone masks may be effective at avoiding detection by facial recognition systems in a crowd, but they would not be successful at spoofing an identity to gain access. Building a customized mask of someone’s face, however, holds more promise. The Europol report estimates the cost of creating such a mask at $3,000, though the cost is falling. One study showed that such masks were 57 percent effective at fooling 2D facial recognition systems.
Another facial recognition manipulation technique is called morphing: taking two somewhat similar-looking individuals and using photo manipulation software to create a composite face. A photo of this face can then be used in the creation of an identification document, such as a passport. “Morphing allows, for instance, criminals on a watch list to pass through border checks unnoticed by travelling with someone else’s passport with a morphed picture of their and the passport owner’s face in it,” the report says.
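At its simplest, a morph is a weighted blend of two aligned portraits; real morphing tools additionally warp facial landmarks into correspondence before blending. A minimal sketch of the blending step alone, using random arrays as stand-ins for aligned photos:

```python
import numpy as np

def morph(face_a, face_b, alpha=0.5):
    """Blend two aligned face images into a composite.

    alpha=0.5 weights both subjects equally, so the result can
    resemble either person to a matcher. Commercial morphing
    software also aligns facial landmarks before blending.
    """
    a = np.asarray(face_a, dtype=np.float64)
    b = np.asarray(face_b, dtype=np.float64)
    return ((1 - alpha) * a + alpha * b).round().astype(np.uint8)

# Synthetic stand-ins for two aligned 64x64 RGB portraits.
rng = np.random.default_rng(1)
face_a = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
face_b = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)

composite = morph(face_a, face_b)
print(composite.shape)  # (64, 64, 3)
```

The equal weighting is what makes a morphed passport photo dangerous: a matcher comparing either contributor’s live face against the composite sees roughly half of that person’s features.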
Finally, there are good, old-fashioned make-up alterations, which can be used to conceal a subject’s identity and, in some cases, to impersonate someone. Such transformations can be remarkable—just ask Mrs. Doubtfire.
Iris Recognition
While the gruesome method of detection avoidance in Minority Report—eye transplants—remains science fiction at least as a criminal method, one of the iris recognition presentation attacks in the report does have an ick-factor: “Another method involves using the iris of deceased individuals, as the texture remains unchanged for a few hours after death.”
Less grisly methods mirror some of the facial recognition attacks, including printed photographs of eyes, both as 2D printouts and as 3D eyeball-sized substrates. The report says duplicating iris textures on contact lenses is not currently a viable method of fooling an iris detection system.
However, textured contact lenses are a method employed to avoid identification, just as the printed methods can also be used for this purpose.
Voice Recognition
Applications from online banking to e-commerce to smart device authentication use voice biometric systems to authenticate users. The systems primarily use patterns associated with physiological attributes, including the vocal tract and pharyngeal, oral, and nasal cavities, which combine to form unique sounds.
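As a rough illustration of that principle, a toy verifier can summarize a voice as the distribution of energy across frequency bands (a crude stand-in for the cepstral features real systems model) and compare signatures by cosine similarity. Everything below, including the band layout, threshold, and synthetic “voices,” is illustrative rather than drawn from the report:

```python
import numpy as np

def spectral_signature(samples, rate=16000, bands=20):
    """Average energy in log-spaced frequency bands: a crude proxy
    for the vocal-tract resonance features real systems use."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), 1 / rate)
    edges = np.logspace(np.log10(100), np.log10(rate / 2), bands + 1)
    sig = np.array([spectrum[(freqs >= lo) & (freqs < hi)].mean()
                    for lo, hi in zip(edges[:-1], edges[1:])])
    return sig / np.linalg.norm(sig)  # unit-normalize for cosine comparison

def matches(enrolled, probe, threshold=0.9):
    # Cosine similarity of two unit-normalized signatures.
    return float(enrolled @ probe) >= threshold

# Toy "voices": the same duration of sound, but different
# vocal-tract resonance frequencies for each speaker.
t = np.arange(16000) / 16000
voice_a = sum(np.sin(2 * np.pi * f * t) / i for i, f in enumerate([220, 1200, 2600], 1))
voice_b = sum(np.sin(2 * np.pi * f * t) / i for i, f in enumerate([180, 900, 3100], 1))

enrolled = spectral_signature(voice_a)
print(matches(enrolled, spectral_signature(voice_a)))  # True: same voice
print(matches(enrolled, spectral_signature(voice_b)))  # False: different resonances
```

This sketch also shows why replay attacks work against a purely spectral check: an exact recording of the enrolled speaker reproduces the enrolled signature, so anti-spoofing systems must look for replay and synthesis artifacts rather than speaker identity alone.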
“Despite their increasing adoption, evidence shows that, unless they are adequately protected, voice biometrics systems can be vulnerable to manipulation through a variety of different spoofing/presentation/deepfake attacks stemming from impersonation, replay, synthetic speech, and converted voice,” the report says. “At the same time, there is an ever-increasing number of easy-to-use apps available for deepfake voice imitations.”
While it may fool a human authenticator, voice impersonation is unlikely to fool voice authentication systems based on physiological attributes of speech. The primary attack vectors are replay attacks, which use direct recordings of a person’s speech; synthetic speech, which uses text-to-speech systems to synthesize speech in the target’s voice; and voice converters, which convert one speaker’s words into the voice patterns of another person.
Obviously, replay attacks are limited to actual recordings of the target’s speech. The latter two methods, synthesis and conversion, are more dangerous and are “under constant development and the latest techniques are beginning to challenge even the best-performing presentation attack detection solutions.”
The report also discusses deepfake technology, which employs variants of facial recognition deception, often paired with synthetic or converted voice alterations. In some situations, deepfakes can be used as presentation attacks, particularly where access is controlled through remote video verification. It also notes the potentially catastrophic role deepfakes can play in scams and fraud.
Mitigation
The report aims to increase the effectiveness of biometric security methods by offering several recommendations.
Raise awareness. It’s important for law enforcement and others using the technology to share information and seek continuous education on recent developments. “By staying informed and updated, experts can effectively address potential threats and enhance their investigative capabilities,” the report says.
Adopt advanced evasion detection techniques. Presentation attack sophistication and methods continue to evolve, so those relying on the technology must maintain cutting-edge practices in their detection systems. “Any implementation should be based on a thorough understanding of the capabilities and limitations of these” presentation attack detection technologies, the report says.
Adopt an integrated approach to biometric recognition. The parts of a biometric recognition system include data collection, storage, transmission, identification, and verification. The security of each part of this system defines the security of the holistic solution. “All these separate parts together make a strong biometric system and focusing only on one may be pointless if the other parts are not equally strong,” the report says.
Enhance collaboration. Experts in biometrics, forensics, cybersecurity, security, and other fields should share knowledge, theory, experiences, and applications of biometric identification. “To get a good sense of biometric recognition (identification/verification) and its vulnerabilities, it is important to connect experts in the field and in research so they can share their insights,” the report says.
Standardize reporting and aggregation. Presentation attacks and other attacks against biometric identification technologies are not widely reported, at least not as attacks specific to these types of systems. An access control breach may be reported as that, not as a breach of a specific biometric application. “A harmonized international coding scheme is needed to compile aggregated data, indicating the potential threats against operational biometric systems,” the report says.