Research found that deepfakes can fool biometric checks used by banks. Despite this obvious flaw, KYC providers seem unconcerned about the possibility of exploitation.
A group of researchers discovered that deepfake technology can trick the biometric checks used by banks and cryptocurrency exchanges to authenticate users’ identities.
In a report published on Wednesday, Sensity, a security company specializing in deepfake detection, detailed how it circumvented an automated “liveness test” using AI-generated faces.
Such verification methods, typically known as “know your customer” or KYC checks, frequently require individuals to supply images of both their identification and their faces. A “liveness test” is then performed to capture the user’s face in real time so it can be matched against the selfie and the identification photo using facial recognition.
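The matching step can be sketched in minimal Python. This is an illustration only, not any vendor’s actual pipeline: the `embed` function, the cosine-similarity comparison, and the threshold value are all hypothetical stand-ins for a trained face-recognition model.

```python
import math

def embed(image):
    """Hypothetical stand-in for a face-embedding model.
    Real KYC systems run a trained neural network; here we simply
    treat each 'image' as an already-computed feature vector."""
    return image

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors: 1.0 = identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def kyc_match(id_photo, selfie, liveness_frame, threshold=0.9):
    """Match the live capture against both the selfie and the ID photo.
    Note: this only checks that the faces *match* -- it does not ask
    whether the 'live' frame was synthesized, which is exactly the
    gap a real-time deepfake exploits."""
    live = embed(liveness_frame)
    return (cosine_similarity(live, embed(selfie)) >= threshold and
            cosine_similarity(live, embed(id_photo)) >= threshold)
```

Because the liveness frame is trusted as genuine once it matches, an attacker streaming a real-time deepfake of the victim’s face can satisfy every comparison in a flow like this.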
KYC verification is used in many industries, including banking, fintech, insurance, cryptocurrency, and gambling. Sensity released a video of its demonstration a week before the study, which found that 9 of the top 10 KYC providers it tested were highly vulnerable to deepfake attacks.
“Despite its widespread adoption, active liveness checks are weak against attacks by Deepfakes,” the report states. “The reason is that real-time Deepfakes can reproduce faithfully facial landmark movements of the attackers.”
Despite this obvious flaw, the vulnerable firms did not appear to care, Sensity’s chief operating officer, Francesco Cavalli, told The Verge, which first reported the findings on Wednesday.
“We told them ‘look you’re vulnerable to this kind of attack,’ and they said ‘we do not care,’” he said. “We decided to publish it because we think, at a corporate level and in general, the public should be aware of these threats.”
With large cryptocurrency heists becoming more frequent, cybercriminals seem likely to exploit such flaws more and more as deepfake technology grows more realistic and easier to deploy.