The artificial intelligence (AI) sector has expanded rapidly in recent years, with forecasts suggesting it could contribute up to $15.7 trillion to the global economy by 2030, more than the current combined economic output of China and India.
One development in this domain attracting significant attention of late is the emergence of “deepfakes”: highly realistic AI-generated video or audio that can replicate a real person’s appearance or voice closely enough to be indistinguishable from the genuine article.
A recent viral post on X showed how some individuals are using readily available open-source and commercial generative AI tools to manipulate a person’s selfie and fabricate counterfeit ID images capable of fooling many contemporary security checks.
As the digital landscape evolves, the proliferation of AI-generated deepfakes poses a substantial challenge to the established Know Your Customer (KYC) framework. Toufi Saliba, CEO of HyperCycle, a company building the components needed for AI microservices to communicate with one another, told Cointelegraph that a significant part of the problem lies in the existing security processes themselves. He remarked:
“Perhaps KYC itself is the attack vector on self-sovereignty, and these [deepfake] tools are proving how today’s KYC systems could be rendered obsolete in the near future.
A resilient solution would involve using certain cryptography properly to service the claimed intent of KYC proponents, thereby also protecting future generations.”
Saliba underscored the implications of AI and deepfakes for the cryptocurrency sector, stressing the urgent need for swift adaptation.
“This issue of fake image creation is likely to disrupt entire centralized systems from the inside out, thus presenting one of the best opportunities for well-intentioned regulators and centralized entities to realize that cryptography can come to the rescue when needed,” he asserted.
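Saliba did not spell out a specific construction, and none is attributed to HyperCycle here, but the general pattern he gestures at, accepting a verifiable cryptographic attestation instead of an uploaded document image that can be faked, can be sketched in a few lines. The snippet below is a minimal illustration using Ed25519 signatures from Python’s cryptography package; the issuer, the attribute format and the overall flow are assumptions made for demonstration only, not HyperCycle’s design.

```python
# Illustrative sketch only: a trusted issuer signs a commitment to identity
# attributes once, and a platform later verifies that signature instead of
# re-collecting document images. Requires the "cryptography" package.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer side (a hypothetical regulated identity provider):
issuer_key = Ed25519PrivateKey.generate()
attributes = b"name=Jane Doe|dob=1990-01-01|country=GB"
commitment = hashlib.sha256(attributes).digest()  # a hash, not the raw document
attestation = issuer_key.sign(commitment)

# Verifier side (e.g. an exchange): checks the issuer's signature over the
# commitment, so no selfie or ID photo ever changes hands.
issuer_public_key = issuer_key.public_key()
try:
    issuer_public_key.verify(attestation, commitment)
    print("Attestation valid: identity claim accepted")
except InvalidSignature:
    print("Attestation invalid: reject the claim")
```

The point of the pattern is that the verifier checks a signature over a hash commitment rather than collecting or storing document images, which is the kind of role cryptography could play in a KYC flow.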
Regarding the detection of deepfake content, Dimitry Mihaylov, an AI research expert for the United Nations, said that as criminals increasingly employ sophisticated tools to produce realistic fake identification documents, unforeseen challenges are emerging.
He emphasised the imperative for industries across the spectrum to evolve rapidly, adding:
“The market for AI-generated image detection is evolving with projects like FakeCatcher, a project that has been developed in partnership with Umur Ciftci. The technology showcases the potential of real-time deepfake detection with a 96% accuracy rate.”
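FakeCatcher itself is not distributed as a public library, and its specific detection signals are not reproduced here; the sketch below only illustrates the general shape of real-time detection the quote describes, scoring incoming video frames and flagging a stream once enough frames look synthetic. The score_frame stub is a hypothetical placeholder for a trained detector, not FakeCatcher’s method.

```python
# Generic sketch of streaming deepfake detection, not FakeCatcher's method.
import cv2  # opencv-python


def score_frame(frame) -> float:
    """Placeholder: probability in [0, 1] that a frame is synthetic.

    A stub standing in for a trained detector; it always returns 0.0 here.
    """
    return 0.0


def flag_stream(video_path: str, threshold: float = 0.8, min_hits: int = 10) -> bool:
    """Flag a video once enough individual frames score as likely synthetic."""
    capture = cv2.VideoCapture(video_path)
    hits = 0
    try:
        while True:
            ok, frame = capture.read()
            if not ok:  # end of stream or unreadable frame
                break
            if score_frame(frame) >= threshold:
                hits += 1
                if hits >= min_hits:
                    return True  # likely deepfake
        return False
    finally:
        capture.release()
```

In practice the stub would be replaced by an actual model, with the threshold and hit count tuned against the false-positive rate a platform can tolerate.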
Looking ahead, Mihaylov anticipates a significant shift in regulatory approaches to KYC, suggesting a future where dynamic and interactive verification processes become standard.
“We may see the introduction of more dynamic KYC procedures, like video KYC, as regulatory frameworks adapt to these technological advancements,” he suggested.
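No standard prescribes what such dynamic checks look like, but one common pattern is a server-issued random challenge that the applicant must perform on camera within a short window, which makes pre-recorded or pre-generated deepfake footage harder to replay. The sketch below is a hypothetical illustration of that idea in plain Python, not any exchange’s actual procedure.

```python
# Hypothetical sketch of a challenge-response step for video KYC: the server
# issues a random, short-lived action that must appear in the submitted video,
# so prepared deepfake footage cannot simply be replayed.
import secrets
import time
from dataclasses import dataclass

ACTIONS = ["turn head left", "blink twice", "read the digits aloud"]
CHALLENGE_TTL_SECONDS = 60


@dataclass
class Challenge:
    nonce: str
    action: str
    digits: str
    issued_at: float


def issue_challenge() -> Challenge:
    """Create a fresh, unpredictable challenge for one verification session."""
    return Challenge(
        nonce=secrets.token_hex(16),
        action=secrets.choice(ACTIONS),
        digits="".join(secrets.choice("0123456789") for _ in range(6)),
        issued_at=time.time(),
    )


def challenge_still_valid(challenge: Challenge) -> bool:
    """Reject submissions that arrive after the challenge has expired."""
    return (time.time() - challenge.issued_at) <= CHALLENGE_TTL_SECONDS
```

Judging whether the requested action actually appears in the submitted footage would still fall to a liveness or vision model, which sits outside this sketch.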
The impact of deepfakes reverberates across various industries, including cryptocurrency.
For instance, a platform named OnlyFake recently drew attention for reportedly bypassing the KYC protocols of several prominent cryptocurrency exchanges.
For just $15 apiece, the platform claims to produce counterfeit driver’s licenses and passports for 26 countries, including the United States, Canada, the United Kingdom, Australia, and several European Union member states.
On Feb. 5, the tech news outlet 404 Media reported that it had used OnlyFake’s services to bypass the KYC verification process of the popular crypto exchange OKX.
Likewise, leaked discussions showed OnlyFake’s customers claiming to have bypassed verification checks at numerous other cryptocurrency exchanges and financial institutions, including Kraken, Bybit, Bitget, Huobi, and PayPal.
The process of generating a counterfeit document on the website is reportedly swift, with the platform capable of producing up to 100 fake IDs at once using Excel spreadsheet data alone.
Additionally, users have the option to include their own photo or select one from a curated “personal library of drops,” forgoing reliance on a neural network.
The counterfeit driver’s licenses and passports are depicted on various domestic surfaces like kitchen counters, bedsheets, carpets, and desks, mimicking the typical presentation for online verifications.
One post even showcased a fabricated Australian passport bearing the details of a former U.S. president laid out on a piece of fabric.
In late 2022, blockchain security firm CertiK uncovered an underground marketplace where individuals offered their identities for sale for as little as $8.
These individuals consented to serve as the legitimate face for fraudulent cryptocurrency initiatives and to establish banking and exchange accounts for users otherwise barred from certain platforms.
Additionally, the widespread availability of AI deepfake technology has raised concerns among leaders in the cryptocurrency sector, particularly regarding the reliability of the video verification processes used in certain identity checks.
Binance chief security officer Jimmy Su expressed concern in May 2023 about the rise in fraudsters using deepfake technology to bypass exchange KYC procedures, warning that these video forgeries were approaching a level of realism capable of deceiving human evaluators.
Finally, a study by Netherlands-based Sensity AI revealed that liveness tests used for identity verification were significantly vulnerable to deepfake attacks, enabling scammers to substitute their own faces with those of others.
A man from Kerala, India, recently fell victim to a ruse in which a scammer posing as his friend stole 40,000 rupees (approximately $500). Similarly, a deepfake video of Elon Musk sharing misleading crypto investment advice circulated on Twitter last year.
As we advance towards a future shaped by artificial intelligence, there is ample evidence suggesting that the threat of “face swap” deepfake attacks will continue to rise.
A recent study noted that attacks against remote identity verification systems surged by 704% between 2022 and 2023, attributed to the accessibility of free and low-cost face swap tools, virtual cameras, and mobile emulators.
Furthermore, hackers and scammers appear to be growing more sophisticated with each passing month. The emergence of digital injection attack vectors and emulators lets miscreants create and deploy deepfakes in real time, posing a serious challenge to both mobile and video authentication systems.
Consequently, it will be intriguing to observe how this emerging security paradigm evolves, particularly considering humans’ increasing reliance on these advanced technologies.
Disclaimer: This article is provided for informational purposes only. It is not offered or intended to be used as legal, tax, investment, financial, or other advice.