A new AI-powered tool, ProKYC, has raised concerns over its potential to bypass stringent Know Your Customer (KYC) measures at major crypto exchanges, according to a report by cybersecurity firm Cato Networks. The tool is seen as a significant advancement in crypto fraud, allowing bad actors to easily generate convincing fake identities and pass security checks.
The Oct. 9 report highlights how ProKYC allows criminals to circumvent traditional KYC protocols that compare a user’s webcam image to their government-issued IDs like passports or driver’s licenses. Instead of relying on forged documents purchased on the dark web, fraudsters can now create entirely new identities using AI, making detection even more challenging for security systems.
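For context, the webcam-versus-document comparison these systems perform typically comes down to a face-embedding similarity check. The following is a minimal, hypothetical sketch of that idea using stand-in embedding vectors; real KYC vendors use proprietary face-recognition models and liveness checks that are not shown here.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def kyc_face_match(selfie_embedding: np.ndarray,
                   id_photo_embedding: np.ndarray,
                   threshold: float = 0.6) -> bool:
    """Toy decision rule: accept if the webcam selfie and ID-photo embeddings
    are similar enough. The threshold and embeddings are illustrative only."""
    return cosine_similarity(selfie_embedding, id_photo_embedding) >= threshold

# Stand-in 128-dimensional embeddings (a real system would derive these from
# images with a trained face-recognition model).
rng = np.random.default_rng(0)
selfie = rng.normal(size=128)
id_photo = selfie + rng.normal(scale=0.2, size=128)  # same person, slight variation

print(kyc_face_match(selfie, id_photo))  # True when the vectors are similar enough
```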
ProKYC Deepfake Tech Targeting Crypto Platforms
A video demonstration provided by ProKYC shows the tool in action as it generates a deepfake video and corresponding identity document for a fake person. In the video, the AI-generated face is seamlessly integrated into an Australian passport template, successfully passing the KYC measures of Dubai-based crypto exchange Bybit.
Etay Maor, chief security strategist at Cato Networks, remarked that the tool represents “a new level of sophistication” in fraud tactics and poses a growing threat to crypto exchanges and financial platforms. Its customizability for bypassing high-level KYC protocols makes it especially formidable in the hands of cybercriminals.
New Account Fraud Made Easier
ProKYC is not limited to crypto exchanges; it also claims to bypass KYC systems on payment platforms such as Stripe and Revolut. A subscription package priced at $629 annually includes features such as camera emulators, facial animations, and fingerprint generation, enabling fraudsters to open new accounts under fake identities, a practice known as New Account Fraud (NAF).
While the tool’s sophistication makes it harder for crypto platforms to detect fraudulent accounts, Maor emphasized the challenge of balancing security with user experience. Overly strict biometric systems could lead to false positives, frustrating legitimate users, while lenient controls may let fraudsters slip through.
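The trade-off Maor describes is essentially a threshold-tuning problem: tightening a biometric match threshold blocks more fraudsters but also rejects more legitimate users. The sketch below is a rough illustration under invented assumptions, sweeping a similarity threshold over synthetic “genuine” and “impostor” score distributions and reporting the resulting false-reject and false-accept rates; the distributions are made up for demonstration and do not reflect any real platform.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic similarity scores: genuine users tend to score high, impostors lower.
# These distributions are illustrative, not measurements from any real system.
genuine_scores = rng.normal(loc=0.80, scale=0.08, size=10_000)
impostor_scores = rng.normal(loc=0.45, scale=0.10, size=10_000)

for threshold in (0.5, 0.6, 0.7):
    false_reject_rate = np.mean(genuine_scores < threshold)    # legitimate users blocked
    false_accept_rate = np.mean(impostor_scores >= threshold)  # fraudsters let through
    print(f"threshold={threshold:.2f}  "
          f"FRR={false_reject_rate:.3f}  FAR={false_accept_rate:.3f}")
```

Raising the threshold drives the false-accept rate down at the cost of more false rejects, which is exactly the security-versus-user-experience balance exchanges have to strike.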
AI Fraud on the Rise
The rise of AI-driven fraud has heightened concerns in the crypto space. Cato Networks notes that identifying AI-generated content still often depends on human oversight, such as spotting inconsistencies in image quality or unnatural facial movements in deepfake videos.
As AI-powered fraud tools like ProKYC gain traction, crypto exchanges and financial institutions may need to rethink their approach to security to prevent new forms of identity fraud.
Even as fraud incidents increase across the industry, penalties for identity fraud in the United States remain severe, carrying potential sentences of up to 15 years in prison along with heavy fines.
In recent months, firms like Gen Digital, which owns antivirus brands Norton and Avira, have reported a surge in the use of AI deepfake videos to promote fraudulent token schemes, illustrating the growing prevalence of this emerging threat in the crypto space.