The Increasing Use of AI in Crypto Crimes
The use of AI in illicit activities is still in its infancy but growing rapidly
June 10, 2024 03:30 AM
TL;DR A recent report from Elliptic reveals how AI is used in cryptocurrency crimes. From deepfake scams to state-sponsored attacks, the report highlights emerging threats.
AI-Enabled Crime: A Growing Threat
A recent report from blockchain intelligence firm Elliptic, titled "AI-enabled crime in the cryptoasset ecosystem," has shed light on the emerging threats posed by artificial intelligence (AI) in cryptocurrency crime.
The report, supported by case studies, identifies five emerging types of AI-enabled crime, ranging from deepfake scams to state-sponsored cyberattacks, and emphasizes that these threats are still in their infancy.
The Power and Risk of AI
AI has the potential to transform the global economy, but it also carries risks. According to Elliptic, threat actors are already exploiting AI for illicit activities within the cryptoasset ecosystem. One of the report's key findings is the use of AI to create deepfakes.
Scammers use these highly realistic videos and images to impersonate high-profile individuals, such as celebrities, politicians, and industry leaders, in order to lend legitimacy to fake projects.
"Crypto giveaway and doubling scams are increasingly using deepfake videos of crypto CEOs and celebrities to encourage victims to send funds to scam crypto addresses."
Specific instances mentioned in the report include deepfakes targeting Ripple (XRP) and its CEO, Brad Garlinghouse, particularly following the company's legal victory against the U.S. SEC in July 2023.
High-Profile Targets
Other individuals targeted by deepfake scams include Elon Musk, former Singaporean Prime Minister Lee Hsien Loong, and the 7th and 8th Presidents of Taiwan, Tsai Ing-wen and Lai Ching-te.
Anne Neuberger, the U.S. Deputy National Security Advisor for Cyber and Emerging Technology, also addressed growing concerns about the misuse of AI. She highlighted that AI is being used not only for everyday scams but also for more sophisticated criminal activities.
"Some North Korean and other nation-state and criminal actors have been observed trying to use AI models to accelerate the creation of malicious software and identifying vulnerable systems," Neuberger stated.
The Hype Around GPT-Themed Tokens
The hype around AI has also given rise to GPT-themed tokens, which are often promoted with promises of high returns.
Elliptic warns that while some may be legitimate, many are promoted in amateur trading forums with false claims of official association with AI companies such as OpenAI, the developer of ChatGPT.
The report also reveals discussions on dark web forums about leveraging AI models to reverse-engineer crypto wallet seed phrases and bypass authentication for various services.
"Throughout numerous dark web cybercrime forums, Elliptic has identified chatter that explores the use of LLMs to reverse-engineer crypto wallet seed phrases, bypassing authentication for services such as OnlyFans, and providing alternatives to image 'undressing' manipulation services such as DeepNude."
Vigilance and Proactive Measures
Elliptic also notes that most AI-related threats in the cryptocurrency sector are still in their early stages, underscoring the importance of remaining vigilant and taking proactive steps to address these developing forms of crypto crime.