FBI says AI deepfakes used for cyber extortion, how is crypto involved?
The Federal Bureau of Investigation has issued a warning about “deepfakes” being used for cyber extortion.
According to a recent report, malicious actors are using deepfakes to manipulate photographs or videos, often obtained from social media accounts or the open internet, and create sexually explicit images that appear authentic.
These images are then circulated on social media or pornographic websites for the purpose of sextortion schemes or to harass the victim.
The FBI mentioned that the improvements in the quality, customizability, and accessibility of artificial intelligence-enabled image generators have further contributed to the growth of deepfakes.
The bureau said it has received reports from victims, including minors, whose photos or videos were altered to create explicit content that was then publicly circulated.
Many victims were unaware their images had been copied, manipulated, and circulated until someone else brought the content to their attention or they stumbled across it online.
Once the manipulated content is circulated, victims face significant challenges in preventing its continual sharing or removal from the internet.
The FBI said that malicious actors have used manipulated photos or videos to extort victims for ransom or to gain compliance for other demands (e.g., sending nude photos).
The federal agency recommended that people exercise caution when posting or direct messaging personal photos, videos, and identifying information on social media, dating apps, and other online sites.
Moreover, people should use discretion when posting images, videos, and personal content online, particularly content that includes children or their information, as it can be captured, manipulated, and distributed by malicious actors without the poster's knowledge or consent.
The FBI also recommends applying privacy settings on social media accounts, running frequent online searches for personal information, using reverse image search engines, exercising caution when accepting friend requests or communicating with unknown or unfamiliar individuals, and securing online accounts with complex passwords and multi-factor authentication.
Deepfakes Used to Target Crypto Users
Recently, there have been instances where deepfakes were used to target unsuspecting crypto users.
For example, in May, a deepfake of Tesla and Twitter CEO Elon Musk was created to promote a crypto scam. The video contained footage of Musk from past interviews, manipulated to fit the fraudulent scheme.
Scammers impersonate anyone from influencers and high-profile crypto figures to ordinary people in order to gain victims’ trust.
Last year, Miranda, an e-commerce worker who did not wish to disclose her real name because her company had not given her permission to speak publicly, was targeted by such an attack: imposters created a deepfake video of the Melbourne woman promoting a crypto scam and published it on her Instagram account.