The FBI has issued a warning that malicious actors are creating sex-themed deepfakes by manipulating photos and videos shared online by regular people.
Criminals have long used AI to create synthetic content, commonly referred to as “deepfakes,” to deceive victims – either for financial gain (spear phishing, CEO impersonation) or to influence public opinion.
In one recent instance, a deepfake video circulated showing what appeared to be Ukrainian President Volodymyr Zelensky calling on his country’s citizens and army to lay down their weapons and surrender to invading Russian forces.
Until recently, it was mostly celebrities and political figures who were targeted by deepfake creators. Now, threat actors are turning their attention to regular people.
Targeting regular netizens
Extortionists are creating sex-themed deepfakes by manipulating benign photos or videos people share on social media, the FBI warns.
“Technology advancements are continuously improving the quality, customizability, and accessibility of artificial intelligence (AI)-enabled content creation,” the public service announcement reads. “The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content. The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”
Victims unknowingly provide the raw materials
According to the notice, threat actors typically harvest raw, unmodified images from victims’ social media accounts, then use “content manipulation technologies” to turn them into “sexually-themed images that appear true-to-life in likeness to a victim.”
The crooks then circulate the modified content on social media, public forums, and even porn sites.
“Malicious actors have used manipulated photos or videos with the purpose of extorting victims for ransom or to gain compliance for other demands,” the FBI warns.
Many of the victims are minors, the notice reveals. Victims are often unaware their images were copied, manipulated, and circulated until someone else brings the content to their attention.
Pay ransom or send real nudes
Extortionists send victims the manipulated content showing their likeness in a compromising manner and threaten to share it with friends and family unless the victim pays a ransom or provides real sexually themed images or videos.
Cases of deepfake sextortion are on the rise. Based on recent victim reporting, the malicious actors typically demand direct money transfers, gift cards, or real sexually themed content (i.e., nudes).
Because victims can face significant challenges in getting the manipulated content removed from the internet, or in preventing its continued sharing, the FBI directs victims targeted by such threats to the Take It Down service, which uses file hash values to detect compromising images or videos and have them removed from public circulation.
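The hash-matching approach behind services like Take It Down can be sketched in a few lines. This is a minimal illustration, assuming a plain cryptographic file hash (SHA-256); the service's actual hashing scheme is not detailed in the FBI announcement:

```python
import hashlib


def file_hash(data: bytes) -> str:
    """Compute a SHA-256 digest of raw file bytes.

    SHA-256 is an assumption here for illustration; the real service
    may use a different hashing scheme.
    """
    return hashlib.sha256(data).hexdigest()


# The service stores only hashes of reported images, never the
# images themselves, so victims don't have to upload the content.
reported_hashes = {file_hash(b"reported-image-bytes")}


def is_reported(upload: bytes) -> bool:
    """Check whether an uploaded file matches a reported image."""
    return file_hash(upload) in reported_hashes
```

Note that an exact file hash only catches byte-identical re-uploads; a copy that has been resized, recompressed, or cropped produces a different digest, which is why platforms often supplement this with perceptual hashing.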
Careful what you post online
The bureau’s notice includes a list of recommendations to stave off this criminal activity, chief among them: “Use discretion when posting images, videos, and personal content online, particularly those that include children or their information.”
Netizens are also instructed to exercise caution when accepting friend requests or engaging in communications with individuals they don’t know in person, as well as to secure online accounts using complex passwords and multi-factor authentication.
Last, but not least, the feds recommend researching the privacy and data retention policies of social platforms, apps, and websites before uploading and sharing images, videos, or other personal content.
Bitdefender Digital Identity Protection lets you instantly find out if your data has leaked online, what type of information was compromised, what risks you face, and whether your information is for sale on the dark web.