July 11, 2024


WhoWhatWhy - In the past two years, people with malicious intent have increasingly targeted celebrities, children, and young women with fake nude photos and videos. These images, known as deepfakes, are created by feeding existing photos of a person into artificial intelligence tools that generate synthetic images in their likeness. A 2023 report by the US cybersecurity firm Home Security Heroes found that 98 percent of deepfake videos online are pornographic, and the victims of this cybercrime are almost exclusively women.

Last year, the FBI warned the public of these deepfakes and noted an uptick in “sextortion schemes,” in which a perpetrator uses altered images to blackmail victims into either paying large sums of money or sending them actual nude photos. “Malicious actors use content manipulation technologies and services to exploit photos and videos — typically captured from an individual’s social media account, open internet, or requested from the victim — into sexually-themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites,” the FBI said. “Many victims, which have included minors, are unaware their images were copied, manipulated, and circulated until it was brought to their attention by someone else.”
