Deepfake Video: Survey reveals 94 percent of deepfake adult videos target celebs
India currently ranks 6th among the nations most susceptible to deepfake adult content.
Deepfake Viral Video News: Experts have been warning people about deepfake AI scams, as this is a highly sensitive issue. A recent video of Zara Patel, a model, went viral on social media, not because of her but because of Rashmika Mandanna. It was a morphed video created with deepfake AI in which Zara Patel's face was replaced with Rashmika's.
Rashmika Mandanna was shocked to learn about the development. She took to Instagram and issued a strong statement. She said, "I feel really hurt to share this and have to talk about the deepfake video of me being spread online. Something like this is honestly, extremely scary not only for me, but also for each one of us who today is vulnerable to so much harm because of how technology is being misused."
"Today, as a woman and as an actor, I am thankful for my family, friends and well-wishers who are my protection and support system. But if this happened to me when I was in school or college, I genuinely can't imagine how could I ever tackle this. We need to address this as a community and with urgency before more of us are affected by such identity theft," she added.
Meanwhile, a recent survey on deepfake AI content revealed that India currently ranks 6th among the nations most susceptible to deepfake adult content.
The 2023 State of Deepfakes report by US-based Home Security Heroes found that public figures, particularly those from the entertainment industry, are at higher risk due to their visibility and the potential impact on their careers.
Deepfake videos are created by swapping faces or even altering voices. The report further stated that 94 percent of the individuals featured in deepfake pornography videos belonged to the entertainment sector, including singers, actresses, social media influencers, models, and even athletes.