DeepNude has now been taken offline, but it won’t be the last time such technology is used to target vulnerable populations, writes Karen Hao.
The news: An app called DeepNude that “undressed” photos of women, using generative adversarial networks to swap their clothes for highly realistic nude bodies, has been shut down after an outcry. But the story feeds growing fears over the ways deepfakes can be weaponized against women and other vulnerable populations.
Targeting: The app only generated images of the female body, even when given a picture of a man. Although the images didn’t depict the women’s actual bodies, they could easily be mistaken for the real thing and used as revenge porn or blackmail material (in fact, this has happened before).
What can be done? Researchers suggest that regulators must act quickly to crack down on deepfakes, that companies producing deepfake tools must invest in countermeasures, and that social media firms should integrate those countermeasures directly into their platforms. Read the full story here.