The internet hosts at least 14,678 deepfakes, according to a report by DeepTrace, a company that builds tools to spot synthetic media. But most weren’t created to mess with elections.
Back to the beginning: Deepfakes arrived on the scene in late 2017. The word was originally used to describe AI-generated fake porn, but now it refers to any kind of manipulated video. This has stoked fears about the end of truth and the potential of deepfakes to swing elections.
Now: Most of the videos aren’t about politics. A full 96% of deepfakes are still plain old fake porn. Nearly all of that fake porn targets women, mostly famous actresses and musicians.
Fighting back with law: The issue has caught the attention of legislators. In California, Governor Gavin Newsom just signed into law two bills that limit what people can do with deepfakes. One makes it illegal to create and distribute a malicious deepfake of a politician within two months of an election. The second addresses how the technology is actually being used: it lets people sue if their likeness appears in deepfake porn without their consent.
—Angela Chen