AOC Announces Bill That Would Let Women Sue People Who Make Deepfake Porn of Them


New York representative Alexandria Ocasio-Cortez has introduced a bill that would allow victims of deepfaked porn to sue its creators, Rolling Stone reports.

Titled the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act of 2024, the proposed legislation would be added as an amendment to the 1994 Violence Against Women Act. Should the bill pass, per Rolling Stone, those targeted by nonconsensual deepfakes — which overwhelmingly impact girls and women — will finally have legal footing to defend themselves against the creation and dissemination of damaging and invasive fake pornographic material.

"How we answer these questions is going to shape how all of us live as a society, and individually the things that are going to happen to us or someone that we know, for decades," Ocasio-Cortez, who sits on the House of Representatives' recently-formed bipartisan task force on AI, told Rolling Stone, adding that "folks have waited too long to set the groundwork for this."

Baby Steps

Ocasio-Cortez — who's co-leading the bill alongside Illinois' Dick Durbin and South Carolina's Lindsey Graham — told Rolling Stone that the task force worked closely with survivors of nonconsensual deepfakes, which she described as a way of "really centering the people that have been most affected by this." Per the report, the action has been endorsed by over 25 organizations including the Sexual Violence Prevention Association and the National Domestic Violence Hotline.

Perhaps most notably, the bill doesn't only allow victims to sue the creator of a nonconsensual deepfake. It also allows them to sue anyone who distributed or received that deepfake, as long as it can be proven that those distributors and downloaders knew it was nonconsensual. (That, however, might be a difficult thing for a victim to prove.)

Pornographic deepfakes aren't a new problem. But new and publicly available generative AI tools have made it absurdly easy for bad actors to create nonconsensual fake pornography, a reality that's inflicted real harm on victims ranging from famous pop stars and public figures to high school girls. Legislation has been slow to catch up, and giving victims legal footing to fight back against abusers is always a good thing.

The bill isn't above critique. For one thing, it puts the onus of seeking redress on individual victims, who for years now have borne the brunt of the burden of policing nonconsensual deepfakes. It also fails to allow victims to seek damages from the makers of the AI tools used to generate deepfakes — the public availability of which led to the deepfake proliferation that the bill is hoping to counter in the first place.

In other words: DEFIANCE is, overall, a step in the right direction. But it's a small one.

More on deepfakes: New Law Would Illegalize AI Taylor Swift Porn Flooding Internet