
Senate passes a bill that would let nonconsensual deepfake victims sue

It comes amid the global uproar over X’s mass AI undressing of users on its platform.

Lauren Feiner
is a senior policy reporter at The Verge, covering the intersection of Silicon Valley and Capitol Hill. She spent 5 years covering tech policy at CNBC, writing about antitrust, privacy, and content moderation reform.

The Senate passed a bill that could give people whose likeness has been deepfaked into sexually explicit images without their consent a new way to fight back.

The Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act) would let victims sue the individuals who created the images for civil damages. The bill passed with unanimous consent — meaning there was no roll-call vote, and no senator objected to its passage on the floor Tuesday. It’s meant to build on the work of the Take It Down Act, a law that criminalizes the distribution of nonconsensual intimate images (NCII) and requires social media platforms to promptly remove them.

The passage comes as policymakers around the world have threatened action against X for enabling users to create nonconsensual and sexually suggestive AI images with its Grok chatbot. X owner Elon Musk has deflected blame onto the individuals prompting Grok, writing, “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.” But even after pushback, X continued to let users prompt Grok to virtually strip people down.

Senate Democratic Whip Dick Durbin (D-IL), a lead sponsor of the bill, referenced Grok’s nonconsensual undressing in remarks on the Senate floor. “Even after these terrible deepfake, harming images are pointed out to Grok and to X, formerly Twitter, they do not respond. They don’t take the images off of the internet. They don’t come to the rescue of people who are victims,” Durbin said. Though the Take It Down Act, whose takedown provision goes into full force later this year, could have implications for X, the DEFIANCE Act would impact individuals, like those Grok users creating deepfaked nonconsensual intimate imagery.

Governments around the world are creating new protections against AI-generated nonconsensual images, spurred in part by the recent Grok controversy. The UK, for example, recently moved up the effective date of a law that criminalizes the creation of nonconsensual intimate deepfakes.

The DEFIANCE Act similarly passed the Senate in 2024 following a different nonconsensual deepfake scandal on X. Early that year, sexually explicit AI-generated images of Taylor Swift circulated on the platform. Durbin along with Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO) introduced the bill to expand on a provision in the Violence Against Women Act Reauthorization Act of 2022, which gave people whose non-AI generated intimate images were shared without consent a right to sue. Rep. Alexandria Ocasio-Cortez (D-NY), who has found her own image digitally altered in nonconsensual intimate deepfakes, sponsored the bill in the House. The bill stalled in the House without a vote during the last Congress, requiring the Senate to take it up again this year. Now the ball is again in the House leadership’s court; if they decide to bring the bill to the floor, it will have to pass in order to reach the president’s desk.
