
May 24, 2024

King Introduces Bipartisan Bill to Combat Non-Consensual Digital Deepfake Porn

Legislation establishes both civil and criminal penalties for sharing artificially generated, intimate images online without consent

WASHINGTON, D.C. — U.S. Senator Angus King (I-ME), the Co-Chair of the Cyberspace Solarium Commission (CSC), is introducing bipartisan legislation to impose criminal penalties on individuals who create and share non-consensual, intimate deepfake images online—often referred to as deepfake porn. The Preventing Deepfakes of Intimate Images Act would not only establish a new criminal offense for distributing these images, but would also allow victims to file a lawsuit against someone who intentionally distributes them. Under the legislation, the criminal penalties can include a fine and up to two years in prison, and the civil penalties can include a fine of up to $150,000.

The bill is also cosponsored by Senators Maggie Hassan (D-NH), John Cornyn (R-TX), and Laphonza Butler (D-CA).

“Artificial intelligence is rapidly helping to advance critical components of society, but it’s also being used maliciously to victimize innocent Americans,” said Senator King. “The Preventing Deepfakes of Intimate Images Act would ensure that Maine people, and Americans nationwide, have legal civil and criminal recourse in the event they become victims to fake content posted online. In the age of digital ingenuity and innovation, legislation is needed to protect individuals from bad actors exploiting new technology.”

“The sharing of intimate images without consent can cause extraordinary emotional distress and harm and can put victims at risk of stalking and assault. Especially as technology advances to the point where it is hard to tell which photos and videos are real and which have been entirely faked, we need stronger guardrails that protect people’s safety, privacy, and dignity and prevent non-consensual intimate images from proliferating across the internet,” said Senator Hassan. “This bipartisan bill provides tools to hold accountable – both financially and criminally – the people and websites who are knowingly sharing these images without consent, and I urge my colleagues to support it.”

“While there are many benefits to artificial intelligence, the use of deepfake technology to generate nonconsensual and realistic intimate images of actual people poses a growing threat,” said Senator Cornyn. “This legislation will help safeguard against the malicious use of this technology by closing loopholes in revenge porn laws and criminalizing the creation and spread of nonconsensual intimate deepfakes.”

“As artificial intelligence continues to advance, we must take steps to prevent its misuse,” said Senator Butler. “That’s why we need this legislation to protect victims and hold perpetrators accountable.”

Advanced technology, such as artificial intelligence, has made it easier than ever to create fake images and videos that are convincingly real. Bad actors have been using this technology to create intimate, sexualized imagery of people without their knowledge. Several public figures have been victimized by deepfake images posted online without their consent.

Senator King has been a leading voice in fighting threats from technology, having served as the Co-Chair of the Cyberspace Solarium Commission, which has had dozens of recommendations become law since its launch in 2019. As a member of the Senate Intelligence and Armed Services Committees, Senator King has been a strong supporter of increased watermarking regulations. In a September 2023 open Intelligence hearing, King asked Dr. Yann LeCun, a Professor of Computer Science and Data Science at New York University, about what is technologically feasible in terms of implementing watermarks (a small icon or caption) so that users can discern between real and artificially created content. The FY2024 National Defense Authorization Act includes a Senator King-led provision to evaluate technology, including applications, tools, and models, to detect and watermark generative artificial intelligence. Most recently, he joined the bipartisan Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act), which would allow victims to sue perpetrators who create and share fake visual depictions made to falsely appear authentic for up to $150,000. Additionally, during a recent Senate Energy and Natural Resources Committee hearing, he raised the question of what Congress and the private sector can do to combat fake content and misinformation online.

###
