“We are reintroducing the DEFIANCE Act to grant survivors and victims of nonconsensual deepfake pornography the legal right to pursue justice,” Ocasio-Cortez said in a statement. “I am proud to lead this legislation with Representative Lee, and Senators Durbin and Graham to provide victims with the federal protections they deserve.”
Elsewhere in the world, countries like Denmark hope to protect people by passing legislation that grants everyone the copyright to their own likeness, including their voice.
In Minnesota, Guistolise and the women in her group are working alongside Senator Erin Maye Quade and the state’s Senate Judiciary and Public Safety Committee on a potential bill that would outlaw nudification, requiring AI companies to disable the features that create such images or face fines of up to $500,000 for each nonconsensual deepfake.
“For these AI-generated photos and videos, the harm begins at creation,” Quade told MPR News. “Dissemination currently in Minnesota of nonconsensual, sexual deepfakes is illegal. But they can download these apps on their phones, and they’re doing that. They’re nudifying their teachers, their classmates, their siblings, friends.”
These laws, however common sense they may feel, will still face an uphill battle. In May, the Elon Musk-owned platform X sued the state of Minnesota over its law banning the creation of deepfakes to influence an election, which it said violated free speech. In August, Musk won a similar lawsuit against the state of California over its deepfake ban.
For Guistolise, this event has caused immeasurable pain. She’s lost trust in others. She’s afraid of how this may affect her future and her career. However, there is a “next” for her. She gets to go on being the sister and friend she’s always been. She’s excited to go to work tomorrow. She’s teaching her pit bull puppy, whom she aptly named Olivia Benson, to give a high five. And despite it all, “I love humans,” she says, before pausing. “I guess I still do.”
Some Practical Steps to Take If You’re a Victim of Deepfakes
It’s impossible to measure the toll and reach of deepfakes, as apps allow users to create them in mere moments. However, according to the experts we spoke to for this piece, there are practical steps to take if you find out you’re a victim.
Call a loved one
“I recommend that somebody calls a friend to help them with this process,” Martone says, noting that it can be difficult for people to view the images over and over alone.