The Deepfake Porn Crisis Is Here

These laws, as sensible as they may seem, still face an uphill battle. In May, X, the Elon Musk-owned platform, sued the state of Minnesota over its law banning the creation of deepfakes to influence an election, arguing that the law violated free speech. In August, X won a similar lawsuit against the state of California over its ban on deepfakes.
For Guistolise, the deepfakes caused immeasurable pain. She lost trust in others. She is afraid of the consequences this could have on her future and her career. But there is more to her story. She will continue to be the sister and friend she has always been. She can't wait to go to work tomorrow. She is training her pit bull puppy, whom she aptly named Olivia Benson, to give a high five. And despite everything, "I love humans," she says before pausing. "I guess I still do."
Some practical steps to take if you are a victim of deepfakes
The full impact and reach of deepfakes is impossible to measure, because apps allow users to create them in moments. But according to the experts we spoke with for this article, there are practical steps you can take if you discover you're a victim.
Call a loved one.
“I recommend someone call a friend to help them with this process,” says Martone, noting that it can be difficult for people to watch the footage over and over again alone.
If you don't feel comfortable speaking with a friend or loved one, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat with RAINN online.
If you're interested in pursuing possible legal action, Martone recommends asking a friend or relative to document any instances of the deepfake so that lawyers or police have them readily available. (The one caveat: if the victim is a minor, do not screenshot any of the content. Instead, take your phone directly to your local police station to have it cataloged.)
Alert the platform.
The next step is to alert every social platform where the deepfake may have been distributed. On Instagram, for example, tap the post, then Report, then select the option for false information or edited/digitally created content.
Talk about it at your school or job.
If you're comfortable talking about it at work or school, or if you think the deepfake might spread there, it's a good idea to speak with HR or an administrator (especially if you're the parent of a minor).
Deepen your media education.
Not all AI is bad. Take BitMind, for example, which specializes in "synthetic and AI-generated media detection." Its founder, Ken Jon Miyachi, tells Charm that the program can identify synthetic and semi-synthetic media that may be difficult for humans to detect, helping people understand whether what they are seeing is real. Here are more expert tips for spotting misinformation.
Talk about it.
Meghan Cutter, Victim Services Manager at RAINN, is calling not just on policymakers for change, but on all of us. "How can we, as a society, have conversations about sexual violence and different forms of sexual violence, and how can we make communities safer places for survivors to speak up, ask for help and identify that they might need support?" she said. "[We want to create] that awareness, so that when it happens, people know that it's not right and that they can do something."
Know that it is not your fault and that it is violence.
As Cutter says: “Just because someone hasn’t physically touched you, or maybe the image is your face, but not your body, doesn’t mean it’s not a form of violence. It’s a form of assault. It’s a form of sexual violence. I think it’s very important to be explicit about it, to help survivors understand that there are options available to them and to have words to describe their experience.”

