Sophie Compton and Reuben Hamlyn are acclaimed documentary filmmakers whose award-winning work sheds light on stories of injustice and healing. Their latest documentary, "Another Body," will be released in theaters and on VOD on October 20. The opinions expressed in this piece are solely theirs. Read more opinion at CNN.
One day, a friend sends you a message with a surprising link: a Pornhub URL. Along with it, they apologize but insist that you need to see it. You click on the link, and to your shock, you come face to face with an explicit video featuring you. The reality sinks in, leaving you numb. You've never participated in pornography before, and you're left questioning who would do such a thing and for what reason.
This is what happened to Taylor, a 22-year-old engineering student from New England, whose name has been changed for privacy reasons. Taylor is the subject of our award-winning documentary, "Another Body," which won the Special Jury Award at SXSW. As Taylor soon discovers, these videos are deepfakes, videos manipulated using artificial intelligence to superimpose one person's face onto another person's body.
The anonymous perpetrator who targeted Taylor had uploaded six deepfake videos to porn profiles posing as her. Disturbingly, he also listed her real college and hometown, and encouraged men visiting the profiles to send her direct messages, accompanied by a wink emoji. They did. Taylor began receiving unsettling messages on Facebook and Instagram from men she didn't know.
However, when Taylor reported the incident to the police, a detective told her that the perpetrator was within his rights and had broken no law.
At present, no federal law in the United States specifically addresses the creation and dissemination of non-consensual deepfake pornography. We are advocating for a federal law that criminalizes non-consensual deepfake pornography, along with amendments to Section 230 of the Communications Decency Act that would remove online platforms' immunity from liability for user-generated content. These changes would disrupt the thriving industry of creating and distributing non-consensual deepfake pornography that has flourished in this online environment.
The prevalence of deepfake pornography targeting women has surged as artificial intelligence has advanced, according to experts. Sensity AI, an identity verification company, found that the number of deepfakes online doubled every six months between 2018 and 2020, and that 96% of them were sexually explicit and featured women who had not consented to their creation.
A believable deepfake used to require a large number of images of a person's face; now one or two are enough. When deepfakes first emerged in 2017, creating one demanded significant computing power and programming knowledge. Today there are user-friendly deepfake apps for the iPhone, and some creators take requests, charging as little as $30 to generate explicit videos of a customer's chosen celebrity, former partner or teacher.
Consequently, both celebrities and ordinary people are discovering their faces used without their consent in pornography, with the videos uploaded to adult websites for anyone to view. One prominent deepfake porn website currently averages 14 million views a month.
The practice is no longer niche; it has gone mainstream. That raises a new concern: a generation of young people who may see nothing abnormal in watching a pornographic deepfake of their favorite actress, or even of their classmates.
The consequences for victims can be severe. Taylor, for instance, had no idea who had created the videos or who had watched them. That uncertainty bred distrust: she began to question whether her classmates, friends or even her boss could be involved.
As a result, she went through a period of intense OCD and anxiety, leading her to reconsider her social circle and question whether she should continue her studies. Another survivor we spoke to, Helen, a 34-year-old teacher, also began experiencing panic attacks. She shared, "Every time I stepped out of the house, I felt an overwhelming sense of dread."
Amnesty International coined the term "the silencing effect" to describe how online abuse targeting women limits their participation in public discussion. In a report on harassment on Twitter, the organization found that many women deactivate their accounts, self-censor their posts and feel discouraged from pursuing careers in journalism or politics. Victims of deepfake abuse often respond the same way.
According to Danielle Citron, a law professor at the University of Virginia School of Law, Section 230 is responsible for the existence of approximately 9,500 websites in the United States dedicated to sharing non-consensual intimate imagery. She likens the problem to an incurable disease: the content cannot easily be removed, and the websites are not held accountable for their actions.
This problem extends beyond the creators of deepfake pornography. Search engines contribute to the promotion of deepfake content by driving web traffic towards it. Internet service providers host these sites, and credit card and payment companies facilitate transactions on them. Additionally, other companies advertise their products alongside these videos.
In England and Wales, the sharing of pornographic deepfakes without consent is set to become a criminal offense under a new law. In contrast, the United States has only a few states with specific laws targeting non-consensual deepfake pornography, and these laws often have restricted applicability.
With Taylor and fellow survivors, we've established the My Image My Choice campaign, advocating for federal legislation aimed at both creators and platforms. The objective is to criminalize the production and dissemination of non-consensual deepfake pornography and compel websites to remove such content from their platforms. Although these laws may not completely eradicate the issue, they would greatly undermine the current business model that depends on violating women's consent and privacy.