
Nudify Me: The Legal Implications of AI-Generated Revenge Porn

February 15, 2023

As technology evolves, so does the issue of revenge porn. Revenge porn is the non-consensual sharing of nude images of another person. Depending on the laws of the applicable state, revenge porn can be criminally or civilly actionable. But Artificial Intelligence (AI) technology adds a new layer of complication for revenge porn victims.

AI technology has now evolved to the point that it can generate nude photos and videos of any person at the click of a mouse. So long as these images are created and shared without the subject’s consent, and for the purpose of harming the subject, they likely also constitute revenge porn.

This article will discuss the history of revenge porn, its evolving legal landscape and the potential legal implications of generating “fake” revenge porn or “deepfake pornography” with Artificial Intelligence technology.

A Brief History of Revenge Porn and Its Evolving Legal Landscape

The phenomenon of revenge porn started in the 1980s and became a widespread issue at the turn of the century, when cell phones and user-generated content (UGC) websites were widely adopted by the general public. As intimate image abuse spread, states began enacting criminal statutes and/or allowing victims to recover under existing civil causes of action. Nevertheless, laws differ from state to state, and revenge porn remains a relatively new phenomenon within the existing legal landscape.

As the laws evolve, so does technology. While revenge porn in the traditional sense started as image abuse between partners in an intimate relationship, it quickly evolved into a larger issue. Hackers began accessing the private devices of others, stealing intimate photos, threatening victims and publishing the photos on the internet. We now face a new development in the realm of revenge porn: nudes manufactured by Artificial Intelligence.

AI-generated nude images pose unique challenges when it comes to intimate image abuse. Specifically, does a nude image have to be “real” for a victim to recover? With new AI programs, a wholly or partially synthetic image can be created or manipulated to produce a convincing “nude” image of a real person, even though that person never actually posed for the explicit picture. While the image created may not be “real,” the technology is, and it will inevitably give rise to revenge porn litigation.

Using AI Technology to Manufacture Sexually Explicit Images

Recent advances in Artificial Intelligence and Machine Learning for image creation, analysis and manipulation have made it possible to manufacture sexually explicit images of recognizable individuals. These AI programs work by examining the billions of images available in databases (including on the World Wide Web) to learn the statistical properties of images. For the purposes of this article, there are three types of AI image generation programs:

1.) Image Generators that Generate Fully Synthetic Images

AI programs like Lensa work by feeding actual images of a person or object into an AI model and then generating new images from what the model has learned. A Lensa user would, for example, upload a dozen pictures of himself, and the program would use these images to generate stylized images of the subject as, say, an astronaut, movie star or cowboy. While programs like Lensa generate stylized images, they can also be used to generate photorealistic images. Thus, based on a scan of real images, an AI program could be asked to generate an image of Neil Armstrong walking with Orville Wright in downtown Cleveland. The images would be wholly synthetic but based on actual images. For these purposes, it is not necessary to discuss whether the resulting AI image would constitute a “derivative work” for copyright purposes. The question here is what happens when this type of program generates a nude or explicit image of an identifiable person.

2.) “Deepfake” Images

Related to the wholly synthetic image is the issue of deepfakes. In a deepfake, an AI program is “trained” by loading and analyzing pictures and videos of a reference target, like Tom Cruise or Nicolas Cage. Once the program is “trained,” a photograph or video of a subject is created, and the subject’s image is then “swapped” with the target’s image, producing a convincing blend: Cruise or Cage appearing in a movie they were never in or, in our case, the face of a potential revenge porn victim swapped into a convincing pornographic video. The technology has now advanced to the point that deepfake images can be generated in real time (for video conferencing and the like) and can be accompanied by deepfake voices. Thus, the resulting images and videos can be virtually indistinguishable from a genuine nude image or video of an identifiable person, with voice and context to match.

3.) “Nudified” Images

AI technology can be (and has been) used to “nudify” existing images. Indeed, there are dozens of free applications and websites that permit users to upload an image of a real person and, with one click, generate a new image that is identical to the original except that the individual’s clothing has been removed, resulting in a convincing “nude” image of that person. This capability raises a litany of potential legal issues: a fake nude image of any person can be manufactured and disseminated at the click of a button.

Can AI-Generated Nudity Be Prosecuted Under Revenge Porn Statutes in Ohio?

Ohio, like many other states, does not squarely address the issue of AI-generated nudity. Ohio’s criminal revenge porn statute, R.C. 2917.211, prohibits “revenge pornography” as a form of harassment. Specifically, R.C. 2917.211 makes it an offense to knowingly disseminate “an image of another person” who can “be identified from the image itself or from information displayed in connection with the image and the offender supplied the identifying information” when the person in the image is “in a state of nudity or is engaged in a sexual act,” “the image is disseminated without consent from the person in the image,” and the dissemination is done “with intent to harm the person in the image.”

But how would Ohio’s revenge porn statute apply to a scenario involving an AI-generated nude image? The legal issue becomes whether a synthetic image that appears to be a nude or sexually explicit image of a real person constitutes “an image of another person…” under the law. This is an issue that has yet to be decided.

Can AI-Generated Nudity Be Prosecuted Under Revenge Porn Statutes in Other States?

In most states, revenge porn claims are legally recognized when a victim can show that an image or video of the victim, identifiable as or attributed to an actual person, was improperly disseminated or sold with the intent to cause the victim some specific harm, and in fact caused that harm. Thus, the legal issue remains similar to the one raised above in applying Ohio’s criminal statute to deepfake content, potentially leaving victims with fewer criminal options for AI-generated image abuse.

Some revenge porn statutes specifically address the issue of artificially created nudes or pornography. For example, Virginia’s revenge porn statute, VA Code § 18.2-386.2, makes it a crime to maliciously disseminate a video or picture “created by any means whatsoever” that depicts another person who is nude. The statute specifies that “another person” includes a person whose image was used in creating, adapting or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person’s face, likeness or other distinguishing characteristic. Thus, if Titanic’s “Jack” disseminated his pencil drawing of “Rose” without her consent and for malicious purposes, he would violate Virginia’s revenge porn statute, even if (and maybe especially if) Rose never sat for the drawing and the nudity was a function of the artist’s pencil. Likewise, a “fake” nude image generated by AI technology would likely be criminally actionable in Virginia.

Unfortunately for the victims of AI-generated revenge porn, most states simply make it a crime to improperly disseminate an “image” or “video” which is identifiable and depicts nudity or sex acts, raising the possibility that a court might limit the application of the statute to “actual” images of the person depicted, rather than “virtual” images which appear to be of the person depicted but were, in fact, created by AI technology. But how would our scenario affect an AI revenge porn victim who is under the age of 18?

The Child Pornography Analogy

In 1996, Congress passed the Child Pornography Prevention Act to criminalize the dissemination of child pornography, which was defined in 18 U.S.C. § 2256(8) as including:

“Any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture … of sexually explicit conduct, where — (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is, or appears to be, of a minor engaging in sexually explicit conduct; (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct; or (D) such visual depiction is advertised, promoted, presented, described, or distributed in such a manner that conveys the impression that the material is or contains a visual depiction of a minor engaging in sexually explicit conduct.”

The statute was designed to deal with two related, but not identical, problems. The first was “virtual child porn”: computer-generated images that were indistinguishable from child pornography except that they were not visual depictions of actual children, and no actual children were forced to pose nude to create them. The second was nude images that had been manipulated or morphed to depict an “identifiable minor” as having engaged in sexually explicit activity. Here, it is important to note that, while obscenity (which, broadly defined, likely includes most child pornography) is illegal because of its impact on the viewer or on society in general, non-obscene child pornography (if such a thing exists) is unlawful because of its impact on the person depicted in the image itself. Because a real child was abused in the creation, and later dissemination, of the child pornography (now called Child Sexual Abuse Material, or “CSAM”), each act of creating, sharing, posting or disseminating the material constitutes a separate offense. However, in Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002), the Supreme Court struck down the statute’s “appears to be” and “conveys the impression” provisions (subsections (B) and (D)) as overbroad.

In Ohio v. Tooley, 872 N.E.2d 894 (2007), the Ohio Supreme Court distinguished Ashcroft’s requirement that a visual depiction portray an actual minor, noting that the Supreme Court in Ashcroft:

“… did not address Section 2256(8)(C), Title 18, U.S. Code, which prohibits a more common and lower tech means of creating fake images, known as computer morphing… Rather than creating original images, pornographers can alter innocent pictures of real children so that the children appear to be engaged in sexual activity. The Ashcroft court excluded this category from consideration, stating, ‘Although morphed images may fall within the definition of virtual child pornography, they implicate the interests of real children and are in that sense closer to the images in Ferber. Respondents do not challenge this provision, and we do not consider it.’”

The Tooley court found that, unlike synthetic child porn, “morphed” child pornography, which incorporates the use or creation of images of an identifiable minor, was not protected speech under the First Amendment. Other courts have reached similar results. This reflects the position that, unlike fully virtual CSAM, an image depicting an identifiable minor engaged in sexual activity harms that minor regardless of whether the image is real or virtual. In many respects, the minor is harmed even more by the virtual image, which can be manipulated to be even more graphic, and which creates the false impression that the minor participated in or “consented” to the creation of the images (despite minors’ obvious inability to so consent).

Indeed, in 2003, Congress passed the PROTECT Act, which, inter alia, amended 18 U.S.C. § 2256(8)(B) to prohibit the creation or dissemination of computer-generated child pornography where “such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct.” Thus, if AI technology is used to generate CSAM, it would likely be criminally actionable.

Civil Recovery for the Victims of AI-Generated Revenge Porn

Fortunately for the victims of AI-generated nudity, options for civil recovery would likely remain similar to those pursued by the victims of more traditional revenge porn. Revenge porn victims often bring tort claims for false light invasion of privacy, intrusion upon seclusion, publication of private facts, intentional/negligent infliction of emotional distress and/or defamation, among others. A victim of technology-created nude images may still be able to recover under many of the same theories; however, the defenses available to the creator of the content raise an additional issue.

In many jurisdictions, defendants in defamation or false light litigation can assert the defenses of satire or parody to avoid liability. Simply put, if an image is found to have been created as, and is understood by the average person to be, a joke, it may not be actionable. Because many deepfake pornography images tend to be over-dramatized, creators are likely to assert this defense. But whether AI deepfakes are created to hurt someone or as parody, these images clearly have the potential to cause immense harm to the subject.

The Impact of Revenge Porn on Victims

Like traditional revenge porn, AI-generated fake pornography is a growing issue that disproportionately affects women. According to an article in MIT Technology Review, a study by Sensity AI, which has tracked deepfake videos since December 2018, found that between 90% and 95% of those videos constituted nonconsensual pornography, and about 90% of that nonconsensual pornography targeted women.

It is important to note that revenge porn is more than simply posting naked images; it is an attack on an individual’s right to sexual privacy. Revenge porn is about control, sexual violence, harassment and threats. Intimate image abuse causes immense reputational damage, prevents victims from securing employment and supporting themselves financially, and invites others to stalk, threaten and perpetuate sexual violence against the victim.

Most criminal revenge porn statutes require proof of intent to harass, threaten or annoy, or at a minimum, knowledge that the actions will have that impact. The purposes behind these statutes suggest that it should make no difference whether the image depicted is actual or virtual, so long as it was created with some intent to harass, threaten or annoy the target. Nevertheless, how existing law will be applied to AI-generated “fake” nudes remains to be seen.

Conclusion

Revenge porn is a serious issue, further complicated by rapidly evolving technology. Thankfully, perpetrators can be traced, and revenge porn can be removed. If you are the victim of revenge porn and have questions about your legal options, or if you want to discuss the evolving nature of revenge porn in relation to artificial intelligence applications, please contact KJK attorneys Alexandra Arko (ALA@kjk.com) or Mark Rasch (MDR@kjk.com).