Protecting Creators and Platforms from Nonconsensual Content
By: Lawrence G. Walters, Esq.
Date: May 27, 2025
The Importance of Consent
Consent lies at the heart of adult content production and distribution. Adult industry participants, processors, banks, and hosts have a vested interest in ensuring that the recording and publication of any sexually explicit content is supported by informed consent. Industry standards for adult content production focus on obtaining and documenting voluntary consent from all participants in the production. Professional producers carefully screen performers for any signs of impairment or duress that may suggest a lack of consent to engage in sexual activity on film, or to authorize distribution of the content as agreed by the parties. The adult industry takes a strong and unequivocal stance against the creation or publication of nonconsensual materials.
Concerns about consent prompted MasterCard to release its Updated Guidelines for adult user-generated content sites in 2021. These Guidelines obligate any payment processor that accepts MasterCard payment transactions to ensure that its adult merchants require documented consent from content creators to the recording, publication, and downloading (if allowed) of explicit materials. Online platforms therefore routinely mandate the collection of written consent forms signed by all performers depicted in any uploaded content. By complying with industry standards and processor obligations in both the production and distribution of adult content, platforms dramatically reduce the likelihood that nonconsensual intimate images (NCII) will be created or distributed.
Developments in AI, Deepfakes, and Takedowns
Developments in technology, including artificial intelligence models, have allowed for the creation of realistic depictions of individuals engaging in sex acts that never occurred. So-called “deepfakes” are not created with the consent of the depicted individuals, even if the underlying materials were voluntarily recorded or published. Such NCII can cause reputational harm and emotional distress to the depicted individuals. Similarly, voyeuristic material depicting body parts that were not intentionally displayed to the public constitutes another category of NCII. Finally, NCII can arise when an individual consents to the creation of the imagery, and/or discloses it to a friend or partner, but does not consent to more widespread distribution. These variants of NCII create more difficult issues for both the individuals depicted and the platforms where the content might appear.
Importantly, an online platform may have no knowledge of any limitations on consent that an individual has imposed in connection with the creation or circulation of specific images or video. While collection of consent forms mitigates these concerns to an extent, consent to publication of some depictions may not authorize wholesale distribution of any content depicting the creator who signed the form into the indefinite future. In some instances, creators may seek to revoke prior written consent. Separately, images that were created with voluntary consent may be manipulated or altered to depict the creator in a way to which he or she did not agree. Fortunately, responsible online platforms promptly respond to abuse complaints asserting NCII concerns. The MasterCard Guidelines require that platforms publish a complaints policy guaranteeing such prompt resolution as a condition of continued processing. A log of all abuse complaints, and their resolution, must be maintained by the platform and shared with the processor. By promptly addressing NCII complaints, adult platform operators can reduce the potential harm of NCII distribution and maintain healthy relationships with their processors.
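The complaint-tracking obligation described above can be pictured as a simple record-keeping structure. The sketch below is illustrative only: the class names, fields, and methods are assumptions, since the MasterCard Guidelines prescribe what must be tracked (each complaint and its resolution, available to the processor), not any particular schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional

# Hypothetical record for one abuse complaint. Field names are
# illustrative, not drawn from the Guidelines themselves.
@dataclass
class AbuseComplaint:
    complaint_id: str
    content_url: str
    claim_type: str                    # e.g. "NCII", "copyright", "other"
    received_at: datetime
    resolved_at: Optional[datetime] = None
    resolution: Optional[str] = None   # e.g. "removed", "rejected"

class ComplaintLog:
    """Tracks complaints and their resolutions for processor review."""

    def __init__(self) -> None:
        self._records: dict[str, AbuseComplaint] = {}

    def open_complaint(self, complaint: AbuseComplaint) -> None:
        self._records[complaint.complaint_id] = complaint

    def resolve(self, complaint_id: str, resolution: str) -> None:
        # Record how and when the complaint was resolved.
        record = self._records[complaint_id]
        record.resolved_at = datetime.now(timezone.utc)
        record.resolution = resolution

    def export_for_processor(self) -> list[dict]:
        # The Guidelines require the log be shared with the processor;
        # a plain-dict export is one convenient representation.
        return [asdict(r) for r in self._records.values()]
```

In practice such a log would live in a database with audit trails, but the essential compliance point is the same: every complaint, its timestamps, and its outcome remain available for processor review.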
The abuse reporting process itself can be subject to abuse. Consider the scenario where a performer is paid for the release of rights to record and publish adult content, but later changes his or her mind. Rapid takedown of content labeled as NCII can injure legitimate content producers both financially and reputationally. Contract rights should be respected irrespective of whether the contract involves adult materials. A separate issue arises where a competitor or harasser seeks to harm a creator by taking down their lawful content from online platforms. Wrongful takedowns based on false claims of NCII can wreak havoc on creators and publishing outlets.
Legislation
Many states have laws prohibiting the recording or dissemination of NCII. Congress legislated in this arena in 2022 by passing the law now codified at 15 U.S.C. § 6851. This statute allows an individual to file a civil action for damages against any person or company who transfers, publishes, distributes, or otherwise makes accessible any intimate visual depiction of a person, knowing or recklessly disregarding the fact that the person did not consent to the depiction. The consent must be affirmative, conscious, and voluntary – free from force, fraud, misrepresentation, or coercion. Manipulated images are included within the ambit of this law, so long as an individual is identifiable by face, likeness, or other distinguishing characteristics, such as a tattoo or birthmark. Successful claimants can recover actual damages or liquidated damages in the amount of $150,000, in addition to costs and attorneys’ fees. This law is a powerful weapon that victims of NCII can use to seek justice. Recognizing that commercial model releases should remain enforceable, the statute notes that its prohibitions do not apply to an intimate image that is “commercial pornographic content” unless such content was produced by force, fraud, misrepresentation, or coercion. Given the broad protection afforded by Section 230, any claims asserted against online platforms in relation to user-generated content would likely be unsuccessful. However, individuals, producers, or even pay sites that produce or publish content alleged to be nonconsensual are potentially liable.
The TAKE IT DOWN Act
On May 19, 2025, President Trump signed the TAKE IT DOWN Act, which imposes criminal prohibitions on disclosure of (or threats to disclose) NCII. Offenses involving adults can result in up to 2 years in prison, while offenses involving minors carry up to 3 years. The fact that an individual consented to the creation of the underlying content, or consented to disclosure to another individual, does not establish consent to publication or distribution by a third party. Unlike the law allowing civil claims, this criminal law makes no specific exception for commercial pornography. However, violations are not triggered if the individuals depicted voluntarily exposed themselves in a public or commercial setting. Further, the law imposes a “notice and takedown” regime on covered online platforms, requiring publication of a clear and conspicuous policy detailing how reports of NCII can be submitted. If an NCII takedown notice contains the required information, such as identification of the location of the content, a physical or electronic signature, and a good faith statement that the content was published without the consent of the complainant, platforms must remove the content, along with all known copies of the depiction, within 48 hours of receipt. Unlike the DMCA, on which this law is seemingly patterned, there is no requirement that the statements in the takedown notice be sworn under penalty of perjury, and there is no provision allowing claims against those who abuse the takedown procedure. Failure to comply with the requirements applicable to online platforms is treated as an unfair or deceptive trade practice enforceable by the Federal Trade Commission. Platforms have 1 year from enactment to implement the required procedures.
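The notice elements and the 48-hour clock described above can be sketched as a simple intake check. The fragment below is a hypothetical illustration, not statutory text: the field names and validation logic are assumptions modeled on the elements listed above (content location, a physical or electronic signature, and a good faith statement of nonconsent).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# The Act's removal deadline runs from receipt of a complete notice.
TAKEDOWN_WINDOW = timedelta(hours=48)

# Hypothetical representation of a takedown notice; field names are
# illustrative, not the statute's wording.
@dataclass
class TakedownNotice:
    content_location: str        # where the depiction can be found
    signature: str               # physical or electronic signature
    good_faith_statement: str    # statement that publication was nonconsensual
    received_at: datetime

def is_complete(notice: TakedownNotice) -> bool:
    """A notice missing any required element need not start the clock."""
    return all([
        notice.content_location.strip(),
        notice.signature.strip(),
        notice.good_faith_statement.strip(),
    ])

def removal_deadline(notice: TakedownNotice) -> datetime:
    """The content, and all known copies, must be removed by this time."""
    return notice.received_at + TAKEDOWN_WINDOW
```

Note what this sketch cannot capture: nothing in the required elements lets a platform verify that the good faith statement is true, which is precisely why the absence of a perjury requirement and an abuse remedy matters.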
Numerous civil liberties groups have warned against the unintended consequences of this law and the threats of censorship posed by compliance with its requirements. Given the potential civil and criminal penalties the law triggers, platforms will likely respond with severe moderation of sexually explicit content to mitigate their risks. We saw this with the passage of FOSTA/SESTA, which criminalized online materials that promote or facilitate prostitution or contribute to sex trafficking: many platforms banned all sexually oriented content, and some service providers shut down completely in response. Similar censorship efforts can be expected in light of this new law. The requirement that a platform promptly remove any identified NCII, and all known copies, could pose an insurmountable burden, particularly on platforms offering encrypted messaging features. Again, the likely response would be to cease offering such features, which have become invaluable for private, secure online communication. The mandated 48-hour response time may be impractical for startup platforms that employ a small staff, thereby stifling online innovation. By failing to require that takedown notices include sworn statements, by omitting any appeal process, and by offering no method to punish malicious actors, the required takedown procedure invites abuse by frivolous claimants or even competitors. The lack of a specific exemption for commercial pornography compounds the potential for abusive claims. More broadly, the law criminalizes a new category of speech as though it were unprotected by the First Amendment, an approach the U.S. Supreme Court has rejected on several recent occasions.
Balancing Free Speech with NCII Enforcement
Restricting the recording and publication of NCII is a laudable goal that enjoys widespread support within the adult entertainment industry. State and federal laws provide numerous options for victims of this wrongful activity to seek redress in the courts. The ability to create deepfakes involuntarily depicting individuals in a state of undress or engaging in sexual activity creates new risks for creators and publishers that should be carefully evaluated. However, given the potential First Amendment concerns, any new criminal legislation in this area requires a scalpel, not a sledgehammer. Imposing penalties on platforms that inadvertently host NCII, or that fail to remove such content within a very short timeframe, creates a chilling effect on speech resulting in censorship of sexually oriented materials. Any legislation criminalizing the publication of NCII must make room for satire, political speech, and other forms of protected expression. An appeal provision should have been included to counter unfounded takedown requests. Laws like this must also recognize the practical limitations facing online intermediaries in identifying and removing such content. Finally, any such law should include a specific provision punishing abusers of the takedown process, to prevent misuse and the resulting harm to creators, publishers, and distributors. By failing to strike the necessary balance between protecting free speech and restricting NCII, lawmakers have invited censorship and abuse.
About the Author
Lawrence Walters heads up Walters Law Group and represents clients involved in all aspects of the adult entertainment industry. Nothing in this article is intended as legal advice. You can reach Mr. Walters through his website, www.firstamendment.com, or on social media @walterslawgroup.
Read more:
Removing Nonconsensual Adult Content