User Generated Content Sites – Formula for Profit, or Recipe for Disaster?

By: Lawrence G. Walters, Esq.

www.FirstAmendment.com

Given the smashing success of YouTube™ and other video-sharing websites, it was inevitable that the adult industry would see a surge of similar business models involving adult material. Adult dating sites, ex-girlfriend sites, ‘tube’ sites, freak sites, community sites – you name it, and users are posting it. This online business method seems to be the latest rage in the industry, with about half of our firm’s clients looking to sue an adult tube site and the other half looking to open one. A major lawsuit has already been filed against the operators of PornoTube.com by powerhouse studio producer, Vivid Entertainment, asserting some interesting claims, including a novel take on the legal effect of § 2257 non-compliance.[1]

 

Given the viral spread of this new business and entertainment model, it is important to examine the legal risks associated with operating a user generated adult content website, and consider whether the legal risks outweigh the potential rewards. This article will delve into the varied legal concerns associated with online erotic video-sharing. As with all legal issues, an article is no substitute for competent legal advice. Before considering the operation of a user generated adult content site, it is essential to consult with an experienced adult industry attorney.

 

Section 2257 Issues
An obvious place to begin the discussion is with the complicated and nuanced issue of Section 2257 compliance. User generated website content is not, strictly speaking, produced by the operator. Rather, it is provided by the site’s users, who may, or may not, have created the content. Everything from erotic spousal activity, to shocking body modifications, to hidden bathroom cams, and even clips from unreleased, professionally-produced adult films, may appear on a typical user generated, adult content site. Most of this content is posted automatically to the website for immediate viewing by the site’s users, without prior screening or approval by the webmaster.[2] Naturally, the following question comes to mind: Does the operator of such a website need to comply with either the records keeping or labeling requirements of Title 18, U.S.C. § 2257?

 

As with most 2257 records keeping questions, the issue of compliance comes down to whether the business operation in question “produces” the actual sexually-explicit content. For purposes of this discussion, the term “produces,” as defined in Title 18, U.S.C. § 2257(h)(2)(A), includes:

digitizing an image, of a visual depiction of sexually-explicit conduct; or, assembling, manufacturing, publishing, duplicating, reproducing, or reissuing a…digital image…[and],

 

inserting on a computer site or service a digital image of, or otherwise managing the sexually-explicit content, of a computer site or service that contains a visual depiction of, sexually-explicit conduct.

The term “produces” does not include activities that are limited to:

…digitization of previously existing visual depictions, as part of a commercial enterprise, with no other commercial interest in the sexually-explicit material

 

* * *

 

Distribution

 

* * *

 

[or]

 

the transmission, storage, retrieval, hosting, formatting…of a communication, without selection or alteration of the content of the communication, except that deletion of a particular communication or material made by another person in a manner consistent with 230(c) of the Communications Act of 1934 (47 U.S.C. § 230(c)) shall not constitute such selection or alteration of the content of the communication.

 

Therefore, the relevant consideration comes down to whether the operation of a video-sharing site constitutes digitization of an image, assembly of an image, publication of an image, or managing the sexually-explicit content of a computer site on the one hand; or whether such operation will be considered digitization of a previously-existing visual depiction, mere distribution of the content, or the transmission, storage, retrieval, hosting, or formatting of a communication, on the other hand. Thus far, the courts have not addressed whether the current definitions of compliance-triggering activities, contained in § 2257, apply to the operation of a user generated content website.

 

In light of the Sixth Circuit’s decision striking down § 2257[3], records keeping compliance issues have taken somewhat of a back seat to other legal concerns, given the widespread industry perception that neither 2257 inspections nor prosecutions will occur until the constitutional issues are sorted out in court. Notwithstanding the accuracy or inaccuracy of this perception, the lull in 2257 activity has provided a unique opportunity for proliferation of user generated content sites.[4]

 

The trial court’s decision in the above-referenced case involving Connection Distributing[5] held that the operation of a “swingers” classified website, which allowed users to post sexually-explicit images of themselves on profile pages, did not require records keeping compliance so long as the website operator did not control the areas of the site where the users posted the 2257-triggering content.[6] While this decision was ultimately superseded by the Sixth Circuit Court of Appeals’ opinion declaring § 2257 unconstitutional on its face, the trial court’s opinion provides some insight into how this issue might be interpreted by the courts.

 

The mechanics of the content submission process can significantly impact whether a video-sharing site is exposed to § 2257 compliance burdens. While a final determination whether to comply rests with the site’s operator, in consultation with its attorney, the following factors might be considered in making this important decision:

  1. Whether the content is reviewed and approved by the operator prior to its appearing on the site, or whether it is posted automatically (or some combination thereof);
  2. Whether specific categories of content are solicited for, or permitted on, the site, or whether all content types are accepted;
  3. Whether any content subjects or categories are deleted after posting, and how decisions relating to deletion of content are made;
  4. Whether users are restricted in the manner in which content is submitted; i.e., whether profiles or restrictive forms are used, or whether content can be provided in a free form manner;
  5. Whether the site is promoted as featuring specific subjects, or offers more general-interest erotic fare;
  6. Whether any ‘seed’ content is included by the operator, and the impact of this content on the ultimate nature of the content submitted by users.

Generally, the less control the operator has over the posting of content by the user, the less likely the site will need to comply with 2257. However, this lack of control comes with a price, since the site operator will therefore be unable to prevent potentially illegal images or video from appearing – even temporarily – on the site. This heightens some of the other legal concerns such as child pornography, obscenity, or copyright infringement, as discussed below.
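The control-versus-exposure tradeoff described above can be illustrated with a short sketch. This is a hypothetical model, not any real platform’s system: the `Policy` names, the `Site` class, and its methods are all invented for illustration. The point is structural: an auto-post policy means problematic material can appear, at least briefly, before anyone can act on it, while a pre-review policy interposes the operator in a way that may look like “managing” the content.

```python
# Hypothetical sketch of the posting-policy tradeoff discussed above.
# All names here (Policy, Site, submit, takedown) are illustrative,
# not drawn from any real platform or statute.

from dataclasses import dataclass, field
from enum import Enum

class Policy(Enum):
    AUTO_POST = "auto"     # live immediately; operator exerts minimal control
    PRE_REVIEW = "review"  # operator screens first; more control, more exposure

@dataclass
class Site:
    policy: Policy
    live: list = field(default_factory=list)
    queue: list = field(default_factory=list)

    def submit(self, clip: str) -> str:
        if self.policy is Policy.AUTO_POST:
            # Appears instantly -- even an illegal clip is visible until removed.
            self.live.append(clip)
            return "live"
        # Held until a human reviews it -- but the review itself may
        # look like "assembling" or "managing" the content.
        self.queue.append(clip)
        return "queued"

    def takedown(self, clip: str) -> None:
        # After-the-fact deletion; it does not erase the fact that
        # the clip was once visible to users or investigators.
        if clip in self.live:
            self.live.remove(clip)

site = Site(Policy.AUTO_POST)
print(site.submit("clip001"))  # -> live
```

The sketch makes the Catch-22 concrete: neither branch of `submit` is risk-free, which is why the choice belongs with the operator and counsel rather than with a default setting.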

 

Even if the webmaster’s activities do not fall within the definition of “produces” in the Statute, so as to trigger records keeping obligations, they may meet the definition of the term “distribution.” While distributors are not subjected to records keeping obligations under the Statute, they are required to ensure that a proper “label” appears on the 2257-triggering content. From the standpoint of a user generated content website, this means that the webmaster would need to ensure that each depiction of actual sexually-explicit conduct is accompanied by a proper 2257 disclosure statement, identifying the full name and physical address of the records custodian along with the date and title of the work. Given the natural hesitation of many amateur content producers to provide information about themselves when posting sexually-explicit materials, non-compliance by users is virtually guaranteed if webmasters are considered distributors. The labeling issue therefore presents a significant concern.

 

The legal battle over § 2257’s validity is far from over, since the Government is seeking further judicial review of the Sixth Circuit’s opinion. Thus, it is too early to discount the potential impact of 2257 on user generated content sites. The courts could reinstate the current version of 2257 by subsequent decision, or Congress could act to correct the constitutional concerns noted by the court. While passing new laws may lead to new challenges, it bears noting that the appellate court’s decision in the Connection Distributing case came after more than 12 years of litigation and repeated constitutional challenges. Given the prevalent hostility toward the adult industry that currently exists in Washington, D.C., it is likely that some sort of performer age verification law will be on the books for years to come.

 

With respect to the current Statute, there are valid arguments on both sides with respect to its applicability to user generated content. On the one hand, website operators often act to digitize images hosted on user generated sites – even if the process is automated. Erotic images are “inserted” on these user generated content sites by both the user and the operator, and webmasters have some role in “managing” the content of the website — if only to “assemble” or “arrange” the categories or profiles.

 

On the other hand, the visual depictions posted to the site are “previously existing” and usually not created by the operator. Moreover, the webmaster’s activities may well be limited to activities like “transmission, storage, retrieval, hosting, and/or formatting” of the preexisting images, so as to fall within the plain meaning of the records-keeping exemption language. Notwithstanding the above, it remains an open question whether website transmission constitutes ‘distribution’ within the meaning of Section 2257. If so, labeling is required, at a minimum. Obviously, risks of non-compliance exist. Therefore, a decision can only be intelligently made in consultation with an adult industry lawyer, and must be based on the operator’s unique risk tolerance level.

 

Obscenity & Child Pornography
One of the strengths of the user generated content business model is the fact that the webmaster need not purchase or create content – instead it comes free of charge from the users. This strength is also one of the business model’s great weaknesses.

 

Since the content comes directly from users, the webmaster has no opportunity to make decisions as to what kind of content will appear on the site in question. While other adult website operators only purchase or create content that falls within their own risk tolerance levels, user generated content can (and often does) depict just about anything under the sun. This includes content that might be considered obscene or – worse yet – child pornography.

 

With respect to obscenity, the inability to review and approve each image or video clip before it is posted to the website means that some content that the webmaster may not like ends up on the website – at least for some period of time. The operator can delete the content – either in response to user complaints or its own review process – without losing any exemption that would otherwise be applicable, pursuant to § 2257(h)(2)(B)(v).[7] However, removing the content after it has been posted does not change the fact that it may have appeared on the website at one time, and may have been viewed or downloaded by claimants or government agents. One obvious answer is to review and authorize all content before it goes live to the website. In theory, this is a good option, but it has both legal and practical drawbacks. Initially, as a site’s popularity increases, so does the amount of content posted to it at any given time. The manpower necessary to review each second of every video clip or every image posted to the website may be cost-prohibitive. Moreover, the decision to review all content ahead of time may impact the viability of any claimed exemption under § 2257, as well as the immunity from civil suits provided by § 230 of the Communications Decency Act (discussed later). Pre-selection of acceptable content may well put the webmaster in the position of “assembling” or “managing” the sexually-explicit content, and thus trigger § 2257 obligations. The impact of any particular posting policy on these legal issues should be properly evaluated by the operator and the legal department. However, the foregoing helps illustrate the “Catch-22” facing many user generated content website operators, since both auto-post and content review policies come with associated legal pros and cons.

 

One of the key advantages of 2257 compliance is the almost automatic defense to child pornography claims. The child pornography issues associated with user generated content are serious. Given the lack of 2257 performer records, website operators will usually be in the position of being unable to prove the age of individuals depicted on their user generated content website. No records mean no proof of age and possibly no defense to child pornography charges.

 

Prosecutors in the federal system use something called the “Tanner Scale” to prosecute individuals for child pornography charges, particularly where the actual birth date of the individual depicted in the images is unknown. The Tanner Scale allows prosecutors to call a pediatrician to the stand to testify regarding such factors as breast development, presence/absence of pubic hair,[8] and maturity of the inner thigh tissue, when reviewing images of suspected child pornography, to make prognostications about the suspected age of the individual depicted. Therefore, federal prosecutors need only secure the testimony of a friendly pediatrician “expert” who is willing to testify that the individual depicted on the user generated post appears to be approximately 16-17 years of age, based on these factors. While the operator may ultimately win his or her criminal trial on issues of reasonable doubt, etc., by that time most of the serious consequences of a child pornography prosecution have been experienced, and the victory is quite hollow. Therefore, child pornography risks constitute one of the major drawbacks of the user generated content business model.

 

A strict review policy may be necessary to weed out any even arguably underage images, or obscene material, to avoid these serious consequences. However, the details and mechanics of how such a policy is instituted will affect other issues such as § 2257 compliance and § 230 immunity.

 

Copyright
User generated content sites are copyright minefields. Much of the material posted on both the mainstream and adult-oriented user generated sites is clearly infringing. Some of the larger content producers and website operators blame their declining revenues on the widespread availability of their stolen content on ‘tube’ sites. The adult industry has taken steps to combat this rampant piracy, and is organizing in an effort to present a unified front against these infringers.[9]

 

While the posters of infringing content are directly liable for copyright infringement, they are often penniless individuals sitting in their basements playing on the Internet. Even the RIAA would pass on the opportunity to sue most of them. The only deep pocket here is the website operator. That leads to the following question: Can user generated content sites be sued for vicarious or indirect copyright infringement for allowing routine use of their services to display copyrighted material without a license? The answer to that vexing question will likely come from the courts in the case filed by Viacom against YouTube.[10] The legal issues are thorny. Ordinarily, websites which merely allow others to post material online, without any other interest in, or selection of, the content of the material posted, can argue that they are protected by the ‘safe harbor’ provisions of the Digital Millennium Copyright Act (DMCA).[11] If safe harbor applies, the site cannot be held liable for damages in a copyright case. Before DMCA safe harbor can be asserted, the website must take certain steps to perfect its status as a protected site, including designation of an agent for receipt of copyright notices, posting of a Notice and Takedown policy, and filing the proper forms with the U.S. Copyright Office. The website must also properly respond to any DMCA notices it receives, in order to maintain safe harbor protection. Repeat infringers must be terminated, or the site could lose its safe harbor arguments.[12]
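The notice-handling and repeat-infringer bookkeeping that safe harbor requires can be sketched in a few lines. This is a hypothetical model under stated assumptions: the statute requires terminating “repeat infringers” but does not define how many strikes make one, so the `STRIKE_LIMIT` value, the `DMCAAgent` class, and its method names are all illustrative policy choices, not legal rules.

```python
# Hypothetical DMCA bookkeeping sketch. The repeat-infringer threshold
# (STRIKE_LIMIT) is an assumed operator policy -- 17 U.S.C. § 512(i)
# does not define "repeat infringer" numerically.

from collections import defaultdict

STRIKE_LIMIT = 3  # illustrative policy choice, not a statutory number

class DMCAAgent:
    """Receives takedown notices and applies a repeat-infringer policy."""

    def __init__(self):
        self.strikes = defaultdict(int)  # notices received, per uploader
        self.terminated = set()

    def receive_notice(self, uploader: str, work: str) -> str:
        # Step 1: remove or disable access to the identified work
        # (the removal mechanics are elided in this sketch).
        self.strikes[uploader] += 1
        # Step 2: terminate the accounts of repeat infringers.
        if self.strikes[uploader] >= STRIKE_LIMIT:
            self.terminated.add(uploader)
            return "terminated"
        return "removed"

agent = DMCAAgent()
agent.receive_notice("user42", "clip1")
agent.receive_notice("user42", "clip2")
print(agent.receive_notice("user42", "clip3"))  # -> terminated
```

The design point is simply that the policy must be tracked and actually enforced; a written takedown policy with no record of strikes or terminations behind it does little to preserve the safe harbor argument.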

 

But, what if the website’s user-posting technology is routinely used as a device to disseminate infringing materials? Can copyright liability be imposed under those circumstances? A similar argument was made by the mainstream movie studios against the Sony Corporation in the days of the Betamax video recorder.[13] They alleged that the device’s primary purpose was to facilitate duplication of copyrighted movies and TV shows, and that the company should therefore be held liable. The Supreme Court disagreed, and concluded that the Betamax VCR had substantial non-infringing uses, such as time-shifting broadcasts for later home viewing or playing home movies.

 

However, when the Courts considered the Napster[14] and Grokster[15] cases, involving the downloading and copying of mp3 music files, the websites lost. In those cases, the Court found that the primary purpose of both systems was to infringe on copyrights, despite any lawful uses they might have had. In the Grokster case, the Court observed that the device was intentionally marketed to the public as a means to download and trade mainstream music files, which otherwise enjoyed copyright protection.[16]

 

The outcome of the Viacom case against Google’s YouTube.com site will be governed by the legal principles established by Sony, Napster and Grokster. Good arguments can be made either way, and much will depend on the copyright policing and protections undertaken by YouTube.com. Accordingly, future cases may be dependent on the specific facts relating to the operating policies of the sites in question. To the extent that efforts are made to protect copyright holders’ rights, that will sit well with the courts when DMCA safe harbor is asserted. For now, operators of user generated content sites take a risk when allowing users to upload copyrighted material.

 

Trademark
The trademark issues are similar to, but not identical to, the copyright issues. Trademark issues usually arise, in this context, when a trademark owner seeks to hold a website operator responsible for trademark infringement as a result of the webmaster’s involvement in display of the protected mark on the website. Where the webmaster intentionally uses content containing the trademarked word or phrase in a commercial manner, the liability issues are clearer. However, where the webmaster merely creates an online venue allowing third party users to post information without the operator’s prior review or approval, the liability for infringement is less certain. Unlike with copyright claims, Congress has not created a safe harbor allowing website operators – or even true ISPs – to avoid liability, as exists with the DMCA notice procedure. Lawmakers apparently overlooked potential trademark liability when designing the DMCA safe harbor, thus creating something of a “no-man’s land” of liability when protected marks are improperly included in user generated content posts.

 

The author has defended hosts, and others, against trademark claims resulting from user generated, or customer generated, content. Concepts of fair use may come into play when the marks are not prominently featured in the content, or only a passing reference is made to them. However, some companies adopt an aggressive enforcement policy when it comes to any unauthorized display of their marks on websites, thus creating a potential liability concern for operators of user generated content sites. The law has not yet developed to any point of certainty. Accordingly, liability resulting from unauthorized publication of protected trademarks on user generated content sites remains a potential area of concern for operators.

 

Online Agreements, Terms & Conditions
Some of the legal concerns referenced above can be mitigated substantially by proper implementation of a good set of User Terms & Conditions. Members authorized to post content to an adult-oriented website should be constrained by a specific set of policies governing the type of content that is acceptable, and the grounds for suspension or termination of the user’s account. It goes without saying that uploading of obscene material and child pornography must be categorically prohibited by the website’s user agreement. However, the website operator may want to adopt more specific policies as to the type of sexual material that is authorized to be posted to the website. Some operators will restrict depictions of certain fetish practices or depictions of violence, mutilation, amputation, menstruation, bodily fluids, and other distasteful topics. Other operators will try to avoid § 2257 liability by prohibiting any content containing actual sexually-explicit conduct. None of these depictions will automatically be considered obscene, since the obscenity determination depends on a number of factors, including the local community standards where the case is brought. Certainly, the website operator is free to exclude any type of material that the operator believes will pose an inordinate risk to his or her business operation. However, care should be taken to avoid being so selective as to result in a loss of § 230 immunity, loss of DMCA safe harbor, or imposition of § 2257 obligations, as discussed above. Once the site’s policies are adopted, they should be enforced consistently through a meticulous content review procedure. It does little good to adopt strong content posting guidelines which result in little or no actual enforcement activity.

 

The member terms for a user content website must also focus on taking advantage of the immunities provided by the Communications Decency Act[17] (“CDA”) and the DMCA safe harbor. Section 230 of the CDA provides immunity to certain websites against claims based on the content of messages created by third parties and posted on those websites. Websites protected by Section 230 will be immune from claims like defamation, negligence, infliction of emotional distress, false light, invasion of privacy, etc.[18] The website operator is permitted to delete certain content posted by third parties from the website, which is believed to be obscene, indecent, defamatory, or otherwise illegal, without losing the immunity protection, under the so-called “Good Samaritan” provisions of the Statute. A well-written set of User Terms can outline the nature of this protection, and advise all users of the existence of the immunity protection against claims. At the same time, the Terms can outline the site’s Good Samaritan removal policy. Relatedly, the Terms & Conditions should include the “Notice and Takedown Policy” referenced above, to protect the site’s DMCA compliance efforts. This policy must include the name and contact information for the website’s DMCA Agent, who is appointed to receive and process copyright infringement notices. Done correctly, the inclusion of this information can help protect against damages claims resulting from copyright infringement. Finally, the User Terms should adopt some sort of age verification policy and procedure.[19] Of course, user generated content sites need all of the other legal goodies like Privacy Policies, Age Verification, Affiliate Agreements, SPAM Policies, etc. Cutting edge legal documents are essential for all adult-oriented websites, but given the increased potential for legal claims arising out of the often uncontrollable content submitted by users, all forms of legal protection become even more important.
Needless to say, user generated content website operators will be thankful for all the protection that legal agreements can offer, in the event a claim arises.

 

Conclusion
As can be seen from the foregoing, the legal issues associated with user generated content sites are numerous, unsettled, and interrelated. Given the relatively recent popularity of this particular business model, little law exists to specifically guide operators or their lawyers. However, legal decisions involving similar websites can be consulted in an effort to predict how the law will develop. Thus far, online companies that merely provide a venue or system for others to communicate on the Internet have been treated surprisingly favorably by the courts. Therefore, decisions relating to services such as hosts, ISPs, and chat rooms have tended to come down on the side of the service provider. However, as webmasters blur the line between access provider and content provider, the courts will be forced to take a closer look at how far the law should go in imposing liability on the operator for content submitted by users. The more involvement that the operator has in the ultimate selection or arrangement of content displayed, or the manner in which it is displayed and promoted, the more likely it is that the operator will be subjected to the ordinary liability of a content provider. The outcome of the Viacom and Vivid Entertainment cases will significantly impact the law in this area, as the first cases to interpret these cutting edge issues. Until then, operators should diligently educate themselves as to the potential legal concerns, and work with trained professionals in an effort to reduce liability to reasonably tolerable levels.

 

Lawrence G. Walters, Esquire, is a partner with the law firm Walters Law Group. Mr. Walters represents clients involved in all aspects of the adult industry. The firm handles First Amendment cases nationwide, and has been involved in much of the significant Free Speech litigation before the United States Supreme Court over the last 45 years. All statements made in the above article are matters of opinion only, and should not be considered legal advice. Please consult your own attorney on specific legal matters. You can reach Lawrence Walters at [email protected], www.FirstAmendment.com or AOL Screen Name: “Webattorney.”

[1] Vivid Entertainment LLC v. Data Conversions, Inc. et al., Case No. 2:2007cv08023 (C.D. Cal. Dec. 10, 2007), wherein the Plaintiff claims that Pornotube.com’s failure to comply with § 2257 results in unfair competition with respect to all the other websites that are required to comply.
[2] Sometimes all of the content is pre-screened by the operator; however this practice may impact some of the legal issues, as discussed below.
[3] Connection Distributing Co. v. Keisler, 505 F.3d 545 (6th Cir. 2007).
[4] It should be noted, however, that failure to comply with 2257 requirements could result in civil claims based on unfair competition, as have been asserted in the lawsuit by Vivid Entertainment against Pornotube.com. See Vivid Entertainment LLC, supra.
[5] Connection Distributing Co. v. Gonzalez, Case No. 1:95CV1993 (N.D. Ohio 2006).
[6] The case did not decide whether the “labeling” obligations were triggered by this activity, however.
[7] Some deletion policies may be so broad as to result in “management” or “assembly” of the content, so caution is urged when developing a deletion policy.
[8] Given current industry grooming trends, this factor is becoming increasingly irrelevant.
[9] Bourne, Justin, “‘Piracy Roundtable’ Offers Solutions From Producers,” AVNOnline.com (Jan. 16, 2008), which can be viewed at: http://www.avn.com/index.cfm?objectID=839050DA-AAC2-F716-33386ED5B50B24EC.
[10] Viacom Int’l, Inc. v. YouTube, Inc., et al., Case No. 1:07-cv-02103-LLS (S.D. N.Y. March 13, 2007).
[11] Digital Millennium Copyright Act, 17 U.S.C. § 512. Notably, not all user generated websites will enjoy DMCA safe harbor, depending on their level of control over the content posted to the site. See: Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 489 F.3d 921 (9th Cir. 2007).
[12] 17 U.S.C. §512(i)(1)(A).
[13] Sony Corp. of America v. Universal City Studios, Inc., 464 U.S. 417 (1984).
[14] A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).
[15] MGM Studios, Inc. v. Grokster, Ltd., 545 U.S. 913 (2005).
[16] Id. at 913.
[17] Communications Decency Act of 1996, 47 U.S.C. §§ 223, 230.
[18] E.g., Doe v. American Online, 783 So.2d 1010 (Fla. 2001).
[19] For an example of such a procedure, see, www.BirthDateVerifier.com.