General

9th Circuit Limits Protections Afforded by Communications Decency Act

Published: Jun. 03, 2016

Updated: Oct. 05, 2020

The Communications Decency Act (“CDA”) generally protects website operators against claims arising from user-generated content. Echoing its withdrawn 2014 decision on the same issues, the Ninth Circuit’s ruling in Jane Doe v. Internet Brands, Inc. again limits the protections afforded by the CDA. The court held that the CDA does not preclude a “failure to warn” negligence suit against a website operator that allegedly knew criminals were using its website to identify victims for sexual assault but did not notify users.

Internet Brands owns ModelMayhem.com, a social and professional networking website for the modeling industry. Jane Doe, who posted a profile on the site to pursue modeling jobs, alleged that Lavont Flanders and Emerson Callum used the site to lure her to a fraudulent modeling audition. The men drugged and raped Doe and recorded the assault for distribution as a pornographic film. Doe alleged that Internet Brands knew that Flanders and Callum were using ModelMayhem.com to identify victims for their rape scheme. Doe argued that Internet Brands was liable for negligence under California’s duty-to-warn law, which requires a person with a “special relationship” to either the potential victim or the possible perpetrator of foreseeable harmful conduct to affirmatively warn the at-risk party.

In 2011, the district court held that Jane Doe’s action was barred by Section 230 of the Communications Decency Act. Under Section 230, a website operator cannot be treated as the publisher or speaker of user-generated content and is therefore shielded from tort liability arising from material uploaded or posted by users. CDA immunity has been widely credited as an important factor in the emergence and success of websites through which users can connect, interact, and exchange content (such as social networking sites, marketplaces, forums, and video/image sharing platforms). In February 2014, the Ninth Circuit overturned the district court and ruled that Doe’s negligence lawsuit could proceed; however, that decision was subsequently vacated and the case was re-argued.

Upon rehearing, Internet Brands contended that the Ninth Circuit viewed CDA’s protections too narrowly and ignored its own precedent. An amicus coalition (including the Computer and Communications Industry Association, the Internet Association, and Facebook) further argued that the decision threatened the Internet industry and could discourage website operators from “policing harmful third-party activity because doing so may create knowledge of a particular risk that could be used against the company in litigation.”

In this new decision, the Ninth Circuit disagreed. It found the CDA inapplicable because the negligence suit did not seek to hold Internet Brands liable as the publisher or speaker of user-generated content. Much of the analysis in the new opinion was identical to the Ninth Circuit’s 2014 ruling, and the court undertook no analysis of the merits of Doe’s claim.

The Ninth Circuit’s revised opinion does little to address industry questions regarding a website operator’s potential obligation to warn users of a known or possible third-party threat, or to assuage concerns over the chilling effects of narrowing CDA protections. The court suggested that an alleged obligation to warn could be satisfied without altering user-generated content or conducting a detailed investigation, musing that operators could “perhaps… post[] a notice on the website or… inform[] users by email.” However, other questions remain. When does a duty to warn materialize? Could an anonymous tip be enough to establish the duty? With corroboration, and by whom? If there is a duty, how and to whom must the warning be communicated? The court also did not address when a “special relationship” exists between a website and its users, or whether a duty to warn could incentivize operators not to monitor user activity. The answers to these questions will likely be elucidated only through subsequent litigation; for now, website operators developing policies to regulate user content and conduct must do so in an uncertain legal landscape.