CA Digital Eraser Law’s Unresolved Ambiguities Make Compliance Tricky

Published On November 25, 2014 | By Melissa Maalouf | Uncategorized

With the fast-approaching compliance deadline of January 1, 2015 and no guidance from the California AG’s office, online and mobile service operators are still wondering if and how California’s new “Privacy Rights for California Minors in the Digital World” law will impact them. A recent BNA webinar focused on the new law and its many ambiguities and featured Drew Liebert, who has served on the CA Assembly Judiciary Committee for 25 years. The webinar’s panelists agreed that the new law is still a “work in progress” fraught with ambiguities that will make compliance—and enforcement—tricky. While many are looking to the federal Children’s Online Privacy Protection Act (“COPPA”) to fill in the gaps, a number of key distinctions between the two laws undermine this approach.

As previously discussed on our blog, the new law affects services with minor users, defined as under 18 years old, in two ways. It will (1) prohibit advertising a list of high-risk items (such as tobacco, firearms, drugs) to minors and (2) require website operators to permit registered minors to delete content they have posted.


Both provisions apply to services that are either “directed” to minors, or that have “actual knowledge” that minors are using their services. Unlike in COPPA, the definition of a website “directed to a minor” appears to be more subjective, and does not contain a list of factors to consider in making the determination. Specifically, under the CA law, a website is directed towards minors if it “is created for the purpose of reaching an audience that is predominately comprised of minors, and is not intended for a more general audience comprised of adults.” Not only is this definition unclear, but it also appears to look to the intended audience of a service, as opposed to who actually uses it. For example, if a website has content that the operator intended to attract users of all ages, but the site ends up being used primarily by users under 18, would that mean the law would not apply? Conversely, if a website has content aimed at an under-18 audience, but ends up being used mainly by adults, would that mean the law would apply? Unfortunately, during the legislative process, the only examples given of sites “predominantly comprised of minors” were sites like Sesame Street that would clearly be aimed at a very young audience. No guidance was given on how to determine whether a site is predominantly aimed at older children in their teens but not intended for a more general audience of all ages.

To Age Gate or Not to Age Gate?

The panelists also mentioned that because the law applies to services that are “directed to minors” or that have actual knowledge of under-18 users, many services that are not “directed to minors” may be dissuaded from asking users for their birthdays, so as to avoid obtaining actual knowledge and thus triggering the requirements of the law. This is another distinguishing factor from COPPA, under which a number of general audience websites currently choose to age-gate to prevent users under 13 from signing up, to avoid inadvertently collecting minors’ information.
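For operators that do choose to collect birthdays, the “actual knowledge” trigger reduces to a simple age check against the law’s under-18 threshold. The following is a minimal, hypothetical sketch of such a check; the function name and structure are illustrative only and not drawn from the statute or any particular service:

```python
from datetime import date
from typing import Optional

MINOR_AGE = 18  # California law defines "minor" as a user under 18

def is_minor(birthdate: date, today: Optional[date] = None) -> bool:
    """Return True if a user with this birthdate is under 18 as of `today`.

    A service that collects birthdates and sees this return True arguably
    has "actual knowledge" of a minor user under the CA law.
    """
    today = today or date.today()
    # Subtract one year if the birthday hasn't occurred yet this year.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age < MINOR_AGE
```

Note that, as the panelists observed, running a check like this at all is a policy decision: a general-audience service that never asks for a birthday may avoid “actual knowledge” entirely.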

Advertising Restrictions

The advertising restrictions are also arguably narrower than COPPA’s restrictions on the collection and use of “personal information.” Specifically, if a service has actual knowledge of a minor user, the operator cannot direct prohibited advertising to that minor “based upon information specific to that minor, including, but not limited to, the minor’s profile, activity, address, or location … and excluding IP address and product identification numbers.” (emphasis added). Under COPPA, IP addresses and other identifiers are considered “personal information” that cannot be used for behavioral advertising purposes. Therefore, in contrast to COPPA, it would appear that a service with actual knowledge that it has minor users could still advertise the list of risky content to such users, provided such advertising is done solely through the use of IP addresses or identifiers that are not “specific” to a minor. Since most websites tie IP addresses and other identifiers to personal information, or could easily do so, it is unclear how this will play out in practice.

Digital Eraser

Similarly, the “digital eraser” part of the law contains a number of ambiguities and features that make it different from COPPA. First, the common name for this section of the law, the “digital eraser,” is really a misnomer. This provision does not give a minor the right to permanently delete his/her data from a service. Rather, it allows a minor to remove a post’s public visibility on a service. An operator is under no obligation to permanently delete the data and can continue to store such data on its servers; thus, the information is not “erased.” Second, it is unclear whether the removal right would extend to a minor’s parent. COPPA allows parents to access, request deletion of, and control the online information of children under 13. In contrast, the removal right appears to be solely the minor’s right. Third, although the law requires operators of services aimed at minors to provide notice regarding their removal procedures, the law doesn’t delineate how to convey such notice. The panelists remarked, however, that notice buried in a privacy policy or terms of service may not be the best way to inform minors of their rights, and all agreed that additional guidance regarding the notice requirement is needed. Finally, in contrast to COPPA, which applies solely to “personal information” collected from a child, the CA law permits a minor to request removal of any content the minor posted, regardless of whether the content contained identifiable information. This broad removal right is subject to several exceptions, including its limitation to allowing removal of a minor’s original post, but not subsequent republication by others.
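Because the law requires only removal of public visibility, not permanent deletion, one plausible implementation is a visibility flag on stored posts rather than a delete operation. The sketch below illustrates this distinction; all names (`Post`, `PostStore`, `remove_visibility`) are hypothetical, and the permission check reflects the panelists’ reading that the removal right belongs to the minor alone:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_id: str
    body: str
    visible: bool = True  # controls public visibility only

class PostStore:
    """Illustrative store where "erasing" hides a post without deleting it."""

    def __init__(self) -> None:
        self._posts: dict = {}
        self._next_id = 1

    def create(self, author_id: str, body: str) -> int:
        post_id = self._next_id
        self._next_id += 1
        self._posts[post_id] = Post(author_id, body)
        return post_id

    def remove_visibility(self, post_id: int, requester_id: str) -> None:
        post = self._posts[post_id]
        # The removal right appears to be solely the minor's; the law does
        # not clearly extend it to parents or other users.
        if post.author_id != requester_id:
            raise PermissionError("removal right belongs to the original poster")
        # Hide from public view; the operator may still retain the data,
        # so the information is not truly "erased."
        post.visible = False

    def public_posts(self) -> list:
        return [p for p in self._posts.values() if p.visible]
```

Note that republications of the content by other users would be separate records with other authors, so a design like this also reflects the law’s limitation to the minor’s original post.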


Due to the law’s numerous ambiguities, the panelists agreed that it will take some time for operators to determine how to comply, especially in the absence of guidance. Mr. Liebert, however, stressed that there is no private right of action under the new law, and that, due to the law’s ambiguities, operators should not expect the California AG to come knocking on their door on January 1. Instead, if the CA AG’s office’s handling of other amendments to CA privacy laws is any indication, the industry can probably expect guidance documents to be issued before enforcement begins. That being said, services with actual knowledge of minor users, or those that believe their audiences are “predominately comprised” of users under 18, should determine how they can begin complying with the new law by January 1, 2015.

By Melissa Maalouf and Michaelene Hanley

Photo by Waag Society from Flickr

About The Author

Melissa Maalouf’s practice focuses on advising a broad range of clients, from start-ups to established companies, on both U.S. and international data privacy and security issues. Melissa assists clients in drafting appropriate website disclosures, implementing legally-compliant e-commerce flows, responding to FTC Section 5 and state AG enforcement actions, analyzing advertising claims, and children’s online privacy and safety issues. She also regularly helps clients obtain certification under the EU-US Safe Harbor and navigate compliance with divergent international privacy laws.