We have previously reported on the requirements of the California Age-Appropriate Design Code Act (CAADCA or the Act), including its mandatory risk assessments, which a federal district court enjoined as arguably violating publishers’ free speech rights under the First Amendment. The Ninth Circuit upheld that decision only with respect to the Act’s Data Protection Impact Assessment (DPIA) requirements, holding that those requirements are subject to strict scrutiny and are facially unconstitutional. See NetChoice, LLC v. Bonta (9th Cir. Aug. 16, 2024). A copy of the decision is available here. However, the Court declined to enjoin other provisions of the CAADCA, such as restrictions on the collection, use, and sale of minors’ personal data and requirements governing how data practices must be communicated. Today, we focus on the implications of the ruling for DPIA requirements under consumer privacy laws, including the 18 (of 20) state consumer privacy laws that require DPIAs for certain “high risk” processing activities.
The Ninth Circuit first concluded that “the DPIA report requirement clearly compels speech by requiring covered businesses to opine on potential harm to children.” Compelled disclosure, even of purely commercial information, is well established to trigger First Amendment scrutiny, and the Court concluded that because “in every case in which a covered business prepares a DPIA report for a given service, the business must ask whether the service could lead to children viewing or receiving harmful or potentially harmful materials,” a facial challenge, rather than merely an as-applied challenge, was warranted. The Court then applied strict scrutiny (the standard applicable to non-commercial, editorial, or expressive speech) rather than intermediate scrutiny (the standard generally applied to commercial speech) because “[t]he DPIA report requirements, by requiring covered businesses to opine on and mitigate the risk that children may be exposed to harmful content online, go beyond merely regulating commercial speech.” Applying strict scrutiny’s requirement that a regulation of protected non-commercial speech be the least restrictive means of achieving the asserted compelling interest in protecting children from harmful content, the Court found that “a disclosure regime that mandates the creation and disclosure of highly subjective opinions about content-related harms to children is unnecessary to foster a proactive environment in which companies, the State, and the public work together to protect children’s safety online. For example, the State could have developed a disclosure regime focused on data management practices and product designs, without reference to whether children may be exposed to harmful or potentially harmful content, or a proxy for content. Instead, the State seeks to indirectly censor the material available to children online….”
So what about DPIA requirements for data processing activities that do not affect the types of content a business restricts or makes available? While not directly at issue in NetChoice, the Court addressed an amicus argument that the district court’s invalidation of the CAADCA’s DPIA report requirements necessarily jeopardizes the same requirements in the CCPA and other U.S. privacy laws. The Court observed that the mandatory consumer-rights metrics reporting requirements under the CCPA regulations, applicable to businesses that process large amounts of personal information, are merely “duties to collect, retain, and disclose purely factual information,” unlike the CAADCA’s requirement that online services opine on whether children may be exposed to harmful or potentially harmful content online. The Court then turned to California Civil Code § 1798.185(a)(15)(B), which states:
(15) … Require businesses whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security to: … (B) Submit to the California Privacy Protection Agency on a regular basis a risk assessment with respect to their processing of personal information, including whether the processing involves sensitive personal information, and identifying and weighing the benefits resulting from the processing to the business, the consumer, other stakeholders, and the public, against the potential risks to the rights of the consumer associated with that processing, with the goal of restricting or prohibiting the processing if the risks to privacy of the consumer outweigh the benefits resulting from processing to the consumer, the business, other stakeholders, and the public.
Without directly addressing the specific requirements of § 1798.185(a)(15)(B), or the extensive and highly detailed DPIA requirements in the draft regulations implementing that provision, the Court wrote:
Such requirements might be unproblematic if the DPIA report requirement merely required businesses to measure and disclose to the government certain types of risks posed by their services. The problem here is that the risk businesses must measure and disclose to the government is the risk that children will be exposed to disfavored speech online. Thus, [amicus’] concern that the district court’s decision inevitably threatens other DPIA regimes across the country is misplaced.
The Court also considered the CAADCA’s prohibition on dark patterns (practices deemed to impair consumer notice and choice), rejecting a facial challenge because “it is far from clear” that such prohibitions should be reviewed as content-based restrictions rather than content-neutral regulations of speech. It reached a similar conclusion regarding the CAADCA’s obligation to provide clear, conspicuous, and understandable privacy notices, terms of service, and policies, finding those requirements factual in nature and subject to deferential review.
The Ninth Circuit took care to limit its decision to DPIAs that require businesses to assess and opine on content, and that effectively enlist them in censoring disfavored content on the government’s behalf. It nevertheless left room for challenges to more traditional DPIAs that require assessment and documentation of risky data processing activities, even where those requirements restrict the creation and dissemination of information based on the risks and impacts to data subjects. Such requirements arguably go beyond the “purely factual and uncontroversial” deferential standard of review that the Court suggested applies to the CCPA’s metrics reporting and privacy policy requirements. Assuming intermediate rather than strict scrutiny applies to DPIAs that do not affect content decisions, a challenged state would need to demonstrate that its general risk assessment requirements for data practices (1) directly advance a substantial government interest (such as protecting consumer privacy) and (2) are no more extensive than necessary to serve that interest. The Ninth Circuit suggested that such assessments might be unproblematic, and noted that its decision, and the district court’s NetChoice decision on which it is based, do not inevitably threaten other DPIA regimes across the country. That question will no doubt have to be resolved on an as-applied, law-by-law basis.
Meanwhile, 18 of the 20 states with consumer privacy laws (all but Iowa and Utah) require that completed DPIAs for “high risk” data processing be made available for regulatory inspection; California will require some form of yet-to-be-determined submission; and Minnesota goes further, requiring documentation of a written privacy program and a data inventory designed to ensure compliance with all aspects of its law.