California Governor Gavin Newsom recently signed into law AB1305, another in the line of bills reflecting California’s efforts to tackle climate change. AB1305 amends California’s Health and Safety Code to require certain disclosures from companies that make claims such as “carbon neutral,” “net zero,” and the like in reliance on voluntary carbon offsets (VCOs). The law applies to business entities selling VCOs in California; those operating in California and relying on California VCOs for any claims, regardless of where the claims are made; and those operating in California and making VCO claims within the state. For more details on the key disclosure requirements and potential questions arising from the new law, click here.
FTC Proposes New Rule Targeting “Junk Fees”
“Service fees.” “Convenience fees.” Whatever a business calls them, consumers don’t like them, and neither does President Biden. The President has repeatedly pledged to end the practice of imposing what he calls “junk fees.” The Federal Trade Commission (FTC) has now issued a new proposed rule (proposed Rule) requiring more transparency in the imposition of these fees. Click here for the full article.
CA Court Blocks Enforcement of Controversial Children’s Online Privacy Law
On September 18, 2023, the United States District Court for the Northern District of California granted NetChoice, a tech industry umbrella group, a preliminary injunction barring California Attorney General Rob Bonta from enforcing the California Age-Appropriate Design Code Act (CAADCA). The court found that the CAADCA, which was slated to take effect on July 1, 2024, likely violates the First Amendment, marking a major win for online businesses at this initial stage of the litigation. To read more about this important development, click here.
The FTC and the Mere Mention of AI
The FTC recently announced an enforcement action involving generative artificial intelligence (AI). The most interesting part: it hardly involves AI at all. There is no alleged misuse of AI, nor even an allegation that AI was actually used. Rather, it is a business opportunity case.
The FTC alleges that three individuals and several interrelated businesses claimed to be able to help consumers launch lucrative e-commerce stores on Amazon.com and Walmart. But, according to the FTC, promises of significant earnings proved untrue for most consumers, who had paid $10,000 to $125,000 for the services. The FTC alleges that these practices constituted deceptive advertising and violated the Business Opportunity Rule, among other laws and regulations. The FTC has already obtained a temporary restraining order in the case and is now seeking preliminary and permanent injunctions, as well as civil penalties. So what does this all have to do with AI?
The advertising for the services allegedly included claims like, “We’ve recently discovered how to use AI tools for our 1 on 1 Amazon coaching program, helping students achieve over $10,000/month in sales,” and “That is how you make $6000 net profit and that is how you find a product in 5 minutes using AI, Grabbly, Priceblink.”
The mere mention of AI turned a fairly ordinary business opportunity case into an AI case. “AI” made it into the headline of the FTC press release, and at least some mainstream media outlets have reported on the case, coverage normally reserved for high-profile FTC enforcement actions, like those against Meta or Amazon.
The lesson here: all eyes (including regulators’ eyes) are on AI. Any novel use of AI in business must be carefully vetted, as must any mention of AI in advertising.
SEC Finalizes New Data Breach Reporting Rule; NIST Releases Cybersecurity Framework 2.0
How should companies nationwide respond to and report data security breaches? What cybersecurity practices and procedures reflect current best practices? Two federal agency actions provide new rules and guidance and show that the cybersecurity landscape is changing. First, the U.S. Securities and Exchange Commission (SEC) adopted new rules earlier this month that will (among other things) require publicly traded companies to disclose “material” cybersecurity incidents on SEC Form 8-K within four business days and make certain other cybersecurity disclosures. Second, the National Institute of Standards and Technology (NIST) recently released its latest Cybersecurity Framework, which now includes a section on corporate governance. Cybersecurity issues are directly related to environmental, social, and governance (ESG) reporting issues and are increasingly important to businesses from a compliance and governance standpoint. The new SEC requirements have garnered industry criticism, and industry organizations are seeking a delay of the September 5, 2023, effective date. Read the full article here.
Ninth Circuit Affirms State Privacy Law Claims Not Preempted by COPPA
On July 13, 2023, a three-judge Ninth Circuit panel denied Google’s challenge to its earlier decision in Jones v. Google, which held that state privacy law claims in a putative class action are not preempted by the federal Children’s Online Privacy Protection Act (COPPA). That December decision reversed a lower court’s dismissal of the action, which had rested on the ground that COPPA preempted identical state law claims. Google petitioned the Ninth Circuit to have the case reheard by the full court, and the panel asked the Federal Trade Commission (FTC) to weigh in on the preemption question. In May, the FTC submitted an amicus brief supporting the Ninth Circuit’s finding that COPPA does not preclude identical state law claims. The panel’s decision affirms its December opinion and amends it to note the FTC’s support.
Click here to read more.
FTC Seeks Comments on Proposed Facial Age Mechanism under COPPA
The Children’s Online Privacy Protection Act Rule (COPPA Rule) requires that online sites and services directed to children under 13 obtain parental consent before collecting or using children’s personal information, and it lists existing methods for obtaining such consent. Now the Federal Trade Commission (FTC) is seeking comments on whether it should expand those methods to include a potential new mechanism. On July 20, 2023, the FTC published a notice in the Federal Register seeking comments on an application from the Entertainment Software Rating Board (ESRB) and tech companies Yoti and SuperAwesome that proposes using facial age estimation technology, which analyzes the geometry of the face, to confirm that a person is an adult. The deadline for comments is August 21, 2023.
To read the full article, click here.
EU Seeks Input on Proposed Digital Product Passport Framework
On March 30, 2022, the European Commission (EC) unveiled a proposal for an eco-design regulation aimed at creating a policy framework for sustainable products. Among the tools proposed by the EC is the EU Digital Product Passport (DPP), a product-specific data set that would apply to nearly all non-food products sold in the EU and would require disclosure of a vast array of information, much of it currently deemed confidential business information. Through DPPs, both competent authorities and users across the supply chain would have access, via a scannable QR code, to information including a product’s origin, materials, sustainability, and recyclability. DPPs are intended to promote circularity and economic growth, help consumers make sustainable choices, and improve enforcement. If adopted in 2024 as envisaged, the DPP framework would likely enter into force in 2027. In addition to environmental and product safety considerations, intellectual property rights, usefulness of the data, privacy, and security are all important issues for affected companies to consider.
To read the full article, click here.
FTC Publishes Updated Endorsement Guides
After an extended public comment period, the Federal Trade Commission (FTC) adopted revised Guides Concerning the Use of Endorsements and Testimonials in Advertising (Endorsement Guides) on June 29, 2023. As we previously posted, the FTC voted to publish proposed revisions for public comment in May 2022. The updated Endorsement Guides and companion FAQs, which include 40 new questions, are intended to provide more specific guidance for companies that engage third parties to promote their brands, products, and services, or that encourage consumers or their own employees or agents to do so. The revisions better reflect the ways companies advertise today, addressing issues such as online influencers, social media tools, fake reviews, virtual or fabricated endorsers, and children’s advertising.
To read the full article, click here.
Contract as Legal Ground? New CJEU Ruling Creates Risks Re Personalisation
What kinds of processing are necessary for the performance or conclusion of a contract?
This is one of the questions the Court of Justice of the European Union (CJEU) was asked to examine in Case C-252/21, a dispute between Meta Platforms and the German Federal Cartel Office, in which it delivered its judgment on July 4, 2023.
Before we look at the judgment, it is useful to recall that the General Data Protection Regulation (GDPR) allows the processing of personal data to be based on “contract” as a legal ground (as opposed to, e.g., legitimate interests or consent). The European Data Protection Board has repeatedly referred to the need for an “objective link” between the processing and the contractual framework, and a controller must be able to demonstrate such necessity in accordance with its accountability obligation.
This case specifically examined whether certain processing activities were effectively justified by “contract” as a legal ground in the context of the provision of an online social media service.
The CJEU held that this necessity must be demonstrated, and that the criterion is that the processing must be “objectively indispensable.” In its reasoning, however, the CJEU made an unusual factual assessment regarding personalized services – comments that may have far-reaching implications and may create significant uncertainty.
It is worth quoting key excerpts to show the CJEU’s reasoning:
- “98. […] in order for the processing of personal data to be regarded as necessary for the performance of a contract, within the meaning of that provision, it must be objectively indispensable for a purpose that is integral to the contractual obligation intended for the data subject. The controller must therefore be able to demonstrate how the main subject matter of the contract cannot be achieved if the processing in question does not occur.”
- This means, in practice, not only that the contract could not be performed without the processing, but also that internal documentation is required to support “contract” as a legal ground.
- “99. The fact that such processing may be referred to in the contract or may be merely useful for the performance of the contract is, in itself, irrelevant in that regard. The decisive factor for the purposes of applying the justification set out in point (b) of the first subparagraph of Article 6(1) of the GDPR is rather that the processing of personal data by the controller must be essential for the proper performance of the contract concluded between the controller and the data subject and, therefore, that there are no workable, less intrusive alternatives.”
- This suggests that controllers can establish necessity by showing that “less intrusive alternatives” are not workable.
So far, so good. These paragraphs of the CJEU’s judgment show that it is possible to properly justify reliance on “contract” as a legal ground if the service description is not artificial and there are objective reasons to build a service in a particular manner.
A little further on, however, the CJEU adds a very significant caveat to this reasoning by offering its own factual analysis of “personalisation”:
- “102. As regards, first, the justification based on personalised content, it is important to note that, although such a personalisation is useful to the user, in so far as it enables the user, inter alia, to view content corresponding to a large extent to his or her interests, the fact remains that, subject to verification by the referring court, personalised content does not appear to be necessary in order to offer that user the services of the online social network. Those services may, where appropriate, be provided to the user in the form of an equivalent alternative which does not involve such a personalisation, such that the latter is not objectively indispensable for a purpose that is integral to those services.”
- The CJEU’s role is to rule on how EU law should be interpreted; it normally uses the facts of a case purely as context for understanding the questions referred to it. This paragraph, however, contains an opinion on the facts themselves: in the CJEU’s view (and it was likely provided extensive background on the facts), content personalisation is not objectively indispensable to the provision of “the services of the online social network.” Given the moral authority of the CJEU, it may be difficult for the national judge (referenced through the wording “subject to verification by the referring court”) to reach the opposite conclusion. That is what makes this paragraph unusual.
Beyond being unusual, this paragraph raises significant questions for other controllers who might rely on “contract” in the context of providing personalised services. After all, if personalisation of a social media service is not deemed objectively indispensable by the CJEU, what is? The statement also appears to contradict the CJEU’s own position that the absence of workable, less intrusive alternatives demonstrates necessity. In our experience, businesses (like Meta and all others) do not randomly choose to offer a service in a personalised or non-personalised manner; there are normally objective internal reasons for disregarding or moving away from a particular business model. Yet the CJEU seems to suggest that a non-personalised social media service is, in any event, workable, without any obvious justification for this position. In this context, the paragraph appears unfortunate: it creates, in our view, a risk that supervisory authorities (whether on their own initiative or spurred on by complaints) and courts might consider, without apparent justification, that a particular alternative a controller has disregarded or left behind (for valid reasons) is in fact workable. This may even happen to controllers who have built a service as a personalised service from the very beginning.
If anything, this ruling shows the need to carefully consider documentation and the justification for using “contract” as a legal ground.
The judgment is available online in multiple languages.
For any questions on data protection issues or on how to document necessity of processing, reach out to Peter Craddock or any other member of the Keller and Heckman LLP data law team.