By Katie Bond and Samuel Butler

The FTC recently announced an enforcement action involving generative artificial intelligence (AI). The most interesting part: it hardly involves AI at all. There is no alleged misuse of AI, nor even an allegation that AI was actually used. Rather, the case is a business opportunities case.

The FTC alleges that three individuals and several inter-related businesses claimed to be able to help consumers launch lucrative e-commerce stores on Amazon.com and Walmart. But, according to the FTC, promises of significant earnings proved untrue for most consumers who had paid $10,000 to $125,000 for the services. The FTC alleges that these practices constituted deceptive advertising and violated the Business Opportunity Rule, among other laws and regulations. The FTC has already obtained a temporary restraining order in the case, and is now seeking preliminary and permanent injunctions, as well as civil penalties. So what does this all have to do with AI?

The advertising for the services allegedly included claims like, “We’ve recently discovered how to use AI tools for our 1 on 1 Amazon coaching program, helping students achieve over $10,000/month in sales,” and “That is how you make $6000 net profit and that is how you find a product in 5 minutes using AI, Grabbly, Priceblink.” 

The mere mention of AI turned a fairly ordinary business opportunity case into an AI case. "AI" made it into the headline of the FTC press release, and at least some mainstream media outlets have reported on the case, coverage normally reserved for high-profile FTC enforcement actions like those against Meta or Amazon.

The lesson here: all eyes (including regulators' eyes) are on AI. Any novel use of AI in business must be carefully vetted, as must any mention of AI in advertising.

By Sheila Millar and Tracy P. Marshall

How should companies respond to and report data security breaches nationally? What cybersecurity practices and procedures reflect current best practices? Two federal agency actions provide new rules and guidance and show that the cybersecurity landscape is changing. First, the U.S. Securities and Exchange Commission (SEC) adopted new rules earlier this month that will (among other things) require publicly traded companies to disclose "material" cybersecurity incidents on SEC Form 8-K within four business days and make certain cybersecurity disclosures. Second, the National Institute of Standards and Technology (NIST) recently released its latest Cybersecurity Framework, which now includes a section on corporate governance. Cybersecurity issues are directly related to environmental and social governance (ESG) reporting issues and are increasingly important to businesses from a compliance and governance standpoint. The new SEC requirements have garnered industry criticism, and industry organizations are seeking a delay in the September 5, 2023, effective date. Read the full article here.

By Sheila Millar, Antonia Stamenova-Dancheva, and Anushka N. Rahman

On July 13, 2023, a three-judge Ninth Circuit panel denied Google’s challenge of its earlier decision in Jones v. Google, which held that state privacy law claims in a putative class action are not preempted by the federal Children’s Online Privacy Protection Act (COPPA). The December decision reversed a lower court’s dismissal of the action on the grounds that COPPA preempted identical state law claims. Google petitioned the Ninth Circuit to have the case reheard by the full court, and the panel asked the Federal Trade Commission (FTC) to weigh in on the preemption question. In May, the FTC submitted an amicus brief in support of the Ninth Circuit’s finding that COPPA does not preclude identical state law claims. The panel’s decision affirms its December opinion and amends it to note the FTC’s support. 

Click here to read more.

By Sheila Millar and Tracy P. Marshall

The Children’s Online Privacy Protection Act Rule (COPPA Rule) requires that online sites and services directed to children under 13 obtain parental consent before collecting or using children’s personal information and lists existing methods for such consent. Now the Federal Trade Commission (FTC) is seeking comments on whether it should expand its parental consent methods to include a potential new mechanism. On July 20, 2023, the FTC published a notice in the Federal Register seeking comments on an application from the Entertainment Software Rating Board (ESRB) and tech companies Yoti and Super Awesome that proposes using facial age estimation technology that analyzes the geometry of the face to confirm a person is an adult. The deadline for comments is August 21, 2023.

To read the full article, click here.

By Sheila Millar and Ales Bartl

On March 30, 2022, the European Commission (EC) unveiled a proposal for a framework eco-design regulation aimed at creating a policy framework for sustainable products. Among the tools proposed by the EC is the EU Digital Product Passport (DPP), a product-specific data set that would apply to nearly all non-food products sold in the EU and would require disclosure of a vast array of information, much of it currently deemed confidential business information. Through DPPs, both competent authorities and users across the supply chain will have access to information, including a product's origin, materials, sustainability, and recyclability, via a scannable QR code. DPPs are intended to promote circularity and economic growth, help consumers make sustainable choices, and improve enforcement. If adopted as envisaged in 2024, the DPP framework would likely enter into force in 2027. In addition to environmental and product safety considerations, intellectual property rights, usefulness of the data, and privacy and security are all important issues for affected companies to consider.

To read the full article, click here.

By Sheila Millar and Tracy P. Marshall

After an extended public comment period, the Federal Trade Commission (FTC) adopted revised Guides Concerning the Use of Endorsements and Testimonials in Advertising (Endorsement Guides) on June 29, 2023. As we previously posted, the FTC voted to publish proposed revisions for public comment in May 2022. The updated Endorsement Guides and companion FAQs, which include 40 new questions, are intended to provide more specific guidance for companies that engage third parties to promote their brands, products, and services, or which encourage consumers or their own employees or agents to do so. The revisions better reflect the ways companies advertise now, and they address issues such as online influencers, social media tools, fake reviews, virtual or fabricated endorsers, and children’s advertising.

To read the full article, click here.

By Peter Craddock

What kinds of processing are necessary for the performance or conclusion of a contract?

This is one of the questions the Court of Justice of the European Union (CJEU) was asked to examine in case C-252/21 between Meta Platforms and the German Federal Cartel Office, in which it delivered a judgment on July 4, 2023.

Before we look at the judgment, it is useful to recall that the General Data Protection Regulation (GDPR) allows the processing of personal data to be based on "contract" as a legal ground (as opposed to other grounds such as legitimate interests or consent). The European Data Protection Board has repeatedly referred to the need for an "objective link" between that processing and the contractual framework, and a controller must demonstrate such necessity, in accordance with its accountability obligation.

This case specifically examined whether certain processing activities were effectively justified by "contract" as a legal ground in the context of the provision of an online social media service.

The CJEU held that this necessity must be demonstrated, and that the criterion is that the processing must be "objectively indispensable." In its reasoning, however, the CJEU made an unusual factual assessment regarding personalised services, comments that may have far-reaching implications and create significant uncertainty.

It is worthwhile quoting key excerpts to show the CJEU’s reasoning:

  • “98. […] in order for the processing of personal data to be regarded as necessary for the performance of a contract, within the meaning of that provision, it must be objectively indispensable for a purpose that is integral to the contractual obligation intended for the data subject. The controller must therefore be able to demonstrate how the main subject matter of the contract cannot be achieved if the processing in question does not occur.”
    • This means, in practice, not only that the contract could not be performed without the processing, but also that internal documentation is required to support reliance on "contract" as a legal ground.
  • “99. The fact that such processing may be referred to in the contract or may be merely useful for the performance of the contract is, in itself, irrelevant in that regard. The decisive factor for the purposes of applying the justification set out in point (b) of the first subparagraph of Article 6(1) of the GDPR is rather that the processing of personal data by the controller must be essential for the proper performance of the contract concluded between the controller and the data subject and, therefore, that there are no workable, less intrusive alternatives.”
    • This suggests that controllers can establish necessity by showing that “less intrusive alternatives” are not workable.

So far, so good. These paragraphs of the CJEU’s judgment show that it is possible to properly justify reliance on “contract” as a legal ground if the service description is not artificial and there are objective reasons to build a service in a particular manner.

However, a little further on, the CJEU adds a very significant caveat to this reasoning by offering its own factual analysis of "personalisation":

  • “102. As regards, first, the justification based on personalised content, it is important to note that, although such a personalisation is useful to the user, in so far as it enables the user, inter alia, to view content corresponding to a large extent to his or her interests, the fact remains that, subject to verification by the referring court, personalised content does not appear to be necessary in order to offer that user the services of the online social network. Those services may, where appropriate, be provided to the user in the form of an equivalent alternative which does not involve such a personalisation, such that the latter is not objectively indispensable for a purpose that is integral to those services.”
    • The CJEU's role is to rule on how EU law should be interpreted; it normally uses the facts of a case purely as context, in order to understand the questions referred to it. This particular paragraph, however, contains an opinion on the facts themselves: in the CJEU's view (and it was likely provided extensive background on the facts), content personalisation is not objectively indispensable to the provision of "the services of the online social network." Given the moral authority of the CJEU, it may be difficult for a national judge (referenced through the wording "subject to verification by the referring court") to reach the opposite conclusion. That is what makes this paragraph unusual.

Beyond being unusual, this paragraph raises significant questions for other controllers who might rely on "contract" in the context of personalised services. After all, if personalisation of a social media service is not objectively indispensable in the CJEU's view, what is? The statement also appears to contradict the CJEU's own position that the absence of workable, less intrusive alternatives establishes necessity: in our experience, businesses (like Meta and all others) do not randomly choose to offer a service in a personalised or non-personalised manner; there are normally objective internal reasons for disregarding or moving away from a particular business model. Yet the CJEU seems to suggest, without any obvious justification, that a non-personalised social media service is in any event workable. In this context, the paragraph appears unfortunate, as it creates, in our view, a risk that supervisory authorities (whether on their own initiative or spurred on by complaints) and courts might conclude, without apparent justification, that an alternative a controller disregarded or left behind (for valid reasons) is in fact workable. This may even happen to controllers who have built a service as a personalised one from the very beginning.

If anything, this ruling shows the need to carefully consider documentation and the justification for using “contract” as a legal ground.

The judgment is available online in multiple languages.

For any questions on data protection issues or on how to document necessity of processing, reach out to Peter Craddock or any other member of the Keller and Heckman LLP data law team.

By Peter Craddock

Until now, fines imposed by the Belgian Data Protection Authority (BDPA) have, compared with those in neighbouring countries (France, Luxembourg, and the Netherlands), appeared on the low side in absolute numbers.

Last year we carried out an analysis of over 300 fines related to (alleged) infringements of the General Data Protection Regulation (GDPR), including the top 250 fines imposed on companies with an identified or identifiable turnover; Belgium ranked 18th among EU data protection authorities by average fine amount among the fines examined.

A judgment of 14 June 2023 of the Belgian Market Court (the division of the Court of Appeal of Brussels) may have the indirect effect of significantly changing this.

That judgment followed an appeal by a controller (in this case, bpost, the largest Belgian postal services company) against a EUR 10,000 fine. The Market Court has often overruled BDPA decisions on procedural grounds, as well as on the merits, i.e., the actual assessment of alleged infringements, but in this particular case it confirmed the BDPA's decision in both respects.

It nevertheless followed the controller's argument that the fine itself was not properly justified and reduced the fine to a symbolic one euro.

Preliminary point: do GDPR fines have to be paid even pending an appeal?

In Belgium, the tax authorities are the ones who send a request for payment to a controller or processor fined by the Belgian Data Protection Authority, and the procedure they follow is wholly separate from the appeals process.

In addition, the law establishing the Belgian Data Protection Authority does not provide for an automatic stay of enforcement in the event of an appeal. Since a Market Court judgment that we obtained in September 2020, it has nevertheless been possible to obtain a stay of enforcement, including of the payment of fines, while an appeal against a BDPA decision is pending, although the Market Court has refined its approach over the years and imposes strict conditions.

In this particular case, the text of the Market Court judgment shows that the fine was paid, and reimbursement was requested.

Why was the fine reduced, and what was the Market Court’s reasoning?

The Market Court explains its reasoning as follows in its judgment (rough translation from the original Dutch):

"The Market Court tries to detect which methodology the Litigation Chamber [of the BDPA] applies to render objective its choice of sanction, including the amount of any fines.

The Market Court agrees with [the relevant controller] that the Litigation Chamber has in a manifestly insufficient manner taken into account, in the determination of the amount of the fine, the specific situation and context […] and the following mitigating circumstances.”

The Market Court goes on to list a range of circumstances that should have been taken into account when assessing the fine, including the fact that the Data Protection Officer’s advice had been sought and the fact that no damages were claimed by data subjects.

Based on that, the Market Court held that the data protection fine was not "properly" justified, either from a factual or a legal perspective.

What does this mean for the future – a new methodology for GDPR fines?

The Litigation Chamber of the Belgian Data Protection Authority has, over time, improved its decision-making process to take into account all of the criticisms from the Market Court, with more detailed decisions and a more balanced process as a result.

In this case, because the Market Court said that it was “[trying] to detect” which methodology was used and that the fine itself was not “properly” justified, it is likely that the Belgian Data Protection Authority will reflect on how to improve the clarity of its methodology for determining which sanction to apply and for determining the amount of a fine.

This could be achieved in one of two ways: by publishing its current methodology, or by adopting one that is already public, such as the methodology finalised on 24 May 2023 by the European Data Protection Board (EDPB), the group of all supervisory authorities within the European Union.

What is the EDPB fining methodology?

The EDPB issues recommendations and guidelines, as well as binding decisions in cross-border cases where there is a disagreement among the supervisory authorities involved in a case.

In its Guidelines 04/2022 on the calculation of administrative fines under the GDPR, as finalised in May 2023, the EDPB proposed the following methodology for calculating GDPR fines:

  1. Identification of the processing operations in the case and evaluation of the application of Article 83(3) GDPR
  2. Identification of the starting point for further calculation of the amount of the fine (by evaluating the classification of the infringement in the GDPR, evaluating the seriousness of the infringement in light of the circumstances of the case, and evaluating the turnover of the undertaking)
  3. Evaluation of aggravating and mitigating circumstances related to past or present behaviour of the controller/processor, and increasing or decreasing the fine accordingly
  4. Identification of the relevant legal maximums for the different infringements (increases applied in previous or next steps cannot exceed this maximum amount)
  5. Analysis of whether the calculated final amount meets the requirements of effectiveness, dissuasiveness, and proportionality, and adjusting the fine accordingly (without exceeding the relevant legal maximum)

Step 2, in particular, takes the form of a mathematical formula. We published DeFine, a tool to help apply the 2022 version of the formula (from when the guidelines were still subject to public consultation and not yet finalised), and we will be updating it in the coming weeks to reflect certain increases introduced in the finalised 2023 version.
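As an illustration, the five steps above can be sketched as a simple calculation. The step structure mirrors the guidelines, but the seriousness percentage, the size scaling factor, and the example figures below are simplified assumptions for illustration only, not the EDPB's actual tables; only the statutory caps in Article 83(4)-(5) GDPR are taken from the regulation itself.

```python
# Illustrative sketch of the EDPB five-step fining methodology
# (Guidelines 04/2022). Percentages and scaling below are assumptions.

ART_83_5 = {"cap_fixed": 20_000_000, "cap_pct": 0.04}  # EUR 20M or 4% of turnover
ART_83_4 = {"cap_fixed": 10_000_000, "cap_pct": 0.02}  # EUR 10M or 2% of turnover

def calculate_fine(turnover, tier, seriousness_pct, size_factor, adjustment):
    # Step 4 (computed first because later steps reference it):
    # the legal maximum for the infringement category.
    legal_max = max(tier["cap_fixed"], tier["cap_pct"] * turnover)

    # Step 2: starting point as a seriousness percentage of the legal
    # maximum, scaled for the size of the undertaking (assumed factor).
    start = seriousness_pct * legal_max * size_factor

    # Step 3: aggravating (>1) or mitigating (<1) circumstances.
    adjusted = start * adjustment

    # Step 5: final check, never exceeding the legal maximum
    # (effectiveness, dissuasiveness, proportionality assessed here).
    return min(adjusted, legal_max)

# Example: EUR 500M turnover, serious Art. 83(5) infringement assessed
# at 20%, mid-size scaling of 50%, mitigation reducing the fine by 25%.
fine = calculate_fine(500_000_000, ART_83_5, 0.20, 0.50, 0.75)
print(f"EUR {fine:,.0f}")  # EUR 1,500,000
```

The point of the sketch is structural: because the starting point is a percentage of the legal maximum, any increase in the percentages used at Step 2 mechanically raises fines across the board.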

Based on our aforementioned assessment of GDPR fines, use of this methodology would likely lead, throughout the European Union, to higher GDPR fines, purely because the percentages for the “starting point” of the calculation are already higher than those applied in practice by supervisory authorities.

In practice, therefore, adoption of the EDPB methodology would likely trigger (much) higher GDPR fines in Belgium.

Would this not happen anyway?

Since the adoption of the finalised EDPB guidelines, the Belgian Data Protection Authority has already referenced them in a recent decision (available in French) when assessing which mitigating and aggravating circumstances must be taken into account. It is therefore possible that the Belgian Data Protection Authority would, in any event, have applied the EDPB fining methodology in future fining decisions.

In that context, the Market Court judgment of 14 June 2023 may end up being an additional trigger that accelerates adoption by the Belgian Data Protection Authority of the EDPB fining methodology.

What should I do if my company or organisation is under investigation?

In practice, organisations facing regulatory investigations into alleged GDPR infringements, whether in Belgium or elsewhere, always have to prepare their legal defence well. The adoption of a new methodology (or publication of an existing one) merely reinforces the need for a team to support you, both internally (in-house legal team, data protection specialists, product teams, communications team) and externally (external legal counsel), in handling such an investigation.

And make sure that you are prepared to challenge the newly adopted methodology, too!

If you require assistance in this respect, or with any data governance, AI governance, or technology law issues, reach out to Peter Craddock or our Data & Tech team.

Where can I find the new judgment of the Market Court?

The Market Court judgment of 14 June 2023 is available online in Dutch.

By Sheila Millar and Anushka N. Rahman

On May 23, 2023, the U.S. Federal Trade Commission (FTC) held a public workshop to examine recyclable claims as part of its review of the Guides for the Use of Environmental Marketing Claims (Green Guides). The workshop was split into three panels, discussing current trends, consumer perception, and potential updates to the Commission’s current guidance on recyclable claims.

While recyclable claims for aluminum, paper, glass, and plastics were all discussed during the workshop, the bulk of the discussion focused on the recycling of plastic waste. Panelists addressed several topics, including the following:

  • Should the “substantial majority” basis for an unqualified recyclable claim be changed?
  • Should ability to be recycled or actual reprocessing be considered?
  • Should the Green Guides recognize advanced recycling technologies for plastics?
  • What is the role of the resin identification code (RIC)?
  • Should the FTC engage in a rulemaking to create federal requirements for recyclable claims?

It remains to be seen if information from the workshop will result in changes to the Green Guides or if the FTC elects to initiate a rulemaking to make the Guides binding, but the latter seems unlikely. We provide more details regarding topics discussed during the workshop here.

By Sheila Millar, Antonia Stamenova-Dancheva, and Anushka N. Rahman

On May 3, 2023, the U.S. Consumer Product Safety Commission (CPSC) issued a provisional order with a $15.8 million civil penalty against Wisconsin-based Generac Power Systems, Inc. (Generac) over charges that for more than two years, Generac failed to report serious hazards caused by some of its portable generators. According to the CPSC, from October 2018 to 2020, Generac was aware of defects in 32 models of its portable generators. During that time and prior to Generac reporting to the CPSC in 2020, five consumers reported suffering finger amputations caused by unlocked handles on the generators. On July 29, 2021, a recall of the portable generators was jointly announced by the company and CPSC.

Section 15(b) of the Consumer Product Safety Act (CPSA) requires manufacturers of consumer products to report to the CPSC defects that could create a substantial product hazard. Section 19 of the CPSA makes it unlawful to delay such reporting, and companies that fail to comply can be liable for both civil and criminal penalties. In addition to the fine, the settlement agreement requires Generac to implement and maintain a detailed compliance program and system of internal controls to ensure compliance with the CPSA. Generac must report to the CPSC annually for three years on the actions the company has taken to ensure compliance and must retain CPSC compliance-related records for at least five years. The Commission vote to approve the provisional settlement agreement was 4-0, despite disagreement among commissioners over the penalty amount.

Commissioner Richard Trumka warned that "this Commission will use every tool at its disposal to stop bad actors from harming consumers, including maximum civil penalties and, where warranted, criminal referrals." He was joined in his support for the penalty amount by Chair Alexander Hoehn-Saric and Commissioner Mary Boyle. Commissioner Peter Feldman, while also voting to approve the settlement, disagreed with the amount of the penalty, which was close to the maximum statutory amount ($120,000 for each violation, and $17,150,000 for any related series of violations). He suggested that imposing the maximum fine in failure-to-report cases should be reserved for cases "where a product hazard results in death, poses a significant risk of death from incidents such as fires, or where there are aggravating factors such as a history of misconduct by the company's senior management." Commissioner Feldman also voiced concern about what he considers a lack of consistency in CPSC's civil penalty structure. He noted that the case did not involve fatalities, and Generac was a first-time offender. He wrote, "A reasonable reading of the evidence in this case could support a conclusion that the initial reporting delay was born out of a failure to appreciate the nature of the hazard rather than a concealment of the problem from CPSC."
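For context, the statutory cap works as simple arithmetic: per-violation amounts accumulate until the cap for a related series of violations binds. A minimal sketch using the figures quoted above (the function name is ours, and actual penalty determinations weigh many statutory factors beyond counting violations):

```python
# Cap mechanics for CPSA civil penalties, using the quoted figures:
# $120,000 per violation, capped at $17,150,000 for any related series.

PER_VIOLATION_MAX = 120_000
RELATED_SERIES_MAX = 17_150_000

def max_civil_penalty(violations: int) -> int:
    """Largest penalty available for a related series of violations."""
    return min(violations * PER_VIOLATION_MAX, RELATED_SERIES_MAX)

# The series cap begins to bind at 143 violations (143 * 120,000 > cap):
print(max_civil_penalty(100))  # 12,000,000
print(max_civil_penalty(143))  # 17,150,000
```

Seen this way, Generac's $15.8 million penalty sits just under the series cap, which underscores Commissioner Feldman's point about near-maximum fines in a case without fatalities.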

Indeed, CPSC's penalty calculus has remained somewhat of a mystery in recent failure-to-report cases. For example, on January 25, 2022, CPSC issued a provisional order fining fitness equipment manufacturer Core $6.5 million for failing to immediately disclose 55 incidents tied to defects in Core's cable crossover machines between 2012 and 2017. While 11 of those incidents involved head lacerations, none resulted in death. More than five months later, on July 5, 2022, the CPSC issued a provisional order fining portable fan and heater maker Vornado $7.5 million (less than half of Generac's civil penalty) after Vornado did not immediately notify the CPSC of multiple consumer reports of overheating and fire involving its VH101 Personal Vortex electric space heater, including one fire that allegedly resulted in the death of a 90-year-old man.

One possible explanation for CPSC’s decision to seek a much higher (and near maximum) penalty in the Generac matter is that the Commission is making good on Chair Hoehn-Saric’s warning following the Vornado settlement that “while the penalty announced today is significant, companies should be on notice that the agency will be even more aggressive in the future.” Facts, of course, do vary, and there is considerable subjectivity in decisions about whether a product has a “defect” that could create a potential safety risk. The stakes involved in these decisions, however, appear to be increasing.