By Sheila Millar, Tracy Marshall, and Liam Fulling

As expected, Congress’ renewed focus on expanding protections for minors online has resulted in legislative developments that attempt to mitigate harms while adhering to the Constitution’s free speech and preemption parameters. Last month, updates to both the Kids Online Safety Act (KOSA) and the Children’s Online Privacy Protection Act (COPPA) 2.0 bills were released with increased support from members of Congress, but scrutiny of their legality and scope abounds.

In our article Children’s Online Privacy: KOSA and COPPA Updates, we provide information on the latest modifications to KOSA, which now includes a reasonable care standard and gives state attorneys general less enforcement power, among other changes. We also explain how COPPA 2.0 expands existing protections for minors online by covering all minors under the age of 17 and by bringing platforms that are “reasonably likely” to be used by minors within its scope.

We then explain how these proposed bills share infirmities with recent state legislation that has been met with judicial pushback. The California Age-Appropriate Design Code Act (CAADCA) was found likely to violate the First Amendment and is currently awaiting further judicial review, and state social media laws in Florida and Texas were appealed all the way to the Supreme Court on claims that they violate social media companies’ First Amendment rights.

The bottom line is that the courts will have just as much influence in establishing the legal parameters for protecting kids and teens in the online ecosystem as state legislatures, Congress, and federal agencies.

To access Children’s Online Privacy: KOSA and COPPA Updates, click here.

To access the compendium 2023 U.S. Advertising and Privacy Trends and 2024 Forecast: Focus on Kids and Teens, click here.

By Sheila Millar and Tracy Marshall

During 2023, legislative, congressional, and executive actions aimed at protecting children and teens online took center stage. Such actions included: legislative attempts to raise the age of a “child” at both the federal and state levels for advertising and privacy purposes; bans on behavioral advertising targeting minors; efforts to restrict access to social media by minors; First Amendment legal challenges; and the Federal Trade Commission’s (FTC) long-awaited proposed changes to the Children’s Online Privacy Protection Act (COPPA) Rule.

Over the last few years, we have reported on regulatory, legal, and voluntary initiatives aimed at expanding protections for minors online, from calls by advocates for the FTC to up-age COPPA, federal and state restrictions on targeted advertising directed at children and teens, and debates about the scope of COPPA preemption, to friction between First Amendment rights and the public policy objectives around protecting minors from certain types of content online. In 2024, we expect that policy and legal actions related to advertising and privacy will continue to focus on protecting children and teens.

In our article 2023 U.S. Advertising and Privacy Trends and 2024 Forecast: Focus on Kids and Teens, we provide a historical look at children’s privacy, beginning with how different jurisdictions and agencies define the age of a “child.” We dive into federal and state legislation and discuss the plethora of bills surrounding this highly debated topic, which affect online services, social media platforms, and other businesses targeting U.S. audiences. COPPA was the first children’s online privacy law anywhere, and the FTC has brought many enforcement actions under it, but a debate once thought settled, how to safeguard all minors online, continues. This has led to increasing calls to “up-age” the definition of “child” to cover all minors under 16 or 18.

We highlight the federal and state legislative landscape, key proposals in the FTC’s COPPA Notice of Proposed Rulemaking, and current litigation, including preemption disputes, First Amendment decisions, and recent suits targeting online platforms such as Meta, TikTok, and YouTube that allege social media companies harm children and teens, physically and mentally, through addictive algorithms that expose them to inappropriate content. We end with our take on the year to come, which promises to involve ongoing constitutional, legal, and policy questions likely to affect a broad swath of businesses.

To access the compendium 2023 U.S. Advertising and Privacy Trends and 2024 Forecast: Focus on Kids and Teens, click here.

By Peter Craddock

Keller and Heckman has submitted comments to the European Data Protection Board (EDPB) in the context of a public consultation on its draft Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive, on behalf of various organisations that wished to contribute in a meaningful manner without drawing attention to their identity.

What guidelines?

Based on the EDPB’s proposed guidelines, Article 5(3) of the ePrivacy Directive (ePD), commonly called the “cookie rule,” would be extended to cover not only cookies and similar active storage of, and active access to, information on a user’s device (e.g., phone, computer, home router), but also nearly any interaction with a computer or network (such as the automatic transmission of information to a web server when accessing a webpage, or the ephemeral storage that a computer generates for any website or application being loaded, purely in order to be able to run it).

Keller and Heckman Partner Peter Craddock (who heads our EU Data & Technology law practice) has published several leading articles on the topic (see, for instance, the summary “Why every company with digital activities should comment on the EDPB’s new ePrivacy guidelines,” as well as more in-depth articles published on LinkedIn). Several organisations have shared concerns that the EDPB’s proposed guidelines could lead to turbocharged consent banners and could neuter validation techniques that are critical for, among other things, fighting fraud and ensuring compliance.

Requests brought forward in the submissions filed

The relevant organisations’ requests can be summarised as follows:

  • Re-evaluating the EDPB’s authority to adopt those (proposed) guidelines and either (i) restricting them to the material and territorial scope of the General Data Protection Regulation (GDPR) or (ii) transforming them into mere recommendations, ideally also with the support of all competent regulators;
  • Restricting the scope of the notions of “access” and “storage” under Art. 5(3) of the ePrivacy Directive to active storage specifically directed by the entity to whom the obligations under that provision apply, and active access to terminal equipment on the initiative of such entity;
  • Providing guidance on how the consent exemptions would apply, based on the EDPB’s (thus adapted) understanding of the notions of “access” and “storage,” and in particular on the scope of the “service” consent exemption, to provide greater legal certainty, notably as regards (i) “access” or “storage” that is statutorily authorised or required for the activities of the relevant service provider and (ii) activities that underlie a service, from its conception all the way to actual provision of the service to a given user, as well as the reuse of lessons from a given user’s interaction in order to improve the service for a subsequent user; and
  • In relation to consent, confirming that (i) organisations are permitted to bundle a broad range of technologies covered by Art. 5(3) ePD together into one or more simple terms in any consent request form, without this affecting the validity of any consent given, and that (ii) any such bundling of technologies further to an expansion of the scope of Art. 5(3) ePD (compared to the most recent guidance of authorities) does not negate any consent given beforehand.

You can view the submissions in question here. The EDPB should also soon make them available through its public consultation page here.

By Sheila Millar and Antonia Stamenova-Dancheva

The Consumer Product Safety Commission (CPSC or Agency) recently published a Supplemental Notice of Proposed Rulemaking (SNPR) (88 Fed. Reg. 85760 (December 8, 2023)) to revise the existing rule on Certificates of Compliance (CoC or certificates), 16 CFR § 1110 (Rule 1110). The last time CPSC proposed changes to Rule 1110 was in 2013, when the Agency received more than 500 comments responding to its Notice of Proposed Rulemaking (2013 NPR), many voicing specific legal and other objections to the proposed changes. A decade later, CPSC is reviving the CoC rulemaking process. This SNPR proposes a number of significant changes to Rule 1110, including the addition of an electronic filing (eFiling) requirement for all imported CPSC-regulated products or substances, an expanded definition of “importer” to include the importer of record and certain other entities, and new CoC content and recordkeeping requirements. 

Given the staggering number of imported consumer products and the expansive scope of proposed changes, the SNPR warrants close attention by anyone making CPSC-regulated products abroad for distribution in the United States or direct shipments to U.S. consumers. Comments are due February 6, 2024. Read more here.

By Sheila Millar, Tracy Marshall, Peter Craddock, and Liam Fulling

As the federal government continues to wrestle with the complex issue of regulating Artificial Intelligence (AI) in the wake of the release of President Biden’s Executive Order, states have already proposed or enacted AI regulation, and even more will attempt to tackle the issue in 2024. Two recent developments in AI regulation from California and New Hampshire highlight different approaches states are taking in the absence of federal preemption. In the meantime, the European Union is also proceeding with efforts to flesh out a regulatory framework for AI. Inconsistencies and operational challenges are already apparent in reviewing these frameworks. What does this mean for businesses and consumers? Read more here.

By Sheila Millar, Antonia Stamenova-Dancheva, and Anushka N. Rahman

Manufacturers and distributors of household cleaners and similar chemical-based consumer products sold in Canada should be aware of an uptick in recalls by Health Canada for violations of the labeling requirements of the Consumer Chemicals and Containers Regulations, 2001 (CCCR) under the Canada Consumer Product Safety Act (CCPSA). The CCCR establishes detailed product classification criteria and labeling and packaging requirements for potential hazards in certain consumer products.

In just the last three months, Health Canada has issued nine recalls for products from various countries, with goods ranging from stain removers to tire coatings. These include recalls of PURE RESIN Brand Epoxy Resin Kits (China), Quicksilver Corrosion Guard and Quicksilver Light Gray Primer Spray Paint (USA), “Salt Eraser” salt stain remover (also cited for lack of child-resistant packaging) (Canada), Flikrfire personal Firepots (USA), and Bubble Angel Kettle Cleaner (Japan). According to Health Canada’s website, mislabeling constitutes a potential hazard since “the lack of appropriate labelling information could result in unintentional exposure to the products and lead to serious illness or injury.”

It is worth noting that labeling requirements under the CCCR differ from those under the U.S. Federal Hazardous Substances Act (FHSA), so compliance with one set of requirements does not guarantee compliance with the other. While a product properly labeled under the FHSA should not lead to mishandling or unintentional exposure to hazardous substances, these recent actions underscore the importance of understanding the differing regulatory requirements and taking steps to confirm that products introduced into Canada are labeled in accordance with the CCCR.

By Peter Craddock

So much for tackling consent fatigue. The short version: If unchanged, the new EDPB guidelines on what is known as the “cookie” rule would extend that rule to cover nearly every communication over the Internet and any use of software on a computer. Your business is probably more impacted than you might think, and it is important for you to take part in the public consultation that runs until 28 December 2023 – so reach out rapidly.

The ePrivacy Directive (2002/58/EC) is a misunderstood piece of legislation. While the public often links cookie banners to the General Data Protection Regulation (GDPR) of 2016, they actually stem from Article 5(3) of the ePrivacy Directive. This provision, as strengthened in 2009, requires EU Member States to ensure that “the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user” is prohibited unless (i) the subscriber or user has consented to this storage/access, (ii) that storage/access is “for the sole purpose of carrying out the transmission of a communication over an electronic communications network” or (iii) that storage/access is “strictly necessary” for the provision of an “information society service [read: digital service] explicitly requested by the subscriber or user”.

In practice, therefore: no cookies or similar files can be placed on your device (for instance a computer or a smartphone) or accessed on your device if none of those three conditions is met (consent – which must meet the GDPR consent requirements; “strictly necessary” for a digital service; or for the sole purpose of transmission of an electronic communication).
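To make those three conditions concrete, the short sketch below shows how a site operator might apply them when deciding which cookies to set. This is our own illustration, not something drawn from the Directive or the EDPB’s guidance; the Flask framework and the route and cookie names are assumptions chosen purely for readability.

    # Minimal sketch (our illustration only): applying the three Article 5(3)
    # conditions when setting cookies. Flask and all names are assumptions.
    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/")
    def homepage():
        resp = make_response("<html><body>Welcome</body></html>")

        # "Strictly necessary" for the service the user requested: a session
        # cookie that keeps the user logged in can be set without consent.
        resp.set_cookie("session_id", "abc123", httponly=True, secure=True, samesite="Lax")

        # Anything else (here, a hypothetical analytics cookie) requires prior,
        # GDPR-grade consent, recorded in an equally hypothetical consent cookie.
        if request.cookies.get("analytics_consent") == "granted":
            resp.set_cookie("analytics_id", "xyz789", secure=True, samesite="Lax")

        return resp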

On 16 November 2023, the European Data Protection Board (EDPB) published its new Guidelines 2/2023 on Technical Scope of Art. 5(3) of ePrivacy Directive. The legal value of those guidelines is yet to be determined, given that the EDPB does not comprise all national authorities with the power to enforce the local implementation of that ePrivacy provision, and there are arguments that the EDPB did not have the authority to adopt “guidelines” on the topic (see a more detailed analysis here).

In these guidelines, the EDPB sets out a new interpretation of that provision: Article 5(3) of the ePrivacy Directive should apply not only to cookies and files stored on a device and (actively) accessed from a device but also to (i) any information that the device transmits automatically (such as the URL of a webpage being accessed or the public IP address of the Internet connection, sent automatically to make a connection possible with a website or application) as well as (ii) temporary information generated on that device and information stored ephemerally (i.e., not held on persistent storage, such as a hard drive). The consequence? Every bit of information that relates to a device, even indirectly (such as an IP address, tracking pixels, URLs visited, or Internet-of-Things device reporting), is covered, thereby regulating even the simplest digital activities, such as loading contextual advertising or running JavaScript to make a website more dynamic.
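To see why this matters, consider how much information a device sends automatically before any cookie is set. The sketch below (again our own illustration, using Flask; the fields shown are standard HTTP and IP elements) simply logs what any web server receives with every incoming request, which is exactly the kind of passively received data the EDPB now treats as potentially “accessed” under Article 5(3).

    # Sketch only: information a server receives automatically with every
    # HTTP request, without setting or reading any cookie.
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/", defaults={"path": ""})
    @app.route("/<path:path>")
    def any_page(path):
        # None of this is actively requested by the site operator; it arrives
        # because the HTTP and IP protocols require it to be transmitted.
        print("Client IP:     ", request.remote_addr)               # public IP of the connection
        print("Requested URL: ", request.url)                       # page being accessed
        print("Referer:       ", request.headers.get("Referer"))    # page the visitor came from
        print("User-Agent:    ", request.headers.get("User-Agent")) # browser/device details
        return "ok"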

Is automatic receipt of information “access” to the terminal equipment sending that information?

The contrast with the German regulators’ position, in particular, reveals a fundamentally new approach to what constitutes “access”.

In 2021, the German regulators collectively stated the following regarding the notion of “access” to a terminal equipment:

“An access requires a targeted transmission of browser information that is not initiated by the end user. If only information, such as browser or header information, is processed that is transmitted inevitably or due to (browser) settings of the end device when calling up a telemedia service, this is not to be considered “access to information already stored in the end device.”” [machine translation]

One German regulator confirmed this again explicitly in 2022.

By way of comparison, in the new guidelines the EDPB states that, in its view, “access” covers both (i) a situation where “the accessing entity […] proactively send[s] specific instructions to the terminal equipment in order to receive back the targeted information” and (ii) a case where an “entity may have used protocols that imply the proactive sending of information by the terminal equipment which may be processed by the receiving entity”. This second point may seem innocent and harmless, but the EDPB leans on it heavily to conclude that the fact that information is sent automatically following a communication protocol (e.g., an IP address) shows that there is an “entity instructing the sending of information”.

Beyond being linguistically problematic (“access” has an active connotation), this position makes any communication over the Internet “access” by the recipient, because Internet communications all require the transmission of certain information as defined by the relevant communication protocol.

To take an illustration perhaps not anticipated by the EDPB, e-mails could, following that logic, involve passive “access”. After all, the recipient is “accessing” the e-mail content and the sender’s e-mail headers, such as the name that he or she configured for that e-mail account, i.e., information that was stored, even temporarily, on the sender’s device during the drafting and sending of the e-mail and that is sent automatically because the developers of the relevant communication protocols (SMTP for sending, with IMAP and POP for retrieval) decided that this information would be transmitted with every e-mail. As a result, any further use of the content of e-mails would be subject to Article 5(3) of the ePrivacy Directive, following the EDPB’s approach – which in turn means that the recipient would have to prove (i) the consent of the sender to the use of the e-mail or (ii) the fact that such use of that e-mail is strictly necessary to the provision of a digital service that the sender explicitly requested. In practice, e-mail retention would become illegal overnight.
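To make the point tangible, the snippet below (an invented example of our own; the addresses, client name, and header values are fictitious) uses Python’s standard e-mail library to show the headers that a sender’s mail client and the transport protocols attach automatically, i.e., the metadata a recipient inevitably receives along with every message.

    # Sketch: metadata that travels automatically with an e-mail (invented example).
    from email.message import EmailMessage

    msg = EmailMessage()
    # Headers that a typical mail client fills in automatically when sending:
    msg["From"] = "Alice Example <alice@example.com>"
    msg["To"] = "Bob Example <bob@example.com>"
    msg["Date"] = "Mon, 18 Dec 2023 10:15:00 +0100"
    msg["Message-ID"] = "<20231218101500.abc123@mail.example.com>"
    msg["User-Agent"] = "ExampleMailClient/1.0"
    msg["Subject"] = "Hello"
    msg.set_content("Hi Bob, see you tomorrow.")

    # The recipient obtains all of these headers with the message whether or not
    # they asked for them; under the EDPB's logic, further use could require consent.
    for name, value in msg.items():
        print(f"{name}: {value}")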

Why did the EDPB create this idea that the designer of a protocol is giving “instructions” to a device that mean that information will be “accessed” on the device? Likely because this was the only way for the EDPB to extend the scope of Article 5(3) of the ePrivacy Directive to cover IP addresses, which are frequently used in support of delivery of content and ads (whether personalised or not) and in support of analytics (for instance, monitoring usage of a website or app).

However, the end should not justify the means, and the means here create a significant broadening of the ePrivacy Directive’s scope in a manner that does not appear to have ever been the intent of the EU legislator.

Is the transmission of ephemerally stored information really a form of access to information “already stored”?

The EDPB’s position regarding information generated on a device and not held in persistent storage (such as a hard drive) likely stems from the fact that the ePrivacy Regulation (if ever adopted) would provide that it applies not only to the use of the storage capabilities of terminal equipment but also to the use of the processing capabilities of terminal equipment.

The technologies that the EDPB lists as being covered by Article 5(3) of the ePrivacy Directive (RAM and CPU cache) are inherently at best ephemeral “storage”: information is temporarily “stored” in them purely because that information is actively being used by the device (RAM) or because it is frequently used and the computer decides on its own to store that information in a place that is even more rapidly available (CPU cache).

The EDPB’s approach is problematic, first because the actual legal text talks about “the gaining of access to information already stored” (which introduces a notion of time – an instantaneous calculation could hardly be seen as “already” stored) and only uses examples of actual storage (such as cookies), without providing any illustrations that are more ephemeral.

Second, because – due to the central role of RAM and CPU cache in the way computers work – it means that no interaction with a computer is permitted unless you can show that there is (i) consent, (ii) strict necessity for provision of an explicitly requested digital service or (iii) necessity for the sole purpose of transmission of a communication. While many such interactions will fall within the “service” scenario, most companies will probably want to prepare an “information processing notice” (along the same lines as a cookie notice) for their software and websites, just to be on the safe side, in case a complainant or regulator challenges the applicability of the “service” exception. In other words, what may have been part of an initiative to combat consent fatigue may end up worsening it dramatically.

Why then should your company take part in the public consultation?

Irrespective of your sector and activities, if you have any digital activities, they will be impacted by these guidelines. Even if you only have one website, there may be elements on that website that could be challenged (e.g., an ad banner, even if it is contextual advertising, or a “sign up for a newsletter” popup that appears to an individual who has not yet seen it on his or her device). If you use IP addresses for anti-fraud checks, or URL parameters to track how many people read your newsletters, those activities will be covered. Even the developer of the “phone” core application on your smartphone needs to pay attention, as the phone number of the recipient of a call could be considered to be “stored” on the sender’s phone (temporarily) and “accessed” by the recipient as a result of the communication protocol – so any further use by the recipient beyond the communication (e.g., lists of past calls) could be regulated under Article 5(3) of the ePrivacy Directive following the EDPB’s approach.
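By way of illustration, even the newsletter scenario just mentioned would be swept in: a tracking parameter embedded in a link is transmitted back automatically when the reader clicks it. The sketch below is our own invention (the hypothetical URL, parameter names, and Flask route are all assumptions) and shows the kind of simple read counting that the guidelines would bring within Article 5(3).

    # Sketch (invented names): counting newsletter clicks via a URL parameter.
    # A link such as https://news.example.com/read?nl=2024-01&rid=12345 would,
    # under the EDPB's reading, involve "access" to the reader's device.
    from collections import Counter
    from flask import Flask, request, redirect

    app = Flask(__name__)
    reads_per_edition = Counter()  # in-memory tally, purely for illustration

    @app.route("/read")
    def track_read():
        edition = request.args.get("nl", "unknown")   # which newsletter edition
        reads_per_edition[edition] += 1               # how many readers clicked
        return redirect("https://www.example.com/newsletter")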

That broadening of the scope can severely impact your company’s ability to deploy or use digital services and tools if you cannot rely on (i) GDPR-compliant consent or (ii) a strict necessity to provide a digital service explicitly requested by the user.

This may seem to be a significant overreach and twisting of the words of the ePrivacy Directive – and to limit the risk for your business, it may be worthwhile responding to the public consultation (which runs until 28 December 2023).

Past consultations have shown that the EDPB usually sticks to its position. One notable exception was its adoption of a slightly more pragmatic approach in (only some parts of) its recommendations on “supplementary measures” for data transfers; more often, the changes lead to a slight hardening of the EDPB’s position (see, e.g., the recent administrative fines guidelines, which were barely modified, and where the few modifications made proposed even higher fines).

It therefore appears unlikely that the EDPB will suddenly restrict the scope of these guidelines or clarify why the legislator might have intended for both ephemeral processing and passive “access” to be covered, but it may see fit to at least clarify its legal reasoning – which may come in handy in the event of (likely) litigation over the enforcement of the positions it sets out.

From that perspective, it may remain useful to submit comments on the guidelines.

Not yet convinced, or would you like a more in-depth analysis of the new guidelines? Read our in-depth review of the content of the guidelines, containing additional points of criticism that may be useful to your assessment of the impact of the guidelines on your business.

And would you like to send comments to the EDPB, but are concerned that your name as a respondent will be made public? Get in touch – we have helped clients submit responses confidentially, by serving as the intermediary and signatory.

By Sheila Millar and Antonia Stamenova-Dancheva

On November 13, 2023, the Federal Trade Commission (FTC or Commission) sent warning letters to the American Beverage Association (AmeriBev), The Canadian Sugar Institute, and a dozen dietitians and influencers promoting the safety of the artificial sweetener aspartame or the consumption of sugar-containing products on TikTok and Instagram. The letters allege that the dietitians and influencers did not adequately disclose that the associations paid for the endorsements. The FTC’s Guides Concerning the Use of Endorsements and Testimonials in Advertising (the Endorsement Guides) state that “paid endorsements should clearly and conspicuously disclose any unexpected material connections to ensure that consumers have the information they need to make informed purchasing decisions.” The warning letters put the trade groups and influencers on notice that further violations of the Endorsement Guides could result in fines of up to $50,120 per violation. Read more here.

By Sheila Millar

Children’s and teens’ online privacy and safety – particularly their mental health – continue to be areas of intense scrutiny for lawmakers, regulators, and enforcers. Last May, the Biden administration announced the creation of a new task force focused on the safety, privacy, and wellbeing of children online, linked to an Advisory on Social Media and Youth Mental Health issued by the U.S. Surgeon General the same day. The task force is slated to produce voluntary guidance, policy recommendations, and a toolkit on safety, health, and privacy-by-design for industry developing digital products and services by Spring 2024. As part of this initiative, the National Telecommunications and Information Administration (NTIA) of the Department of Commerce (DOC) published a Request for Comments (RFC) in the Federal Register on October 10, 2023. The RFC seeks public feedback on the best ways to protect the mental health, safety, and privacy of minors online, now characterized as an urgent public health issue by the Surgeon General.

But there’s more. Proving that both red states and blue states can agree on some issues, a bipartisan group of state attorneys general (AGs) filed a federal lawsuit against social media giant Meta Platforms, Inc. (Meta) and other Meta entities on October 24, 2023, and nine AGs filed complaints in their states. The complaints allege violations of the Children’s Online Privacy Protection Act (COPPA) and other laws based on allegedly harmful design features and practices by the Meta entities that, the complaints claim, contribute to body dysmorphia, sadness, suicidal thoughts, and other mental health harms. Read the full article here.