By Sheila A. Millar and Jean-Cyril Walker

At the request of multiple stakeholders, the Federal Trade Commission (FTC) announced today that it is giving the public additional time to weigh in on proposed changes to its Guides for the Use of Environmental Marketing Claims (Green Guides). The FTC has extended the public comment period by 60 days; it now ends on April 24, 2023.

The extension is good news for organizations that want to raise points about the changes the FTC is contemplating.

Information about how to submit comments can be found in the Federal Register notice.

On January 27, 2023, CalRecycle, the California agency that oversees the state’s waste management, recycling, and waste reduction programs, published a Notice of Proposed Rulemaking regarding amendments to the Recycling and Disposal Reporting System (RDRS). The proposed regulations are intended to update the state’s RDRS to better comply with various state recycling and disposal laws and objectives, including California law SB 343, which restricts recyclability claims by narrowly defining what “consumer goods” and packaging are considered “recyclable” in California.

According to CalRecycle, the proposed amendments would revise RDRS to collect information on what material types and forms are actively recovered by facilities and how that material was collected, and to classify exports of mixed plastic waste as disposal if the mixed waste stream excludes the “more recyclable plastics.” Stakeholders are asked to weigh in on the proposed changes. Read more here.

By Sheila A. Millar and Tracy P. Marshall

When the California legislature passed the California Age-Appropriate Design Code Act (CAADCA or Act) AB 2273 in September of this year, it generated considerable controversy. Companies, trade associations, and even some non-governmental organizations questioned whether the law’s broad reach was not only counterproductive and likely to invade consumer privacy, but also preempted by federal law and unconstitutional. The legal questions are about to be tested in a court case that could have far-reaching repercussions for online platforms and legislators alike. On December 14, 2022, NetChoice, an umbrella organization of tech companies including Amazon, Google, Meta, TikTok, and Twitter, announced that it had filed a complaint against California Attorney General Bonta, alleging that the CAADCA violates the Constitution and is preempted by federal law, including the Children’s Online Privacy Protection Act (COPPA).

For more details, click here.

By Sheila A. Millar and Jean-Cyril Walker

Update: The formal Federal Register notice was published on December 20, 2022, and comments are due on February 21, 2023.

On December 14, 2022, at an open meeting of the Federal Trade Commission (“FTC” or “Commission”), FTC Commissioners voted unanimously to publish a Notice in the Federal Register announcing a Request for Public Comments on potential amendments to the Commission’s Guides for the Use of Environmental Marketing Claims (“Green Guides” or “Guides”). The FTC solicits comments on the ongoing need for the Guides and on specific claims addressed in the Guides, including “recyclable,” “recycled content,” “degradable,” “compostable,” and more. It also asks if it should initiate a rulemaking process and address claims it declined to consider during the last review, such as “organic” and “sustainable.” Importantly, given the growth in some state laws that purport to restrict claims, the FTC asks for input on whether the Guides conflict with federal or state laws. This proceeding is expected to garner significant input, with comments due 60 days after publication in the Federal Register.

For more details, click here.  

By Sheila A. Millar and Tracy P. Marshall

Deceptive reviews and endorsements have come under increasing scrutiny by the Federal Trade Commission (FTC or Commission). In the last few years, the Commission has brought myriad complaints against companies for engaging in such practices. Last year, the FTC resurrected its long-dormant Penalty Offense Authority, warning more than 700 companies in a Notice of Penalty Offenses that they could be liable for significant civil penalties if they post misleading reviews or endorsements. Last month, for the second time in just a few months, the FTC announced an Advance Notice of Proposed Rulemaking (ANPR) under Section 18 of the Federal Trade Commission Act with a view to issuing a possible Trade Regulation Rule (TRR), asserting that the Commission’s current enforcement tools are insufficient to deter bad actors. The October 20, 2022 announcement, as with this summer’s announcement of an ANPR on commercial surveillance and data security, is the first step in a formal rulemaking process. The FTC seeks public comment on whether a mandatory rule is needed to address the growing problem of fake or hijacked reviews, suppression of negative reviews, or reviews that lack proper disclosures of a connection with the advertiser.

The FTC argues in the October ANPR that its “current remedial authority is limited. Monetary relief is no longer available under Section 13(b), disgorgement is not available under Section 19(b), 15 U.S.C. 57b(b), and, while the Commission has deployed new tools to combat this problem, in many cases, it remains difficult to obtain monetary relief. Under these circumstances, the availability of a civil penalty remedy may provide a potent deterrent.” The FTC goes on to stress that the Commission’s Guides Concerning the Use of Endorsements and Testimonials in Advertising (Endorsement Guides) provide useful recommendations to businesses on how to avoid making deceptive marketing claims, but they are not enforceable. The FTC asserts that the deterrence effect of a rule with the possibility of civil penalties “would benefit consumers, help level the playing field, and not burden legitimate marketers.”

The vote approving publication of the ANPR in the Federal Register was 3-1. Commissioner Christine S. Wilson, voting against issuance, cited the cost of the rulemaking and the sufficiency of the FTC’s existing tools for holding businesses accountable for deceptive reviews and endorsements. In a statement, Wilson said that “the Commission already has a multi-pronged strategy in place to combat this issue,” which includes the Endorsement Guides to educate businesses regarding their obligations and a companion business guidance piece. Wilson also referenced the Commission’s request for comments on potential updates and revisions to the Endorsement Guides issued earlier this year, and the Notice of Penalty Offenses issued in October 2021, which may enable the Commission to obtain civil penalties from marketers that use fake or deceptive endorsements or reviews.

The FTC invites stakeholders to weigh in on any aspect of the issues covered in the ANPR but is particularly interested in feedback on the prevalence of fake or deceptive reviews and endorsements, the costs and benefits of a rule that would address them, and alternatives to such a rulemaking, such as additional consumer and business education. The deadline for comments is 60 days after the ANPR’s publication in the Federal Register (which has not occurred as of the time of this writing). While the FTC’s commercial surveillance ANPR is expected to draw significant opposition from industry, false reviews do create competitive harms to businesses, and this proceeding is a chance for businesses and consumers to address those harms.

By Sheila A. Millar

On October 19, 2022, the Consumer Product Safety Commission (CPSC or Commission) approved a Final Rule imposing mandatory safety requirements for clothing storage units (CSUs) that will, according to the CPSC, “significantly change the way clothing storage units are tested and labeled” to address furniture tip-over hazards. The rulemaking has been in the works since 2017, after the CPSC reviewed incident data for furniture tip-overs and determined that CSUs were the primary furniture category involved in fatal and non-fatal incidents caused by tip-overs.

The Rule will supersede voluntary standards, including ASTM F2057-19, Standard Consumer Safety Specification for Clothing Storage Units.

Application

The Rule becomes effective 180 days after publication in the Federal Register (which has not occurred as of this writing) and applies to CSUs manufactured after its effective date. The Rule defines a “CSU” as:

  • “a consumer product that is a freestanding furniture item,
  • with drawer(s) and/or door(s), that may be reasonably expected to be used for storing clothing,
  • that is designed to be configured to be greater than or equal to 27 inches in height (including units with adjustable heights that reach 27 inches),
  • has a mass greater than or equal to 57 pounds with all extendable elements filled with at least 8.5 pounds/cubic foot times their functional volume (cubic feet),
  • has a total functional volume of the closed storage greater than 1.3 cubic feet, and
  • has a total functional volume of the closed storage greater than the sum of the total functional volume of the open storage and the total volume of the open space.”

The Rule excludes lightweight units weighing under 57 lbs. when filled with clothing and products such as shelving units or office furniture that do not meet the definition of a CSU.
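The definitional thresholds quoted above lend themselves to a simple screening check. The sketch below uses only the numbers in the Rule’s definition (27 inches, 57 pounds, 8.5 lb/ft³ fill density, 1.3 ft³ closed storage); the function and variable names are ours for illustration, not regulatory terms, and the sketch is no substitute for the Rule’s full test methods.

```python
# Illustrative screening check against the Rule's definitional thresholds
# for a "clothing storage unit" (CSU). Thresholds come from the Rule's
# definition; names and units here are illustrative only.

FILL_DENSITY_LB_PER_FT3 = 8.5   # clothing-representative fill density
MIN_HEIGHT_IN = 27              # minimum configurable height
MIN_FILLED_MASS_LB = 57         # minimum mass with extendable elements filled
MIN_CLOSED_VOLUME_FT3 = 1.3     # minimum closed-storage functional volume

def loaded_mass(unit_mass_lb: float, extendable_volume_ft3: float) -> float:
    """Unit mass with all extendable elements filled at 8.5 lb/ft^3
    times their functional volume."""
    return unit_mass_lb + FILL_DENSITY_LB_PER_FT3 * extendable_volume_ft3

def meets_csu_thresholds(height_in: float, unit_mass_lb: float,
                         extendable_volume_ft3: float,
                         closed_volume_ft3: float, open_volume_ft3: float,
                         open_space_ft3: float) -> bool:
    """True if every numeric threshold in the definition is satisfied."""
    return (height_in >= MIN_HEIGHT_IN
            and loaded_mass(unit_mass_lb, extendable_volume_ft3) >= MIN_FILLED_MASS_LB
            and closed_volume_ft3 > MIN_CLOSED_VOLUME_FT3
            and closed_volume_ft3 > open_volume_ft3 + open_space_ft3)
```

For example, a 40-pound dresser with 2 ft³ of extendable (drawer) functional volume reaches 40 + 8.5 × 2 = 57 lbs when filled, so it meets the mass threshold exactly; a unit under 27 inches in every configuration falls outside the definition regardless of its other dimensions.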

New Stability Requirements

Stability testing under the Rule must now “reflect real-world factors, like multiple open drawers, drawers containing clothing-representative loads, angling CSUs to replicate the effects of placement on the carpet and forces a child exerts while climbing or pulling on a CSU, all of which are shown to occur during CSU tip-overs and contribute to their instability.” The Rule also includes testing methods for interlocks that prevent all drawers from being opened at the same time.

Information Marking Requirements

The new standard mandates that CSUs be marked and labeled with safety and identification information, and that the information be accompanied by a tag providing performance and technical data about the product’s stability.

*   *  *

The Commission voted 3-to-1 to approve the standard. Commissioner Peter Feldman, voting no, issued a statement in which he expressed concern that a Supplemental Notice of Proposed Rulemaking (SNPR) was needed to minimize the chance of a successful legal challenge because the draft Final Rule differed from the version issued for public comment. He also suggested that issuing a Supplemental Notice would have allowed the House to consider the Stop Tip-Overs of Unstable, Risky Dressers on Youth (STURDY) Act, which passed the Senate. It remains to be seen if furniture industry members will seek to challenge the Rule or if Congress will pass the STURDY Act.

By Peter Craddock

The Internet of Things (IoT) segment has grown, and with it have come many examples of vulnerable products, from babycams whose feeds could be viewed by strangers online to hackable implantable cardiac devices. There are also infamous examples of botnets (i.e., clusters of hacked devices) featuring millions of IoT devices with one common trait: weak security.

The U.S. has long had both laws and standards designed to address data security. In Europe, the General Data Protection Regulation (GDPR) imposes a general obligation to secure personal data, but recent developments show a greater focus on the security of information in general, not just personal data.

In 2020 in the United Kingdom, the British government announced that it would work on legislation to require compliance with security requirements or specific standards for consumer connected products. Among the touted requirements, for instance, is a prohibition on universal default passwords. This requirement, in turn, would trigger an obligation to ensure that all passwords within a connected device are unique and strong, so that cracking a single default password does not grant hackers easy access to millions of products. The resulting Product Security and Telecommunications Infrastructure Bill, currently being considered by the House of Lords, will give the UK Secretary of State authority to impose specific security requirements for “internet-connectable” and “network-connectable” products or require compliance with a given standard.

In the European Union, the European Commission published on September 15, 2022 a proposal for a “Cyber Resilience Act,” an EU Regulation “on horizontal cybersecurity requirements for products with digital elements.” This Regulation would require any manufacturer of a “product with digital elements” (i.e., “any software or hardware product and its remote data processing solutions”) to meet minimum cybersecurity requirements to be able to place that product on the EU market.

The concept of a “product with digital elements” does not appear to be limited to hardware + software combinations, as a number of categories of products listed in an annex to the draft Cyber Resilience Act are today pure “software” products, such as a wide range of cybersecurity tools. Thus, the scope of the Cyber Resilience Act is not limited only to IoT products.

The draft Cyber Resilience Act calls in effect for security by design by requiring manufacturers to design, develop, and produce products in accordance with cybersecurity requirements. Notably, manufacturers will be required to undertake an “assessment of the cybersecurity risks associated with [the] product and take the outcome of that assessment into account during the planning, design, development, production, delivery and maintenance phases […] with a view to minimising cybersecurity risks, preventing security incidents and minimising the impacts of such incidents.” This echoes provisions of the draft “NIS 2” Directive (a proposal for a Directive “on measures for a high common level of cybersecurity across the Union”) as well as the principle of “data protection by design and by default” found in the GDPR.

Under the provisions of the draft Cyber Resilience Act, manufacturers will have reporting obligations in relation to actively exploited vulnerabilities on the one hand and security incidents on the other. They will be required to inform ENISA, the EU Cybersecurity Agency, of (i) “any actively exploited vulnerability” contained in the product and (separately) (ii) “any incident having [an] impact on the security” of the product, in each case “within 24 hours of becoming aware of it.” In addition, manufacturers will have to inform users of the incident “without undue delay and after becoming aware” of it. Beyond information regarding the incident, they would also have to inform users, “where necessary, about corrective measures that the user can deploy to mitigate the impact of the incident.”

Moreover, the draft Cyber Resilience Act requires manufacturers to carry out conformity assessment procedures, draw up technical documentation, and ensure that the product bears a relevant CE marking. The interrelationship between this document and existing conformity assessment procedures for products must be carefully evaluated.

The draft Cyber Resilience Act does not place the regulatory burden only on manufacturers. Importers and distributors involved in placing products on the EU market are subject to specific obligations as well, notably in relation to documentation and CE markings. An importer or distributor will moreover be subject to the full obligations of a manufacturer if, for example, the product is marketed under the importer/distributor’s name or trademark, or if the importer/distributor carries out “a substantial modification” of the product already placed on the market.

The security requirements themselves appear to be future-proof and technology-neutral; for instance, products must be “delivered with a secure by default configuration, including the possibility to reset the product to its original state” and “designed, developed and produced to limit attack surfaces, including external interfaces.” In many ways, these requirements appear to reflect the common principles underlying information security best practices. Products belonging to a “critical” category (this includes a wide range of categories, such as identity management systems, password managers, malware detection software, microcontrollers, operating systems, routers, smart meters, etc.) are then subject to stricter rules, in particular a specific conformity assessment procedure.

The draft Cyber Resilience Act includes links to the draft AI Regulation as well (also under discussion at the Commission). If a product is classified as a “high-risk” AI system under the draft AI Regulation, compliance with the Cyber Resilience Act requirements will automatically be considered as compliance with the cybersecurity requirements under the AI Regulation.

As with other examples of recent legislation (from the GDPR to the Digital Markets Act and Digital Services Act), the draft Cyber Resilience Act includes tough penalties to ensure compliance, as non-compliance can lead to recall or withdrawal of the product from the market or another corrective action and can also lead to fines of up to 15 million EUR or 2.5% of the total worldwide turnover, whichever is higher. These fines are not the maximum risk for companies in case of non-compliance, though, as the draft Cyber Resilience Act explicitly states that it is “without prejudice to [the GDPR]” – which could lead to important questions of liability if a particular action or behaviour constitutes an infringement upon both sets of rules.
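The “whichever is higher” fine mechanics are worth making concrete. The figures below (15 million EUR, 2.5% of total worldwide turnover) come from the draft text as described above; the function itself is a sketch of ours, not anything in the Regulation.

```python
# Illustrative calculation of the draft Cyber Resilience Act's maximum fine:
# up to 15 million EUR or 2.5% of total worldwide turnover, whichever is higher.

CRA_FIXED_CAP_EUR = 15_000_000
CRA_TURNOVER_RATE = 0.025  # 2.5% of total worldwide annual turnover

def cra_max_fine(worldwide_turnover_eur: float) -> float:
    """Upper bound of the fine under the draft CRA for a given turnover."""
    return max(CRA_FIXED_CAP_EUR, CRA_TURNOVER_RATE * worldwide_turnover_eur)

# For a company with 400M EUR turnover, 2.5% is only 10M, so the 15M fixed
# cap governs; at 1B EUR turnover, the turnover-based cap of 25M applies.
```

The practical consequence is that the fixed 15 million EUR cap binds only for companies with worldwide turnover below 600 million EUR; above that, exposure scales with turnover.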

Now is the time to ensure that your information security practices are up to speed and that all levels of your organization are properly involved in devising, rolling out, and maintaining a strong cybersecurity strategy that takes into account all applicable legislation. Companies operating globally will, of course, also need to follow the relevant national policy and guidance as it develops.

For any questions on the Cyber Resilience Act or on how to build such a cybersecurity strategy, reach out to the authors or your usual contact at Keller and Heckman LLP.

By Sheila A. Millar and Tracy P. Marshall

At a press conference on August 11, 2022, the Federal Trade Commission (FTC or Commission) announced an Advance Notice of Proposed Rulemaking (ANPR), which was published, along with a fact sheet, to explore potential new rules governing what the FTC characterizes as prevalent “commercial surveillance” and “lax data security practices.” The FTC issued the ANPR pursuant to its Section 18 authority under the Magnuson-Moss Act, which authorizes the Commission to promulgate, modify, and repeal rules that define with specificity unfair or deceptive acts or practices within the meaning of Section 5(a)(1) of the FTC Act. This broad and complex ANPR was published in the Federal Register on August 22 (87 Fed. Reg. 51273), and comments are due October 21, 2022. The FTC will host a public forum on September 8, 2022, featuring a structured panel discussion and an opportunity for stakeholders to share their views on the ANPR, subject to a two-minute time limit.

What’s Behind the ANPR?

FTC Chair Lina Khan said in a statement that “firms now collect personal data on individuals on a massive scale and in a stunning array of contexts, resulting in an economy that, as one scholar put it, ‘represents probably the most highly surveilled environment in the history of humanity’. This explosion in data collection and retention, meanwhile, has heightened the risks and costs of breaches—with Americans paying the price.” The FTC offers several reasons to justify the proposal. First, the FTC argues that its inability to fine companies for egregious first-time offenses under its Section 5 authority may “insufficiently deter future law violations.” Second, while the FTC can enjoin conduct that violates Section 5, such relief may be inadequate in the context of alleged commercial surveillance and lax data security practices. Third, the FTC argues that even in instances in which it can obtain monetary relief for violations of Section 5, such relief may be difficult to obtain if certain practices do not cause direct financial injury or the harm cannot be quantified. Lastly, the FTC claims that a rule governing commercial surveillance and data security could provide clarity and predictability about the FTC Act’s application to existing and emergent commercial surveillance and data security practices. The vast, unfocused scope of the ANPR should concern any business engaged in data collection from consumers, as virtually all data collection activities could be implicated.

Key Terms

The FTC proposes several specific definitions for purposes of the rule:

  • “Commercial surveillance” is “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information. These data include both information that consumers actively provide—say, when they affirmatively register for a service or make a purchase—as well as personal identifiers and other information that companies collect, for example, when a consumer casually browses the web or opens an app.”
  • “Data security” is described as “breach risk mitigation, data management and retention, data minimization, and breach notification and disclosure practices.”
  • “Consumer” “includes businesses and workers, not just individuals who buy or exchange data for retail goods and services.”

Questions

The FTC has posed a variety of questions – 95 of them – that touch on both advertising and privacy issues. The FTC asks for public feedback generally on “(a) the nature and prevalence of harmful commercial surveillance practices, (b) the balance of costs and countervailing benefits of such practices for consumers and competition, and (c) proposals for protecting consumers from harmful and prevalent commercial surveillance practices.” More specifically, the FTC solicits feedback on subjects with headings ranging from “to what extent do commercial surveillance practices or lax security measures harm consumers?” to “automated decision-making systems.” Of note are a variety of questions pertaining to children and teens. Section 18(h) of the FTC Act, however, restricted the Commission’s ability to act in the then-pending, infamous “kid-vid” proceeding, in which the FTC proposed to ban advertising to younger children and which earned the FTC the “national nanny” moniker. Section 18 also restricts the Commission’s ability to issue rules in “any substantially similar proceeding on the basis of a determination by the Commission that such advertising constitutes an unfair act or practice in or affecting commerce.” This is expected to be a point raised in comments.

Commissioners Phillips and Wilson Dissent

The vote to approve publication of the ANPR was 3-2. Commissioners Noah Phillips and Christine Wilson, voting no, each issued dissenting comments. In a strongly worded statement, Commissioner Phillips, who has since announced that he is leaving the FTC, questioned whether the FTC was overstepping its authority and recasting itself “as a legislature, with virtually limitless rulemaking authority where personal data are concerned.” In addition, Phillips claimed the ANPR was too broad and “provides no notice whatsoever of the scope and parameters of what rule or rules might follow; thereby, undermining the public input and congressional notification processes. It is the wrong approach to rulemaking for privacy and data security.” In her statement, Commissioner Wilson expressed concern that the ANPR could undermine efforts to pass a federal privacy law. She also asserted that elements of the ANPR constituted agency overreach and wandered “far afield of areas for which we have clear evidence of a widespread pattern of unfair or deceptive practices.”

The finalized agenda for the FTC’s September 8 public forum is here. This proceeding, as well as the FTC’s October 19 event, “Protecting Kids from Stealth Advertising in Digital Media,” will no doubt generate lively debate.

By Sheila A. Millar

On August 24, 2022, the Federal Trade Commission (FTC or Commission) submitted a report to the Congressional Committees on Appropriations detailing current resources and personnel dedicated to COPPA enforcement, the number of COPPA violation investigations over the past five years, and the types of relief obtained in completed investigations. The report was submitted in response to a request by Congress under the Consolidated Appropriations Act of 2022.

The FTC report affirms that protecting children’s privacy remains a Commission priority. The COPPA program is served by 9-11 full-time, dedicated staff members; staff from other divisions, such as the Bureau of Consumer Protection’s Division of Privacy and Identity Protection, also work on COPPA issues. Between May 1, 2017 and May 1, 2022, the FTC opened 80 investigations of potential COPPA violations. Over the last five years, the Commission has expanded its remedies, such as requiring WW International/Kurbo to delete its proprietary algorithms, or mandating that Google/YouTube re-review apps on its ad exchange to identify and ban additional child-directed apps and to track which apps and websites have been banned or removed from its platform. Requiring companies to implement a comprehensive privacy program, often subject to periodic, independent monitoring, is an increasingly frequent element in enforcement agreements.

The Commission has also imposed larger fines for COPPA violations. The FTC reports that “in six of the 10 cases alleging violations of COPPA, the Commission obtained a civil penalty of at least $1.5 million,” including a $170 million fine in the Google/YouTube matter, “one of the largest civil penalties ever obtained, worldwide, for a privacy violation.”

The report ends with a plea for additional funding from Congress: “The Commission makes every effort to use its resources efficiently: as noted in recent testimony, ‘for FY 2021, every $1 of the FTC’s cost returned an estimated $36 in FTC-provided benefits to consumers.’ With more resources, however, the FTC could do more.”

On a related note, the FTC has extended its comment period for its upcoming workshop on Protecting Kids From Stealth Advertising in Digital Media to be held October 19, 2022. The new deadline for interested parties to submit comments is now November 18, 2022.

By Peter Craddock

Since enforcement of the General Data Protection Regulation (GDPR) began across the EU in May 2018, it has revealed various national trends and differences in approach. Yet one difference seems to dwarf all others: the variation in the amount of fines for GDPR violations. This led the European Data Protection Board (EDPB) to publish new guidelines in May 2022 on the calculation of administrative fines under the GDPR.

The EDPB’s proposed methodology includes a formula for reaching a “starting amount” for fines, one that can afterward be adapted based on mitigating and aggravating circumstances. This formula is what we included in our GDPR fine calculator, DeFine, available here.

But a new methodology could lead to changes, so we analyzed over 300 fines, notably the top 250 fines imposed on companies with an identifiable turnover. Based on our analysis, Italy has imposed by far the largest number of fines that would fall on the “high” end of the scale under the new EDPB methodology, while across all supervisory authorities, fines on companies with a turnover of more than 250 million EUR fall overwhelmingly on the “low” end of the scale.
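To see why fines on large-turnover companies cluster at the “low” end, it helps to express each fine as a share of the GDPR’s statutory maximum, which for the most serious infringements is the higher of 20 million EUR or 4% of total worldwide turnover (Article 83(5) GDPR). The sketch below uses those statutory figures; the low/mid/high cut-offs are purely illustrative and are not the EDPB’s.

```python
# Express a GDPR fine as a fraction of the statutory maximum under
# Article 83(5) GDPR: the higher of 20M EUR or 4% of worldwide turnover.
# The low/mid/high cut-offs below are illustrative, not the EDPB's.

def gdpr_statutory_max(worldwide_turnover_eur: float) -> float:
    """Article 83(5) cap: higher of 20M EUR or 4% of worldwide turnover."""
    return max(20_000_000, 0.04 * worldwide_turnover_eur)

def fine_band(fine_eur: float, worldwide_turnover_eur: float) -> str:
    """Classify a fine by its share of the statutory maximum
    (thresholds are illustrative only)."""
    share = fine_eur / gdpr_statutory_max(worldwide_turnover_eur)
    if share < 0.1:
        return "low"
    if share < 0.5:
        return "mid"
    return "high"

# A 1M EUR fine on a company with 500M EUR turnover is 5% of its 20M cap,
# i.e. firmly on the "low" end of the scale.
```

On this kind of measure, even a headline-grabbing fine against a company with billions in turnover can sit at a small fraction of the statutory ceiling, which is consistent with the pattern our analysis found.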

Our key conclusion: if unchanged, this methodology could lead to significantly higher fines in the future. Read our analysis here.