By Sheila A. Millar and Tracy P. Marshall

Deceptive reviews and endorsements have come under increasing scrutiny from the Federal Trade Commission (FTC or Commission). In the last few years, the Commission has brought myriad complaints against companies for engaging in such practices. Last year, the FTC resurrected its long-dormant Penalty Offense Authority, warning more than 700 companies in a Notice of Penalty Offenses that they could be liable for significant civil penalties if they post misleading reviews or endorsements. Last month, for the second time in just a few months, the FTC announced an Advance Notice of Proposed Rulemaking (ANPR) under Section 18 of the Federal Trade Commission Act with a view to issuing a possible Trade Regulation Rule (TRR), asserting that the Commission’s current enforcement tools are insufficient to deter bad actors. The October 20, 2022 announcement, as with this summer’s announcement of an ANPR on commercial surveillance and data security, is the first step in a formal rulemaking process. The FTC seeks public comment on whether a mandatory rule is needed to address the growing problem of fake or hijacked reviews, suppression of negative reviews, or reviews that lack proper disclosures of a connection with the advertiser.

The FTC argues in the October ANPR that its “current remedial authority is limited. Monetary relief is no longer available under Section 13(b), disgorgement is not available under Section 19(b), 15 U.S.C. 57b(b), and, while the Commission has deployed new tools to combat this problem, in many cases, it remains difficult to obtain monetary relief. Under these circumstances, the availability of a civil penalty remedy may provide a potent deterrent.” The FTC goes on to stress that the Commission’s Guides Concerning the Use of Endorsements and Testimonials in Advertising (Endorsement Guides) provide useful recommendations to businesses on how to avoid making deceptive marketing claims, but they are not enforceable. The FTC asserts that the deterrence effect of a rule with the possibility of civil penalties “would benefit consumers, help level the playing field, and not burden legitimate marketers.”

The vote approving publication of the ANPR in the Federal Register was 3-1. Commissioner Christine S. Wilson, voting against issuance, cited the cost of the rulemaking and the sufficiency of the FTC’s existing tools for holding businesses accountable for deceptive reviews and endorsements. In a statement, Wilson said that “the Commission already has a multi-pronged strategy in place to combat this issue,” which includes the Endorsement Guides to educate businesses regarding their obligations and a companion business guidance piece. Wilson also referenced the Commission’s request for comments on potential updates and revisions to the Endorsement Guides issued earlier this year, and the Notice of Penalty Offenses issued in October 2021, which may enable the Commission to obtain civil penalties from marketers that use fake or deceptive endorsements or reviews.

The FTC invites stakeholders to weigh in on any aspect of the issues covered in the ANPR but is particularly interested in feedback on the prevalence of fake or deceptive reviews and endorsements, the costs and benefits of a rule that would address them, and alternatives to such a rulemaking, such as additional consumer and business education. The deadline for comments is 60 days after the ANPR’s publication in the Federal Register (which has not occurred as of the time of this writing). While the FTC’s commercial surveillance ANPR is expected to draw significant opposition from industry, false reviews do create competitive harms to businesses, and this proceeding is a chance for businesses and consumers to address those harms.

By Sheila A. Millar

On October 19, 2022, the Consumer Product Safety Commission (CPSC or Commission) approved a Final Rule imposing mandatory safety requirements for clothing storage units (CSUs) that will, according to the CPSC, “significantly change the way clothing storage units are tested and labeled” to address furniture tip-over hazards. The rulemaking has been in the works since 2017, after the CPSC reviewed incident data for furniture tip-overs and determined that CSUs were the primary furniture category involved in fatal and non-fatal incidents caused by tip-overs.

The Rule will supersede voluntary standards, including ASTM F2057-19, Standard Consumer Safety Specification for Clothing Storage Units.

Application

The Rule becomes effective 180 days after publication in the Federal Register (which has not occurred as of this writing) and applies to CSUs manufactured after its effective date. The Rule defines a “CSU” as:

  • “a consumer product that is a freestanding furniture item,
  • with drawer(s) and/or door(s), that may be reasonably expected to be used for storing clothing,
  • that is designed to be configured to be greater than or equal to 27 inches in height (including units with adjustable heights that reach 27 inches),
  • has a mass greater than or equal to 57 pounds with all extendable elements filled with at least 8.5 pounds/cubic foot times their functional volume (cubic feet),
  • has a total functional volume of the closed storage greater than 1.3 cubic feet, and
  • has a total functional volume of the closed storage greater than the sum of the total functional volume of the open storage and the total volume of the open space.”

The Rule excludes lightweight units weighing under 57 lbs. when filled with clothing, as well as products such as shelving units or office furniture that do not meet the definition of a CSU.
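For a rough sense of how the numeric prongs of the definition interact, the sketch below checks a hypothetical unit against the height, loaded-mass, and functional-volume thresholds. It is purely illustrative: the function names and example figures are our own, and the qualitative prongs of the definition (freestanding, reasonably expected to store clothing) and the Rule's actual test methods still govern.

```python
# Illustrative screening of a furniture item against the numeric thresholds
# in the CPSC's CSU definition (height, loaded mass, functional volume).
# A rough sketch for discussion only, not a compliance determination.

FILL_DENSITY_LB_PER_FT3 = 8.5  # clothing-representative load per the definition

def loaded_mass(unit_mass_lb: float, extendable_volume_ft3: float) -> float:
    """Unit mass with all extendable elements filled at 8.5 lb/ft3 of functional volume."""
    return unit_mass_lb + FILL_DENSITY_LB_PER_FT3 * extendable_volume_ft3

def may_be_csu(height_in: float, unit_mass_lb: float,
               extendable_volume_ft3: float, closed_volume_ft3: float,
               open_volume_ft3: float, open_space_ft3: float) -> bool:
    """Checks only the numeric prongs of the CSU definition; the qualitative
    prongs (freestanding, used for storing clothing, etc.) still apply."""
    return (
        height_in >= 27
        and loaded_mass(unit_mass_lb, extendable_volume_ft3) >= 57
        and closed_volume_ft3 > 1.3
        and closed_volume_ft3 > (open_volume_ft3 + open_space_ft3)
    )

# Hypothetical four-drawer dresser: 30 in tall, 45 lb empty, 4.0 ft3 of drawers.
print(may_be_csu(height_in=30, unit_mass_lb=45, extendable_volume_ft3=4.0,
                 closed_volume_ft3=4.0, open_volume_ft3=0.0, open_space_ft3=0.0))
```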

New Stability Requirements

Under the Rule, CSU stability testing must “reflect real-world factors, like multiple open drawers, drawers containing clothing-representative loads, angling CSUs to replicate the effects of placement on the carpet and forces a child exerts while climbing or pulling on a CSU, all of which are shown to occur during CSU tip-overs and contribute to their instability.” The Rule also includes testing methods for interlocks that prevent all drawers from being opened at the same time.

Information Marking Requirements

The new standard mandates that CSUs be marked and labeled with safety and identification information and that this information be accompanied by a tag providing performance and technical data about the product’s stability.

*   *  *

The Commission voted 3-to-1 to approve the standard. Commissioner Peter Feldman, voting no, issued a statement arguing that a Supplemental Notice of Proposed Rulemaking (SNPR) was needed to minimize the chance of a successful legal challenge because the draft Final Rule differed from the version issued for public comment. He also suggested that issuing a Supplemental Notice would have allowed the House to consider the Stop Tip-Overs of Unstable, Risky Dressers on Youth (STURDY) Act, which has passed the Senate. It remains to be seen whether furniture industry members will seek to challenge the Rule or whether Congress will pass the STURDY Act.

By Peter Craddock

The Internet of Things (IoT) segment has grown, and with it have come many examples of vulnerable products, from babycams whose feeds could be viewed by strangers online to hackable implantable cardiac devices. There are also infamous examples of botnets (i.e., clusters of hacked devices) featuring millions of IoT devices with one common trait: weak security.

The U.S. has long had in place both laws and standards designed to address data security. In Europe, the General Data Protection Regulation (GDPR) imposes a general obligation to secure personal data, and recent developments show a growing focus on the security of information in general, not just personal data.

In 2020, the UK government announced that it would work on legislation requiring consumer connected products to comply with security requirements or specific standards. One of the requirements announced was, for instance, a prohibition on universal default passwords. This requirement, in turn, triggers an obligation to ensure that all passwords within a connected device are unique and strong, so that hackers cannot gain easy access to millions of products once a single default password has been cracked. The resulting Product Security and Telecommunications Infrastructure Bill, currently being considered by the House of Lords, would give the UK Secretary of State authority to impose specific security requirements for “internet-connectable” and “network-connectable” products or require compliance with a given standard.
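As a purely illustrative sketch of the kind of practice the default-password prohibition points toward, the snippet below generates a unique, random credential for each device at provisioning time. It is our own simplified example, not language from the Bill; a real provisioning system would also bind and protect the credential (for example, via a printed label or secure element) rather than print it.

```python
# Minimal sketch: generate a unique, random default password per device at
# provisioning time, instead of shipping every unit with the same credential.
# Illustrative only; production systems would hash/protect stored credentials.

import secrets
import string

ALPHABET = string.ascii_letters + string.digits

def provision_default_password(length: int = 16) -> str:
    """Return a cryptographically random per-device password."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# Each manufactured unit gets its own credential.
device_credentials = {f"device-{i:04d}": provision_default_password() for i in range(3)}
for device_id, password in device_credentials.items():
    print(device_id, password)
```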

In the European Union, the European Commission published, on September 15, 2022, a proposal for a “Cyber Resilience Act,” an EU Regulation “on horizontal cybersecurity requirements for products with digital elements.” This Regulation would require any manufacturer of a “product with digital elements” (i.e., “any software or hardware product and its remote data processing solutions”) to meet minimum cybersecurity requirements in order to place that product on the EU market.

The concept of a “product with digital elements” does not appear to be limited to combined hardware-and-software products: a number of the product categories listed in an annex to the draft Cyber Resilience Act are today purely “software” products, such as a wide range of cybersecurity tools. The scope of the Cyber Resilience Act is therefore not limited to IoT products.

The draft Cyber Resilience Act calls in effect for security by design by requiring manufacturers to design, develop, and produce products in accordance with cybersecurity requirements. Notably, manufacturers will be required to undertake an “assessment of the cybersecurity risks associated with [the] product and take the outcome of that assessment into account during the planning, design, development, production, delivery and maintenance phases […] with a view to minimising cybersecurity risks, preventing security incidents and minimising the impacts of such incidents.” This echoes provisions of the draft “NIS 2” Directive (a proposal for a Directive “on measures for a high common level of cybersecurity across the Union”) as well as the principle of “data protection by design and by default” found in the GDPR.

Under the provisions of the draft Cyber Resilience Act, manufacturers will have reporting obligations in relation to actively exploited vulnerabilities on the one hand and security incidents on the other. They will be required to inform ENISA, the EU Cybersecurity Agency, of (i) “any actively exploited vulnerability” contained in the product and (separately) (ii) “any incident having [an] impact on the security” of the product, in each case “within 24 hours of becoming aware of it.” In addition, manufacturers will have to inform users of the incident “without undue delay and after becoming aware” of it. Beyond information regarding the incident, they would also have to inform users, “where necessary, about corrective measures that the user can deploy to mitigate the impact of the incident.”
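As a simple illustration of the timing obligation (and nothing more), the snippet below computes the 24-hour ENISA notification deadline from the moment a manufacturer becomes aware of an exploited vulnerability or incident; the example timestamp is hypothetical.

```python
# Simple illustration of the 24-hour clock for notifying ENISA once a
# manufacturer becomes aware of an actively exploited vulnerability or a
# security incident, as described in the draft text.

from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=24)

def enisa_notification_deadline(awareness_time: datetime) -> datetime:
    """Deadline to notify ENISA, counted from the moment of awareness."""
    return awareness_time + NOTIFICATION_WINDOW

aware_at = datetime(2022, 10, 1, 9, 30, tzinfo=timezone.utc)  # hypothetical timestamp
print("Notify ENISA by:", enisa_notification_deadline(aware_at).isoformat())
```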

Moreover, the draft Cyber Resilience Act requires manufacturers to carry out conformity assessment procedures, draw up technical documentation, and ensure that the product bears the relevant CE marking. The interplay between these requirements and existing product conformity assessment procedures will need to be carefully evaluated.

The draft Cyber Resilience Act does not place the regulatory burden only on manufacturers. Importers and distributors involved in placing products on the EU market are subject to specific obligations as well, notably in relation to documentation and CE markings. An importer or distributor will moreover be subject to the full obligations of a manufacturer if, for example, the product is marketed under the importer/distributor’s name or trademark, or if the importer/distributor carries out “a substantial modification” of the product already placed on the market.

The security requirements themselves appear to be future-proof and technology-neutral; consider, for instance, the obligation to ensure products are “delivered with a secure by default configuration, including the possibility to reset the product to its original state” or that they are “designed, developed and produced to limit attack surfaces, including external interfaces.” In many ways, these requirements appear to reflect the common principles underlying information security best practices. Products belonging to a “critical” category (which covers a wide range of categories, such as identity management systems, password managers, malware detection software, microcontrollers, operating systems, routers, and smart meters) are subject to stricter rules, in particular a specific conformity assessment procedure.

The draft Cyber Resilience Act includes links to the draft AI Regulation as well (also under discussion at the Commission). If a product is classified as a “high-risk” AI system under the draft AI Regulation, compliance with the Cyber Resilience Act requirements will automatically be considered as compliance with the cybersecurity requirements under the AI Regulation.

As with other recent legislation (from the GDPR to the Digital Markets Act and Digital Services Act), the draft Cyber Resilience Act includes tough penalties to ensure compliance: non-compliance can lead to recall or withdrawal of the product from the market or other corrective action, as well as fines of up to 15 million EUR or 2.5% of total worldwide turnover, whichever is higher. Nor are these fines the maximum risk for companies in case of non-compliance, as the draft Cyber Resilience Act explicitly states that it is “without prejudice to [the GDPR],” which could raise important questions of liability where a particular action or behaviour infringes both sets of rules.
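As a quick worked example of that fine ceiling (the turnover figure below is a hypothetical of our own choosing):

```python
# Illustrative calculation of the maximum administrative fine under the draft
# Cyber Resilience Act: the higher of EUR 15 million or 2.5% of total
# worldwide annual turnover.

def max_cra_fine(worldwide_turnover_eur: float) -> float:
    return max(15_000_000, 0.025 * worldwide_turnover_eur)

print(max_cra_fine(2_000_000_000))  # EUR 2 bn turnover -> ceiling of EUR 50 million
```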

Now is the time to ensure that your information security practices are up to speed and that all levels of your organization are properly involved in devising, rolling out, and maintaining a strong cybersecurity strategy that takes into account all applicable legislation. Companies operating globally will, of course, also need to follow relevant national policy and guidance as it develops.

For any questions on the Cyber Resilience Act or on how to build such a cybersecurity strategy, reach out to the authors or your usual contact at Keller and Heckman LLP.

By Sheila A. Millar and Tracy P. Marshall

At a press conference on August 11, 2022, the Federal Trade Commission (FTC or Commission) announced an Advance Notice of Proposed Rulemaking (ANPR), which was published, along with a fact sheet, to explore potential new rules governing what the FTC characterizes as prevalent “commercial surveillance” and “lax data security practices.” The FTC issued the ANPR pursuant to its Section 18 authority under the Magnuson-Moss Act, which authorizes the Commission to promulgate, modify, and repeal rules that define with specificity unfair or deceptive acts or practices within the meaning of Section 5(a)(1) of the FTC Act. This broad and complex ANPR was published in the Federal Register on August 22 (87 Fed. Reg. 51273), and comments are due October 21, 2022. The FTC will host a public forum on September 8, 2022, featuring a structured panel discussion and an opportunity for stakeholders to share their views on the ANPR, subject to a two-minute time limit.

What’s Behind the ANPR?

FTC Chair Lina Khan said in a statement that “firms now collect personal data on individuals on a massive scale and in a stunning array of contexts, resulting in an economy that, as one scholar put it, ‘represents probably the most highly surveilled environment in the history of humanity’. This explosion in data collection and retention, meanwhile, has heightened the risks and costs of breaches—with Americans paying the price.” The FTC offers several reasons to justify the proposal. First, the FTC argues that its inability to fine companies for egregious first-time offenses under its Section 5 authority may “insufficiently deter future law violations.” Second, while the FTC can enjoin conduct that violates Section 5, such relief may be inadequate in the context of alleged commercial surveillance and lax data security practices. Third, the FTC argues that even in instances in which it can obtain monetary relief for violations of Section 5, such relief may be difficult to obtain if certain practices do not cause direct financial injury or the harm cannot be quantified. Lastly, the FTC claims that a rule governing commercial surveillance and data security could provide clarity and predictability about the FTC Act’s application to existing and emergent commercial surveillance and data security practices. The vast, unfocused scope of the ANPR should concern any business engaged in data collection from consumers, as virtually all data collection activities could be implicated.

Key Terms

The FTC proposes several specific definitions for purposes of the rule:

  • “Commercial surveillance” is “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information. These data include both information that consumers actively provide—say, when they affirmatively register for a service or make a purchase—as well as personal identifiers and other information that companies collect, for example, when a consumer casually browses the web or opens an app.”
  • “Data security” is described as “breach risk mitigation, data management and retention, data minimization, and breach notification and disclosure practices.”
  • “Consumer” includes “businesses and workers, not just individuals who buy or exchange data for retail goods and services.”

Questions

The FTC has posed a variety of questions (95 in all) that touch on both advertising and privacy issues. The FTC asks for public feedback generally on “(a) the nature and prevalence of harmful commercial surveillance practices, (b) the balance of costs and countervailing benefits of such practices for consumers and competition, and (c) proposals for protecting consumers from harmful and prevalent commercial surveillance practices.” More specifically, the FTC solicits feedback on subjects with headings ranging from “to what extent do commercial surveillance practices or lax security measures harm consumers?” to “automated decision-making systems.” Of note are a variety of questions pertaining to children and teens, although Section 18(h) of the FTC Act restricted the Commission’s ability to act in the then-pending, infamous “kid-vid” proceeding, in which the FTC proposed to ban advertising to younger children and which earned the FTC the “national nanny” moniker. Section 18 also restricts the Commission’s ability to issue rules in “any substantially similar proceeding on the basis of a determination by the Commission that such advertising constitutes an unfair act or practice in or affecting commerce.” This is expected to be a point raised in comments.

Commissioners Phillips and Wilson Dissent

The vote to approve publication of the ANPR was 3-2. Commissioners Noah Phillips and Christine Wilson, voting no, each issued dissenting statements. In a strongly worded statement, Commissioner Phillips, who has since announced that he is leaving the FTC, questioned whether the FTC was overstepping its authority and recasting itself “as a legislature, with virtually limitless rulemaking authority where personal data are concerned.” In addition, Phillips claimed the ANPR was too broad and “provides no notice whatsoever of the scope and parameters of what rule or rules might follow, thereby undermining the public input and congressional notification processes. It is the wrong approach to rulemaking for privacy and data security.” In her statement, Commissioner Wilson expressed concern that the ANPR could undermine efforts to pass a federal privacy law. She also asserted that elements of the ANPR constituted agency overreach and wandered “far afield of areas for which we have clear evidence of a widespread pattern of unfair or deceptive practices.”

The finalized agenda for the FTC’s September 8 public forum is here. This proceeding, as well as the FTC’s October 19 event, “Protecting Kids from Stealth Advertising in Digital Media,” will no doubt generate lively debate.

By Sheila A. Millar

On August 24, 2022, the Federal Trade Commission (FTC or Commission) submitted a report to the Congressional Committees on Appropriations detailing current resources and personnel dedicated to COPPA enforcement, the number of COPPA violation investigations over the past five years, and the types of relief obtained in completed investigations. The report was submitted in response to a request by Congress under the Consolidated Appropriations Act of 2022.

The FTC report affirms that protecting children’s privacy remains a Commission priority. The COPPA program is served by 9-11 full-time, dedicated staff members; staff from other divisions, such as the Bureau of Consumer Protection’s Division of Privacy and Identity Protection, also work on COPPA issues. Between May 1, 2017 and May 1, 2022, the FTC opened 80 investigations of potential COPPA violations. Over the last five years, the Commission has expanded its remedies, such as requiring WW International/Kurbo to delete its proprietary algorithms, or mandating that Google/YouTube re-review apps on its ad exchange to identify and ban additional child-directed apps and to track which apps and websites have been banned or removed from its platform. Requiring companies to implement a comprehensive privacy program, often subject to periodic, independent monitoring, is an increasingly frequent element in enforcement agreements.

The Commission has also imposed larger fines for COPPA violations. The FTC reports that “in six of the 10 cases alleging violations of COPPA, the Commission obtained a civil penalty of at least $1.5 million,” including a $170 million fine in the Google/YouTube matter, “one of the largest civil penalties ever obtained, worldwide, for a privacy violation.”

The report ends with a plea for additional funding from Congress: “The Commission makes every effort to use its resources efficiently: as noted in recent testimony, ‘for FY 2021, every $1 of the FTC’s cost returned an estimated $36 in FTC-provided benefits to consumers.’ With more resources, however, the FTC could do more.”

On a related note, the FTC has extended the comment period for its upcoming workshop, Protecting Kids From Stealth Advertising in Digital Media, to be held October 19, 2022. The deadline for interested parties to submit comments is now November 18, 2022.

By Peter Craddock

Since it began in May 2018, enforcement of the General Data Protection Regulation (GDPR) across the EU has revealed various national trends and differences in approach. Yet one difference seems to dwarf all others: the variation in the amount of fines imposed for GDPR violations. This led the European Data Protection Board (EDPB) to publish new guidelines in May 2022 on the calculation of administrative fines under the GDPR.

The EDPB’s proposed methodology includes a formula for reaching a “starting amount” for fines, one that can afterward be adapted based on mitigating and aggravating circumstances. This formula is what we included in our GDPR fine calculator, DeFine, available here.
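For readers who want a feel for the general shape of such a calculation, the sketch below mirrors the broad structure of the EDPB approach: identify the applicable legal maximum under Article 83 GDPR, scale it by the seriousness of the infringement, then adjust for aggravating and mitigating circumstances. The seriousness fraction and adjustment factor in the example are illustrative placeholders of our own, not figures from the EDPB guidelines; DeFine implements the published methodology.

```python
# Rough sketch of a GDPR fine "starting amount" calculation in the spirit of
# the EDPB methodology: applicable legal maximum, seriousness-based starting
# point, then adjustment for aggravating/mitigating circumstances.
# The percentages below are illustrative placeholders only.

def legal_maximum(turnover_eur: float, higher_tier: bool) -> float:
    """Art. 83 GDPR ceiling: 10M EUR / 2% for the lower tier, 20M EUR / 4% for the higher tier."""
    if higher_tier:
        return max(20_000_000, 0.04 * turnover_eur)
    return max(10_000_000, 0.02 * turnover_eur)

def starting_amount(turnover_eur: float, higher_tier: bool,
                    seriousness_fraction: float) -> float:
    """Starting point expressed as a fraction of the legal maximum (placeholder value)."""
    return seriousness_fraction * legal_maximum(turnover_eur, higher_tier)

def adjusted_fine(start: float, adjustment_factor: float,
                  turnover_eur: float, higher_tier: bool) -> float:
    """Apply an aggravating/mitigating adjustment, capped at the legal maximum."""
    return min(start * adjustment_factor, legal_maximum(turnover_eur, higher_tier))

# Hypothetical company: EUR 500M turnover, higher-tier infringement,
# seriousness assumed at 15% of the maximum, mild aggravation (+20%).
start = starting_amount(500_000_000, higher_tier=True, seriousness_fraction=0.15)
print(round(adjusted_fine(start, 1.2, 500_000_000, higher_tier=True)))  # -> 3600000
```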

But a new methodology could lead to changes, so we analyzed over 300 fines, including the top 250 fines imposed on companies with an identifiable turnover. Based on our analysis, Italy has imposed by far the largest number of fines that would fall on the “high” end of the scale under the new EDPB methodology, while across all supervisory authorities, fines for companies with a turnover of more than 250 million EUR are overwhelmingly on the “low” end of the scale.

Our key conclusion: if unchanged, this methodology could lead to significantly higher fines in the future. Read our analysis here.

By Sheila A. Millar and Anushka N. Rahman

In a notice published in the Federal Register on August 8, 2022, U.S. Consumer Product Safety Commission (CPSC) staff announced that the CPSC will hold a workshop on October 13, 2022, to discuss CPSC’s eFiling Program and the Commission’s plans for a joint Beta Pilot Test with U.S. Customs and Border Protection (CBP) (previously announced in the Federal Register on June 10, 2022).

During the test, which runs for six months, 30 to 50 participants will use the Partner Government Agency (PGA) Message Set to electronically file certificate data with CBP. The Beta Pilot is the second test carried out by the agencies to assess eFiling of data from a compliance certificate for regulated consumer products. Its purposes are “to develop and test the IT infrastructure necessary to support a full-scale eFiling requirement, inform CPSC’s pending rulemaking, develop internal procedures to support enforcement, and assist CPSC to target imports more accurately by enhancing targeting of non-compliant trade and facilitating the flow of legitimate trade.”
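By way of illustration only, the snippet below stages the kind of certificate-of-compliance data elements an importer already maintains (product identification, cited rules, manufacturer or importer, manufacture and testing details, and the testing body). It is our own simplified record, not the CPSC/CBP PGA Message Set schema, which will be addressed at the workshop and in the pending rulemaking.

```python
# Simplified illustration of the kind of certificate data an importer might
# stage for electronic filing. Not the actual PGA Message Set format; all
# values below are hypothetical.

from dataclasses import dataclass, asdict
from typing import List
import json

@dataclass
class ComplianceCertificate:
    product_description: str
    cited_rules: List[str]          # each rule or standard certified to
    importer_or_manufacturer: str
    manufacture_date_place: str
    test_date_place: str
    third_party_lab: str            # for children's products

cert = ComplianceCertificate(
    product_description="Hypothetical toy vehicle, model TV-100",
    cited_rules=["ASTM F963-17", "16 CFR part 1303"],
    importer_or_manufacturer="Example Imports LLC, Anytown, USA",
    manufacture_date_place="2022-06, Example City",
    test_date_place="2022-07, Example City",
    third_party_lab="Hypothetical CPSC-accepted laboratory",
)

print(json.dumps(asdict(cert), indent=2))  # payload staged for eFiling
```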

Workshop topics will include:

  • CPSC’s Enforcement at the Ports
    • CPSC and CBP Collaboration Overview
    • CPSC Targeting of Imported Products
  • CPSC’s Certificate Requirements
    • Statutory and Regulatory Requirements
    • Enforcement Efforts
    • Certificate Study
  • Overview of CPSC eFiling Program
    • Improved Enforcement/Facilitation of Legitimate Trade—Alpha Pilot
    • Beta Pilot Test Requirements
  • CPSC Procedures
    • CPSC’s Product Registry
  • CBP Procedures
    • CPSC’s draft Customs and Trade Automated Interface Requirements (CATAIR)
    • CPSC’s Risk Assessment Methodology (RAM) System and Use of Risk Scores for Enforcement
  • Third-Party Involvement in Certificate and eFiling Requirements
    • Role of brokers in meeting CPSC’s PGA Message Set requirement
    • Role of laboratories in meeting CPSC’s certificate requirement
  • Import Issues for eFiling
    • eCommerce
    • De minimis shipments
    • Direct-to-consumer shipments
    • International Mail Facilities
    • Foreign Trade Zones
    • Filing deadlines for different modes of transport

The October 13 workshop will be held from 9 a.m. to 4 p.m. ET both virtually and in person at the CPSC’s headquarters in Bethesda, MD. Interested parties must register by Thursday, October 6, 2022. Comments may be submitted following the workshop until November 11, 2022.

By Sheila A. Millar and Anushka N. Rahman

Two recent Senate bills show that Congress is working to improve the nation’s patchwork of recycling laws. On July 28, 2022, the Senate voted unanimously to pass The Recycling Infrastructure and Accessibility Act of 2022 (RIAA) and The Recycling and Composting Accountability Act (RCAA). The first directs the U.S. Environmental Protection Agency (EPA) to establish a pilot grant program to fund recycling projects at the state and local level, while the second aims to improve EPA reporting on recycling and composting.

The RIAA, introduced by Senator Shelley Moore Capito (R-WV), charges the EPA with establishing a pilot grant program to fund eligible state, local, Native, or public-private partnership projects “that will significantly improve accessibility to recycling systems through investments in infrastructure in underserved communities through the use of a hub-and-spoke model for recycling infrastructure development.” The RCAA, introduced by Senator Tom Carper (D-DE), directs the EPA to track and publish data on recycling and composting rates across the country. The information would be used to help improve performance and influence future projects, including a potential national composting strategy.

The two bills demonstrate that expanding the recycling and composting infrastructure remains a Congressional priority. The proposed legislation not only enjoys broad bipartisan support but has also garnered widespread support from industry.

By Sheila A. Millar, Tracy P. Marshall, and Peter Craddock

On May 12, 2022, the European Data Protection Board published guidelines with a methodology for calculating fines for violations of the General Data Protection Regulation (GDPR). These guidelines were subject to a public consultation until June 27, 2022.

Because these guidelines are likely to have an influence on future decisions by data protection authorities in the European Union, Keller and Heckman LLP has developed DeFine, a GDPR fine calculator tool based on that methodology. It is accessible online and free of charge here.

While we hope that organizations will not need to use DeFine too often in dealing with regulators, it may serve as an internal company awareness-raising mechanism to enhance understanding of data privacy risks.

Feel free to reach out to the creator, Peter Craddock, or to any other Keller and Heckman LLP contact if you have any questions or suggestions, would like to share feedback on DeFine, or require assistance on data privacy or security.

By Sheila A. Millar and Tracy P. Marshall

As we previously reported, the Federal Trade Commission (FTC) seeks comments on proposed updates to its Guides Concerning the Use of Endorsements and Testimonials in Advertising (Endorsement Guides). The FTC’s notice was published in the Federal Register on July 26, 2022 (87 Fed. Reg. 44288), and comments must be received by September 26, 2022.

The Endorsement Guides are intended to help businesses ensure that their advertising testimonials and endorsements are not deceptive or misleading and that material connections between endorsers and companies are disclosed. As we discussed earlier, the FTC’s proposed updates to the Endorsement Guides focus on advertisers that post fake positive reviews or delete negative reviews and advertisers whose disclosures fall short. The changes would also add, among other things, more illustrative examples to help clarify the Guides’ provisions and new sections on endorsements and consumer reviews.

Additionally, the FTC proposes adding a new, very general provision regarding children, namely, that “[p]ractices which would not ordinarily be questioned in advertisements directed to adults might be questioned” if they are directed to children. However, the preamble notes that the FTC suggested a similar provision in 1972 (after the kid-vid proceeding) but withdrew it in 1976. Now, the FTC suggests that “even as more evidence is gathered about the effects of children’s advertising, there is ample basis to recognize that children may react differently than adults to endorsements in advertising or to related disclosures.” Chair Lina Khan’s statement notes that the FTC currently lacks the full evidentiary basis to support specific guidance or propose best practices, and she points to the planned October 19, 2022 workshop, “Protecting Kids from Stealth Advertising in Digital Media,” as a vehicle to obtain more information. (The comment deadline was July 18.)

It will be important for businesses to weigh in on all aspects of the FTC’s proposals.