By Sheila A. Millar and Mike Gentine

After completing its review of testing and labeling regulations for children’s products, staff of the Consumer Product Safety Commission (CPSC) recommended leaving the current product testing and component part testing regulations as is. The CPSC carried out this review of the “Testing and Labeling Regulations Pertaining to Product Certification of Children’s Products, Including Reliance on Component Part Testing” (testing rule) under section 610 of the Regulatory Flexibility Act (RFA), which requires a review 10 years after publication for any rule that has a significant impact on a substantial number of small businesses. Along with 16 C.F.R. part 1109, “Conditions and Requirements for Relying on Component Part Testing or Certification, or Another Party’s Finished Product Certification, to Meet Testing and Certification Requirements” (component part testing rule), the testing rule was up for review this year, as both rules have a significant impact on many small businesses.

The testing rule lays out rules and standards for manufacturers to follow in obtaining third party testing for children’s products periodically and when there has been a material change in a product’s design or manufacturing process. It also specifies how products may be labeled to indicate compliance with Section 14 of the Consumer Product Safety Act (CPSA). The component part testing rule specifies how manufacturers can use third party tests of component parts of products to certify the compliance of the finished product. The component part testing rule was intended to reduce the costs and other burdens of testing finished children’s products.

Section 610 requires agencies to consider five factors in reviewing rules to minimize any significant economic impact of the rule on small entities:

  1. The continued need for the rule;
  2. The nature of complaints or comments received concerning the rule from the public;
  3. The complexity of the rule;
  4. The extent to which the rule overlaps, duplicates, or conflicts with other Federal rules, and, to the extent feasible, with State and local governmental rules; and
  5. The length of time since the rule has been evaluated or the degree to which technology, economic conditions, or other factors have changed in the area affected by the rule.

Following an analysis of the feedback received by staff during the 60-day public comment period and after considering the five factors, the CPSC concluded that no changes to the testing and component part testing rules were warranted at this time. The Commission acknowledged that third-party testing for compliance certification still imposes significant costs on some small businesses, but rejected requests for test burden relief, such as reducing the required frequency of periodic testing or revising the definition of small batch manufacturer, as either inconsistent with ensuring compliance or precluded by statute. The CPSC did note that additional guidance on using the component part testing rule could help small businesses use the rule to reduce their costs. Input from children’s product companies on that point may be useful in developing approaches that achieve both compliance and cost reduction goals.

To learn more about current product safety issues and regulatory considerations for connected devices, register now for our free webinar: Product Safety and Regulation of Connected Products, June 24 at noon.

By Sheila A. Millar

The Federal Trade Commission (FTC) and the Food and Drug Administration (FDA) recently sent warning letters to five dietary supplement companies – LeRoche Benicoeur/ConceiveEasy; EU Natural Inc.; Fertility Nutraceuticals LLC; SAL NATURE LLC/FertilHerb; and NS Products, Inc. – stating that advertising their products as treatments that can cure or treat infertility without substantiating evidence violates the FTC Act. Such claims also subject the products to FDA scrutiny as drugs under the Federal Food, Drug, and Cosmetic Act, which prohibits the introduction or sale of new drugs into interstate commerce without prior FDA approval. In each case, the agencies asserted that the companies promoted their products as able to “cure, treat, mitigate, or prevent disease.” Such assertions “establish that the product is a drug under section 201(g)(1)(B) of the Federal Food, Drug, and Cosmetic Act because it is intended for use in the cure, mitigation, treatment, or prevention of disease” and require FDA approval even if the products are labeled as dietary supplements.

For example, NS Products promised on its website that by using its NaturaCure supplement “You will get pregnant very fast and give birth to healthy children regardless of . . . how severe or chronic your infertility disorder.” Similarly, Fertility Nutraceuticals assured customers that its CONFLAM Forte supplements are “[W]ell suited for women with infertility, a history of implantation failure, chemical pregnancies and miscarriages or with known inflammatory conditions, like obesity, polycystic ovary syndrome (PCOS), severe allergies and autoimmune conditions” and were the “best fertility supplements to boost your chance of pregnancy or improve your IVF success rate.”

Warning letters have been an important tool the agencies have used during the COVID-19 pandemic to address COVID treatment claims. Claims that a product can treat a medical condition, whether it involves infertility, a virus, or something else, must be substantiated by competent and reliable scientific evidence, and companies must comply with applicable registration or other regulatory obligations or face potentially expensive consequences in the form of civil penalties or other enforcement actions.

By Sheila A. Millar and Tracy P. Marshall

The Federal Trade Commission (FTC) has released the final agenda for its first workshop on the use of “dark patterns” online, Bringing Dark Patterns to Light: An FTC Workshop, which will be held virtually on April 29, 2021. The workshop will explore how to define “dark patterns,” their prevalence, possible harms (including to vulnerable groups) and potential solutions, among other things. The agency is also soliciting comments on relevant issues (see our earlier post for a list of topics).

The issue is receiving broader attention on the policy front. The FTC workshop provides an opportunity to explore the issues in more detail, and interested parties are encouraged to submit comments.

By Sheila A. Millar and Tracy P. Marshall

The long trudge towards final regulations implementing the California Consumer Privacy Act (CCPA) continues. In December of last year, the California Attorney General issued a fourth set of proposed regulations. These additions were approved by the California Office of Administrative Law (OAL) on March 15, 2021 and took effect immediately. Here are the key changes businesses should know about.

New “Do Not Sell” Icon

The new regulations offer a voluntary opt-out icon that may be used in addition to (but not in place of) posting the notice of a California consumer’s right to opt-out of the sale of personal information.

Businesses must post the notice of right to opt-out on the webpage that consumers are directed to after clicking on the “Do Not Sell My Personal Information” link on their homepage (or landing page/menu in the case of mobile apps).

Businesses Must Streamline the Opt-Out Request Process

Businesses must ensure that their notices of the right to opt-out use simple language, are easy for consumers to understand, and require minimal steps to complete. Businesses cannot require consumers to click through or listen to reasons why they should not submit a request to opt-out, provide personal information that is not necessary to implement the request, or search or scroll through a privacy policy, similar document, or webpage to submit a request to opt-out.

Offline Opt-Out Notices

Businesses that collect personal information from consumers offline must also inform consumers by an offline method of their right to opt-out, as follows:

  • Businesses that collect personal information from consumers in a physical location may inform consumers of their right to opt-out via paper forms or signage
  • Businesses may inform consumers of their right to opt-out during a phone call in which the business collects personal information

In both scenarios, businesses must tell consumers where to find the opt-out information online.

Authorized Agents

California residents are permitted to use authorized agents to submit requests to know or to delete their personal information. The new regulations clarify that businesses may require consumers to prove that an agent has permission to submit the request and to verify their own identity directly with the business.

California Privacy Protection Agency Board Appointments

While the state continues to fine-tune the CCPA regulations – and application of the CCPA to employee information remains deferred until 2022 – the clock is already ticking on the newest iteration of California’s privacy law, the California Privacy Rights Act (CPRA). Although CPRA does not take effect until 2023, the ballot initiative directed establishment of the California Privacy Protection Agency (CPPA) in advance of the effective date. Governor Gavin Newsom, in conjunction with state officials, has appointed the first slate of CPPA members.

With the enactment of the Virginia Consumer Data Protection Act, and with other states also considering privacy legislation, the U.S. landscape is quickly becoming more confusing for consumers and businesses alike.

By Sheila A. Millar and Tracy P. Marshall

As Congress remains locked in a stalemate over the terms of a comprehensive federal privacy law, states continue to forge ahead. Following California, Virginia is the second U.S. state to enact its own comprehensive privacy law governing the collection and use of personal data. Governor Ralph Northam signed the Virginia Consumer Data Protection Act (CDPA) into law on March 2, 2021.

The CDPA applies to businesses that operate in Virginia or produce products or services that are targeted to Virginia residents, and (1) in any calendar year, control or process personal data of at least 100,000 Virginia residents, or (2) control or process personal data of at least 25,000 Virginia residents and derive more than 50% of gross revenue from the sale of personal data.
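The applicability thresholds above can be read as a simple two-prong test. The sketch below is purely illustrative (not legal advice, and not part of the statute); the function name and inputs are hypothetical, but the numeric thresholds come from the CDPA as described in this post.

```python
# Illustrative sketch of the CDPA applicability test described above.
# Thresholds (100,000 residents; 25,000 residents plus >50% of gross
# revenue from data sales) reflect the statute; names are hypothetical.
def cdpa_applies(operates_in_va_or_targets_va: bool,
                 va_residents_processed: int,
                 revenue_share_from_data_sales: float) -> bool:
    # Threshold condition: the business must operate in Virginia or
    # target products/services to Virginia residents.
    if not operates_in_va_or_targets_va:
        return False
    # Prong 1: personal data of at least 100,000 Virginia residents.
    if va_residents_processed >= 100_000:
        return True
    # Prong 2: at least 25,000 residents AND more than 50% of gross
    # revenue derived from the sale of personal data.
    return (va_residents_processed >= 25_000
            and revenue_share_from_data_sales > 0.5)
```

Note that the two prongs are alternatives: a data broker processing data of only 30,000 residents can still be covered if most of its revenue comes from selling that data.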

Concepts in the bill draw from other laws, such as the EU General Data Protection Regulation (GDPR), but the bill includes some pragmatic approaches designed to enhance privacy and to align with other laws, and in a manner that businesses can operationalize. Importantly, the CDPA does not authorize a private right of action.

Key Definitions

The CDPA provides several rights to “consumers,” defined as Virginia residents acting in an individual or household context, and not individuals acting in a commercial or employment context. The CDPA appears to borrow some of its terminology from the GDPR, namely, the terms “controller” (defined as “the natural or legal person that, alone or jointly with others, determines the purpose and means of processing personal data”), “processor” (defined as “a natural or legal entity that processes personal data on behalf of a controller”), and “personal data” (defined as “any information that is linked or reasonably linkable to an identified or identifiable natural person,” but excluding de-identified and publicly available information).

Consumer Rights

The CDPA grants consumers the right, subject to verification of their identity, to access, correct, delete, or obtain a copy of personal data, and the right to opt out of (1) the processing of personal data for the purposes of targeted advertising, (2) the sale of personal data, or (3) profiling “in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” Businesses as controllers are prohibited from discriminating against consumers for exercising their rights, but with some exceptions, such as offers in connection with loyalty, rewards, club card, and similar programs.

The CDPA allows for a parent or legal guardian to invoke these rights on behalf of a child. The term “child” is defined as an individual under 13, which aligns with the Children’s Online Privacy Protection Act (COPPA). Parental consent rights for the collection, processing, and sale of children’s personal data also are consistent with COPPA.

Business Obligations

In addition to responding to consumer requests described above, any business subject to the CDPA as a controller must provide a privacy notice that describes the categories of personal data processed, the purposes for processing data, how consumers can exercise their rights, the categories of personal data shared with third parties, the categories of third parties with whom personal data is shared, and how consumers can opt out of the sale of personal data to third parties or the processing of personal data for targeted advertising (if applicable). Controllers are also required to follow data minimization principles and to establish, implement, and maintain reasonable security practices to protect personal data.

Processors are required to assist controllers in meeting their obligations under the CDPA and controllers must have contracts in place with processors that impose specific requirements, as set forth in the CDPA.

The CDPA also requires that controllers obtain consent before they collect and process “sensitive data,” which includes data collected from children. However, the CDPA is drafted in a manner that avoids the possible conflict with COPPA; it prohibits processing of sensitive data concerning a known child unless the processing is in accordance with COPPA. This approach preserves the commonsense exceptions to parental consent and the “sliding scale” options for obtaining it, as well as the important “support for internal operations” exception to COPPA.

Similar to the GDPR, the CDPA requires that controllers conduct and document a data protection assessment when processing data for targeted advertising, engaging in the sale of personal data, processing personal data for profiling purposes, processing sensitive data, or engaging in processing activities that present a heightened risk of harm to consumers. Importantly, the bill takes a practical approach, establishing that a single assessment may address “a comparable set of processing obligations that include similar activities,” and that assessments conducted for purposes of compliance with other laws may comply if they have a reasonably comparable scope and effect. Businesses are not obligated to conduct mandatory audits.

Enforcement


The Attorney General has exclusive authority to enforce violations of the CDPA; there is no private right of action. Civil penalties of up to $7,500 may be imposed for each violation of the Act.

The CDPA will take effect on January 1, 2023. The CDPA model merits strong consideration by other U.S. jurisdictions considering comprehensive privacy legislation. But the real solution for consumers and businesses is, of course, a thoughtful federal privacy policy that preempts state law and not a patchwork of different state requirements.

By Sheila A. Millar and Tracy P. Marshall

The Federal Trade Commission (FTC) has issued orders to five e-cigarette manufacturers (JUUL Labs, Inc., R.J. Reynolds Vapor Company, Fontem US, LLC, Logic Technology Development LLC, and NJOY, LLC) seeking information about the companies’ 2019 and 2020 sales, advertising, and promotions. The FTC sent similar orders to the same companies in October 2019 seeking information for prior years as part of an ongoing FTC study of the rapidly expanding U.S. e-cigarette market.

The new compulsory orders request detailed information about flavors, the specific form of nicotine used in each product, sales, and giveaways for each brand; product placements; websites and social media accounts used to advertise, promote, or sell e-cigarette products; marketing and advertising expenditures for social media and other campaigns; promotional events (including those held on college campuses); the use of influencers and brand ambassadors; and other advertising matters. Any company that has received an FTC compulsory order knows first-hand how time-consuming it can be to respond. Responses are due no later than May 12, 2021.

Both the FTC and the Food and Drug Administration (FDA) have been active in reviewing vaping companies’ advertising practices. In 2019, the agencies issued warning letters to four e-liquid manufacturers that used influencers to promote their products because the influencers failed to include the FDA-required nicotine warning. The FTC took the opportunity to remind companies of their obligation to ensure that their influencers clearly and conspicuously disclose their relationships to the brands when promoting or endorsing products, as required by the FTC’s Endorsement Guides.

In addition to actions by the FTC and FDA, social media sites have updated their guidelines over the years to address advertisements and posts pertaining to e-cigarettes and other regulated products. Thus, there are a host of regulations and guidelines for vaping companies to consider when promoting their brands and products online and hiring or encouraging others to help spread the word.

By Sheila A. Millar and Tracy P. Marshall

On April 29, 2021, the Federal Trade Commission (FTC) will host a virtual public workshop to examine the nature and effects of “dark patterns” on online user behavior. “Bringing Dark Patterns to Light: An FTC Workshop” is expected to explore ways in which user interfaces can have the effect, intentionally or unintentionally, of obscuring, subverting, or impairing consumer autonomy, decision-making, or choice.

Topics include:

  • How dark patterns differ from sales tactics employed by brick-and-mortar stores
  • How they affect consumer behavior, including potential harms
  • Whether some groups of consumers are unfairly targeted or are especially vulnerable
  • What laws, rules, and norms regulate the use of dark patterns
  • Whether additional rules, standards, or enforcement efforts are needed to protect consumers

Suggestions for discussion topics and requests to participate as panelists should be submitted by March 15, 2021.

The FTC will be publishing a formal request for comments on dark patterns soon. Comments must be submitted by June 29, 2021.

As we indicated in an earlier post, the Consumer Product Safety Commission (CPSC) is planning to hold a webinar on March 2, 2021 to explore how the use of Artificial Intelligence (AI) and Machine Learning (ML) in consumer products potentially affects consumer product safety.

By Sheila A. Millar and Tracy P. Marshall

The proliferation of mobile devices and digital media allows consumers to take, post, and store more photos and videos than ever before. Since 2015, app developer Everalbum has operated the mobile app, Ever, which offers a means for users to store photos and videos on the company’s cloud servers. Everalbum told users they could deactivate their accounts at any time and the company would delete their images. According to a complaint from the Federal Trade Commission (FTC), however, despite assurances to the contrary, the company retained users’ photos and videos after they deactivated their accounts. In addition, Everalbum did more with users’ photos and videos than store them; it used them to create facial recognition technology without permission. The FTC alleged that these practices constituted unfair or deceptive acts or practices, in violation of Section 5(a) of the FTC Act.

Everalbum launched a feature called “Friends” that uses facial recognition technology to tag people in group photos. The app sent pop-up messages to Ever users in Texas, Illinois, Washington, and the European Union – jurisdictions with biometric laws in place – and provided an option to use facial recognition. For users who did not affirmatively consent, Everalbum disabled the “Friends” facial recognition feature. Users in other jurisdictions had no way to disable the facial recognition tool, which was activated by default.

Everalbum also assured users that it would delete their images if they deactivated their accounts. But according to the FTC, until at least October 2019, the company failed to do so. In addition, between September 2017 and August 2019, Everalbum allegedly combined the photos and videos it retained with millions of photos obtained from public sources and used the resulting datasets to develop facial recognition services, which it sold to its business customers and used to develop the Ever app.

Under the terms of the proposed agreement containing consent order with the FTC, Everalbum must delete (1) all photos and videos of Ever customers who deactivated their accounts, (2) all data developed from images of Ever users who did not consent to use of their images for facial recognition purposes, and (3) any facial recognition models or algorithms developed from Ever users’ photos or videos without explicit consent.

The Commission voted 5-0 to issue the proposed administrative complaint and accept the consent agreement. Commissioner Rohit Chopra issued a separate statement in which he voiced concerns over the use of facial recognition technology, believing it to be “fundamentally flawed.” He stressed the importance of state biometric laws, noting that “Everalbum took greater care when it came to these individuals in these states. The company’s deception targeted Americans who live in states with no specific state law protections. With the tsunami of data being collected on individuals, we need all hands on deck to keep these companies in check.”

The Everalbum settlement signals that the FTC is keeping close watch on how companies that use biometric technology handle consumer data.

By Sheila A. Millar

Artificial Intelligence (AI), Machine Learning (ML), and related technologies have the potential to dramatically change the nature of consumer products, and a variety of agencies are considering the implications of these technologies. The Consumer Product Safety Commission (CPSC) staff has announced plans to hold a public webinar on Tuesday, March 2, 2021, from 9am to 4pm, Eastern Standard Time (EST) to discuss the ramifications of AI and related technologies on consumer products from a consumer safety perspective.

CPSC staff is interested in exploring the best way to provide guidance to manufacturers and importers of consumer products that use AI and ML. Questions and issues to be discussed include:

  • Determining the presence of AI and ML in consumer products: Does the product have AI and ML components?
  • Differentiating what AI and ML functionality exists: What are the AI and ML capabilities?
  • Discerning how AI and ML dependencies affect consumers: Do AI and ML affect consumer product safety?
  • Distinguishing when AI and ML evolve and how this transformation changes outcomes: When do products evolve/transform, and do the evolutions/transformations affect product safety?
  • Relevant voluntary standards.

Those who wish to attend the forum should register by February 15, 2021.


By Sheila A. Millar and Tracy P. Marshall

Tapjoy, Inc., the operator of a mobile advertising platform that appears in certain mobile gaming applications, has settled Federal Trade Commission (FTC) allegations that the company deceived consumers by failing to provide them with promised rewards. Tapjoy’s platform allows mobile app users to interact with third-party advertisers and gain rewards, such as virtual currency, for completing certain tasks. In some cases, consumers pay real money and divulge personal information to earn the rewards.

In its complaint, the FTC not only charged that Tapjoy did not deliver the promised rewards, but also alleged that the company discouraged consumer complaints about the failure to pay rewards and did not respond to complaints. The FTC complaint refers to internal emails in which the company acknowledged that consumer complaints about unreceived rewards – in the hundreds of thousands – were “out of control.” In fact, the volume of complaints was so massive that, in 2017, Tapjoy allegedly made it difficult for consumers to submit complaints by blocking complaint submissions 24 hours after completion of an offer. In addition, until at least 2018, consumers who submitted complaints about not receiving virtual currency only had 72 hours to respond with proof that they had completed an offer or their complaint would be closed. Nonetheless, according to the FTC, Tapjoy continued to advertise prominently and falsely, without any qualification, that it would pay virtual rewards in exchange for the performance of advertised tasks.

Under the terms of the proposed Agreement Containing Consent Order, Tapjoy must clearly and conspicuously display the terms for receiving rewards. The Consent Agreement bars the company from expressly or implicitly misleading users about receiving rewards, including the requirements to receive rewards, when consumers will receive rewards, and any other material facts. In addition, the company must ensure that its third-party advertisers provide the promised rewards, investigate consumer complaints regarding nonpayment of rewards, and take action against advertisers that deceive consumers.

The vote to issue the proposed administrative complaint and to accept the Consent Agreement was 5-0. Commissioners Rohit Chopra and Rebecca Kelly Slaughter issued a joint statement in which they noted that mobile gaming is a fast-growing market in which revenues derive mainly from in-app purchases (including loot boxes, which they characterize as an “addictive phenomenon” that turn videogames into virtual casinos) and advertising. Against this background, advertising middlemen such as Tapjoy are “gatekeepers” that must be closely watched. Chopra and Slaughter commend the proposed settlement as reasonable to address Tapjoy’s practices, but they warn that “when it comes to addressing the deeper structural problems in this marketplace that threaten both gamers and developers, the Commission will need to use all of its tools – competition, consumer protection, and data protection – to combat middlemen mischief, including by the largest gaming gatekeepers.”