By Sheila A. Millar and Tracy P. Marshall

The long trudge toward final regulations implementing the California Consumer Privacy Act (CCPA) continues. In December 2020, the California Attorney General issued a fourth set of proposed regulations. These additions were approved by the California Office of Administrative Law (OAL) on March 15, 2021, and took effect immediately. Here are the key changes businesses should know about.

New “Do Not Sell” Icon

The new regulations offer a voluntary opt-out icon that may be used in addition to (but not in place of) posting the notice of a California consumer’s right to opt-out of the sale of personal information.

Businesses must post the notice of right to opt-out on the webpage that consumers are directed to after clicking on the “Do Not Sell My Personal Information” link on their homepage (or landing page/menu in the case of mobile apps).

Businesses Must Streamline the Opt-Out Request Process

Businesses must ensure that their notices of the right to opt-out use simple language, are easy for consumers to understand, and require minimal steps to complete. Businesses cannot require consumers to click through or listen to reasons why they should not submit a request to opt-out, provide personal information that is not necessary to implement the request, or search or scroll through a privacy policy, similar document, or webpage to submit a request to opt-out.

Offline Opt-Out Notices

Businesses that collect personal information from consumers offline must also inform consumers by an offline method of their right to opt-out, as follows:

  • Businesses that collect personal information from consumers in a physical location may inform consumers of their right to opt-out via paper forms or signage
  • Businesses may inform consumers of their right to opt-out during a phone call in which the business collects personal information

In both scenarios, businesses must tell consumers where to find the opt-out information online.

Authorized Agents

California residents are permitted to use authorized agents to submit requests to know or to delete their personal information. The new regulations clarify that businesses may require consumers to prove that an agent has permission to submit the request and to verify their own identity directly with the business.

California Privacy Protection Agency Board Appointments

While the state continues to fine-tune the CCPA regulations – and application of the CCPA to employee information remains deferred until 2022 – the clock is already ticking on the newest iteration of California’s privacy law, the California Privacy Rights Act (CPRA). Although CPRA does not take effect until 2023, the ballot initiative directed establishment of the California Privacy Protection Agency (CPPA) in advance of the effective date. Governor Gavin Newsom, in conjunction with state officials, has appointed the first slate of CPPA members.

With the enactment of the Virginia Consumer Data Protection Act, and with other states also considering privacy legislation, the U.S. landscape is quickly becoming more confusing for consumers and businesses alike.

By Sheila A. Millar and Tracy P. Marshall

As Congress remains locked in a stalemate over the terms of a comprehensive federal privacy law, states continue to forge ahead. Following California, Virginia is the second U.S. state to enact its own comprehensive privacy law governing the collection and use of personal data. Governor Ralph Northam signed the Virginia Consumer Data Protection Act (CDPA) into law on March 2, 2021.

The CDPA applies to businesses that operate in Virginia or produce products or services that are targeted to Virginia residents, and (1) in any calendar year, control or process personal data of at least 100,000 Virginia residents, or (2) control or process personal data of at least 25,000 Virginia residents and derive more than 50% of gross revenue from the sale of personal data.

Concepts in the bill draw from other laws, such as the EU General Data Protection Regulation (GDPR), but the bill also takes some pragmatic approaches designed to enhance privacy and to align with other laws in a manner that businesses can operationalize. Importantly, the CDPA does not authorize a private right of action.

Key Definitions

The CDPA provides several rights to “consumers,” defined as Virginia residents acting in an individual or household context, and not individuals acting in a commercial or employment context. The CDPA appears to borrow some of its terminology from the GDPR, namely, the terms “controller” (defined as “the natural or legal person that, alone or jointly with others, determines the purpose and means of processing personal data”), “processor” (defined as “a natural or legal entity that processes personal data on behalf of a controller”), and “personal data” (defined as “any information that is linked or reasonably linkable to an identified or identifiable natural person,” but excluding de-identified and publicly available information).

Consumer Rights

The CDPA grants consumers the right, subject to verification of their identity, to access, correct, delete, or obtain a copy of personal data, and the right to opt out of (1) the processing of personal data for the purposes of targeted advertising, (2) the sale of personal data, or (3) profiling “in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” Businesses as controllers are prohibited from discriminating against consumers for exercising their rights, but with some exceptions, such as offers in connection with loyalty, rewards, club card, and similar programs.

The CDPA allows for a parent or legal guardian to invoke these rights on behalf of a child. The term “child” is defined as an individual under 13, which aligns with the Children’s Online Privacy Protection Act (COPPA). Parental consent rights for the collection, processing, and sale of children’s personal data also are consistent with COPPA.

Business Obligations

In addition to responding to consumer requests described above, any business subject to the CDPA as a controller must provide a privacy notice that describes the categories of personal data processed, the purposes for processing data, how consumers can exercise their rights, the categories of personal data shared with third parties, the categories of third parties with whom personal data is shared, and how consumers can opt out of the sale of personal data to third parties or the processing of personal data for targeted advertising (if applicable). Controllers are also required to follow data minimization principles and to establish, implement, and maintain reasonable security practices to protect personal data.

Processors are required to assist controllers in meeting their obligations under the CDPA, and controllers must have contracts in place with processors that impose specific requirements, as set forth in the CDPA.

The CDPA also requires controllers to obtain consent before collecting and processing “sensitive data,” which includes personal data collected from a known child. However, the CDPA is drafted in a manner that avoids a possible conflict with COPPA; it prohibits processing of sensitive data concerning a known child unless the processing is in accordance with COPPA. This approach preserves COPPA’s commonsense exceptions to parental consent and the “sliding scale” options for obtaining it, as well as the important “support for internal operations” exception under COPPA.

Similar to the GDPR, the CDPA requires controllers to conduct and document a data protection assessment when processing personal data for targeted advertising, selling personal data, processing personal data for profiling purposes, processing sensitive data, or engaging in processing activities that present a heightened risk of harm to consumers. Importantly, the bill takes a practical approach, establishing that a single assessment may address “a comparable set of processing operations that include similar activities,” and that assessments conducted to comply with other laws may satisfy this requirement if they have a reasonably comparable scope and effect. The CDPA does not require businesses to conduct audits.

Enforcement

The Virginia Attorney General has exclusive authority to enforce the CDPA; there is no private right of action. Civil penalties of up to $7,500 may be imposed for each violation of the Act.

The CDPA will take effect on January 1, 2023. The CDPA model merits strong consideration by other U.S. jurisdictions considering comprehensive privacy legislation. But the real solution for consumers and businesses is, of course, a thoughtful federal privacy policy that preempts state law and not a patchwork of different state requirements.

By Sheila A. Millar and Tracy P. Marshall

The Federal Trade Commission (FTC) has issued orders to five e-cigarette manufacturers (JUUL Labs, Inc., R.J. Reynolds Vapor Company, Fontem US, LLC, Logic Technology Development LLC, and NJOY, LLC) seeking information about the companies’ 2019 and 2020 sales, advertising, and promotions. The FTC sent similar orders to the same companies in October 2019 seeking information for prior years as part of an ongoing FTC study of the rapidly expanding U.S. e-cigarette market.

The new compulsory orders request detailed information about flavors, the specific form of nicotine used in each product, and sales and giveaways for each brand; product placements, websites, and social media accounts used to advertise, promote, or sell e-cigarette products; marketing and advertising expenditures for social media and other campaigns; promotional events (including those held on college campuses); the use of influencers and brand ambassadors; and other advertising matters. Any company that has received an FTC compulsory order knows first-hand how time-consuming it can be to respond. Responses are due no later than May 12, 2021.

Both the FTC and the Food and Drug Administration (FDA) have been active in reviewing vaping companies’ advertising practices. In 2019, the agencies issued warning letters to four e-liquid manufacturers that used influencers to promote their products because the influencers failed to include the FDA-required nicotine warning. The FTC took the opportunity to remind companies of their obligation to ensure that their influencers clearly and conspicuously disclose their relationships to the brands when promoting or endorsing products, as required by the FTC’s Endorsement Guides.

In addition to actions by the FTC and FDA, social media sites have updated their guidelines over the years to address advertisements and posts pertaining to e-cigarettes and other regulated products. Thus, there are a host of regulations and guidelines for vaping companies to consider when promoting their brands and products online and hiring or encouraging others to help spread the word.

By Sheila A. Millar and Tracy P. Marshall

On April 29, 2021, the Federal Trade Commission (FTC) will host a virtual public workshop to examine the nature and effects of “dark patterns” on online user behavior. “Bringing Dark Patterns to Light: An FTC Workshop” is expected to explore ways in which user interfaces can have the effect, intentionally or unintentionally, of obscuring, subverting, or impairing consumer autonomy, decision-making, or choice.

Topics include:

  • How dark patterns differ from sales tactics employed by brick-and-mortar stores
  • How they affect consumer behavior, including potential harms
  • Whether some groups of consumers are unfairly targeted or are especially vulnerable
  • What laws, rules, and norms regulate the use of dark patterns
  • Whether additional rules, standards, or enforcement efforts are needed to protect consumers

Suggestions for discussion topics and requests for panelists should be sent to darkpatterns@ftc.gov by March 15, 2021.

The FTC will publish a formal request for comments on dark patterns soon; comments must be submitted to darkpatterns@ftc.gov by June 29, 2021.

As we indicated in an earlier post, the Consumer Product Safety Commission (CPSC) is planning to hold a webinar on March 2, 2021, to explore how the use of Artificial Intelligence (AI) and Machine Learning (ML) in consumer products potentially affects consumer product safety.

By Sheila A. Millar and Tracy P. Marshall

The proliferation of mobile devices and digital media allows consumers to take, post, and store more photos and videos than ever before. Since 2015, app developer Everalbum has operated the mobile app, Ever, which offers a means for users to store photos and videos on the company’s cloud servers. Everalbum told users they could deactivate their accounts at any time and the company would delete their images. According to a complaint from the Federal Trade Commission (FTC), however, despite assurances to the contrary, the company retained users’ photos and videos after they deactivated their accounts. In addition, Everalbum did more with users’ photos and videos than store them; it used them to create facial recognition technology without permission. The FTC alleged that these practices constituted unfair or deceptive acts or practices, in violation of Section 5(a) of the FTC Act.

Everalbum launched a feature called “Friends” that uses facial recognition technology to tag people in group photos. The app sent pop-up messages to Ever users in Texas, Illinois, Washington, and the European Union – jurisdictions with biometric laws in place – and provided an option to use facial recognition. For users who did not affirmatively consent, Everalbum disabled the “Friends” facial recognition feature. Users in other jurisdictions had no way to disable the facial recognition tool, which was activated by default.

Everalbum also assured users that it would delete their images if they deactivated their accounts. But according to the FTC, until at least October 2019, the company failed to do so. In addition, between September 2017 and August 2019, Everalbum allegedly combined the photos and videos it retained with millions of photos obtained from public sources and used the resulting datasets to develop facial recognition services, which it sold to its business customers and used to develop the Ever app.

Under the terms of the proposed Agreement Containing Consent Order with the FTC, Everalbum must delete (1) all photos and videos of Ever customers who deactivated their accounts, (2) all data developed from images of Ever users who did not consent to the use of their images for facial recognition purposes, and (3) any facial recognition models or algorithms developed from Ever users’ photos or videos without explicit consent.

The Commission voted 5-0 to issue the proposed administrative complaint and accept the consent agreement. Commissioner Rohit Chopra issued a separate statement in which he voiced concerns over the use of facial recognition technology, believing it to be “fundamentally flawed.” He stressed the importance of state biometric laws, noting that “Everalbum took greater care when it came to these individuals in these states. The company’s deception targeted Americans who live in states with no specific state law protections. With the tsunami of data being collected on individuals, we need all hands on deck to keep these companies in check.”

The Everalbum settlement signals that the FTC is keeping close watch on how companies that use biometric technology handle consumer data.

By Sheila A. Millar

Artificial Intelligence (AI), Machine Learning (ML), and related technologies have the potential to dramatically change the nature of consumer products, and a variety of agencies are considering the implications of these technologies. The Consumer Product Safety Commission (CPSC) staff has announced plans to hold a public webinar on Tuesday, March 2, 2021, from 9am to 4pm, Eastern Standard Time (EST) to discuss the ramifications of AI and related technologies on consumer products from a consumer safety perspective.

CPSC staff is interested in exploring the best way to provide guidance to manufacturers and importers of consumer products that use AI and ML. Questions and issues to be discussed include:

  • Determining the presence of AI and ML in consumer products: Does the product have AI and ML components?
  • Differentiating what AI and ML functionality exists: What are the AI and ML capabilities?
  • Discerning how AI and ML dependencies affect consumers: Do AI and ML affect consumer product safety?
  • Distinguishing when AI and ML evolve and how this transformation changes outcomes: When do products evolve/transform, and do the evolutions/transformations affect product safety?
  • Relevant voluntary standards.

Those who wish to attend the webinar should register by February 15, 2021. The link for registration is here.


By Sheila A. Millar and Tracy P. Marshall

Tapjoy, Inc., the operator of a mobile advertising platform that appears in certain mobile gaming applications, has settled Federal Trade Commission (FTC) allegations that the company deceived consumers by failing to provide them with promised rewards. Tapjoy’s platform allows mobile app users to interact with third-party advertisers and gain rewards, such as virtual currency, for completing certain tasks. In some cases, consumers pay real money and divulge personal information to earn the rewards.

In its complaint, the FTC not only charged that Tapjoy did not deliver the promised rewards, but also alleged that the company discouraged consumer complaints about the failure to pay rewards and did not respond to complaints. The FTC complaint refers to internal emails in which the company acknowledged that consumer complaints about unreceived rewards – in the hundreds of thousands – were “out of control.” In fact, the volume of complaints was so massive that, in 2017, Tapjoy allegedly made it difficult for consumers to submit complaints by blocking complaint submissions 24 hours after completion of an offer. In addition, until at least 2018, consumers who submitted complaints about not receiving virtual currency only had 72 hours to respond with proof that they had completed an offer or their complaint would be closed. Nonetheless, according to the FTC, Tapjoy continued to advertise prominently and falsely, without any qualification, that it would pay virtual rewards in exchange for the performance of advertised tasks.

Under the terms of the proposed Agreement Containing Consent Order, Tapjoy must clearly and conspicuously display the terms for receiving rewards. The Consent Agreement bars the company from expressly or implicitly misleading users about receiving rewards, including the requirements to receive rewards, when consumers will receive rewards, and any other material facts. In addition, the company must ensure that its third-party advertisers provide the promised rewards, investigate consumer complaints regarding nonpayment of rewards, and take action against advertisers that deceive consumers.

The vote to issue the proposed administrative complaint and to accept the Consent Agreement was 5-0. Commissioners Rohit Chopra and Rebecca Kelly Slaughter issued a joint statement in which they noted that mobile gaming is a fast-growing market in which revenues derive mainly from in-app purchases (including loot boxes, which they characterize as an “addictive phenomenon” that turns videogames into virtual casinos) and advertising. Against this background, advertising middlemen such as Tapjoy are “gatekeepers” that must be closely watched. Chopra and Slaughter commended the proposed settlement as reasonable to address Tapjoy’s practices, but they warned that “when it comes to addressing the deeper structural problems in this marketplace that threaten both gamers and developers, the Commission will need to use all of its tools – competition, consumer protection, and data protection – to combat middlemen mischief, including by the largest gaming gatekeepers.”

By Sheila A. Millar and Jean-Cyril Walker

On December 22, 2020, the Federal Trade Commission (FTC) announced adoption of a final rule requiring the use of EnergyGuide labels on portable air conditioners (ACs). Effective October 1, 2022, portable AC manufacturers must attach yellow EnergyGuide labels to the principal display panel of their packaging and include an image of the required label on websites and in catalogs advertising the product.

The FTC initially proposed that the labeling requirement go into effect on January 10, 2025, the same day new Department of Energy (DOE) efficiency standards for portable ACs take effect. Given that these products are increasingly common in the marketplace, exhibit a wide range of energy efficiency and energy costs across similarly sized units, and sometimes consume more energy than currently labeled room air conditioners, the FTC decided that consumers would benefit from moving the effective date up to October 1, 2022.

The final amendments also update the energy efficiency ratings used in 16 C.F.R. Part 305 for central AC units from “Seasonal Energy Efficiency Ratio (SEER)” to “Seasonal Energy Efficiency Ratio 2 (SEER2).” A new ratings methodology goes into effect on January 1, 2023, and Part 305 will be consistent with this change. Manufacturers may begin to use the new terminology before then, provided that the represented energy efficiencies comply with the minimum requirements going into effect in 2023.

The Commission considered but ultimately decided not to pursue broader changes to the Energy Labeling Rule, such as a transition to electronic labeling, at this time. The FTC may seek further input on such changes at a later date, after it has had an opportunity to gather information sufficient to support significant changes to the entire rule. The vote in favor of publishing the notice in the Federal Register was 4-1. Commissioner Christine S. Wilson voted no and issued a dissenting statement in which she expressed concern that the final changes to the Rule do not remove prescriptive aspects that she believed were an impediment to competition. Wilson called for a full review of the Rule “to consider removing all dated and prescriptive provisions, and to consider the recent comments suggesting changes. Nothing prevents the Commission from conducting this review now – we do not have to wait until the 10-year anniversary.”

Commissioner Rohit Chopra also issued a separate statement in which he commended the Commission for “finalizing a rule that will help to reduce the long-term burden of high energy bills on low-income families, promote greater energy efficiency, reduce carbon emissions from residential housing,” and for moving up the compliance date, which he believes would result in significant consumer savings in energy costs.

By Sheila A. Millar and Tracy P. Marshall

Third-party service providers are vital to many companies, handling a wide range of business activities essential for companies to deliver their own offerings. But a company is not adequately protecting consumers if it fails to perform proper due diligence on service providers and to contractually require them to employ appropriate security measures to protect sensitive personal information, as Ascension Data & Analytics, LLC (Ascension) discovered. Ascension, a data analytics company serving the mortgage industry, recently settled with the Federal Trade Commission (FTC) over charges that it violated the Gramm-Leach-Bliley (GLB) Act Safeguards Rule, as well as its own policies, when it neglected to vet the data security practices of a service provider and to require the vendor to adequately protect personal information of mortgage holders. While the settlement involves a financial institution subject to the GLB Act, it is instructive for all businesses that maintain consumers’ personal information and share it with third parties.

The GLB Act governs a range of business activities by “financial institutions” (a term that is broadly defined to include many types of companies), including lending, stockbroking and investing, banking, insuring, and providing financial advisory services. Under the GLB Act Safeguards Rule, all covered entities must develop, implement, and maintain a comprehensive, written information security program that contains administrative, technical, and physical safeguards appropriate to the size, complexity, nature, and scope of the company and the sensitivity of the personal information collected. In addition, they are required to ensure that third-party service providers can maintain appropriate safeguards to protect consumers’ personal information and are contractually bound to do so.

The FTC’s complaint alleged that Ascension hired a vendor, OpticsML, to process tens of thousands of mortgage documents that contained personal information of more than 60,000 consumers, including names, dates of birth, Social Security numbers, loan information, credit and debit account numbers, drivers’ license numbers, credit files, and other financial information. According to the complaint, Ascension failed to review OpticsML’s security practices before providing OpticsML with documents containing sensitive personal information, which OpticsML stored on a cloud-based server without adequate security measures. As a result, sensitive personal information was accessible to unauthorized persons for about one year.

The proposed settlement requires Ascension to establish, implement, and maintain a comprehensive data security program overseen by a designated employee, undergo biennial security assessments by an independent entity, and provide an annual certification by a senior executive that the company is complying with the FTC’s order. The settlement serves as a reminder for businesses in all industries, and not just financial institutions, of the importance of (1) implementing and maintaining written security programs, (2) regularly reviewing the procedures and ensuring that appropriate personnel are aware of the requirements, and (3) ensuring that service providers have appropriate security programs and measures in place before sharing personal information with them. All businesses should keep abreast of the rapidly developing privacy and data security landscape and their obligations under federal and state laws.

By Sheila A. Millar and Jean-Cyril Walker

For the second time since 2016, glue producer Chemence, Inc. (Chemence) has found itself at odds with the Federal Trade Commission (FTC) over allegedly deceptive claims that its products are American made. This time, it cost the company $1.2 million – the highest settlement amount ever paid in a “Made in USA” case.

In 2016, the FTC charged Chemence, among several other glue manufacturers,[1] with making unqualified, deceptive country-of-origin claims about their cyanoacrylate superglue products such as Kwifix, Krylex, and Hammer Tite, including labeling them “Made in USA” and “Proudly Made in USA” and using images of the American flag on product packaging. According to the 2016 FTC complaint, a significant proportion of the costs of the chemical components in the glues came from imported chemicals. Further, the FTC alleged that Chemence induced sellers to deceive unwitting consumers by providing them with “Made in USA” promotional materials for the products. The Stipulated Court Order against Chemence fined the company $220,000 and prohibited it from representing, expressly or by implication (including in labeling and advertising), that its products were USA made unless it could show that the product’s final assembly or processing occurred in the United States, that all significant processing occurred in the United States, and that all or virtually all ingredients or components of the product were made and sourced in the United States. Otherwise, Chemence was required to make a “clear and conspicuous qualification [which] appears immediately adjacent to the representation that accurately conveys the extent to which the product contains foreign parts, ingredients, and/or processing.” The order also required the company to submit a compliance report to the FTC one year after the order.

But according to the FTC’s 2020 complaint, Chemence and its president, James Cooke, continued to sell the company’s glue products with “Made in USA” labels using the same foreign-sourced ingredients with no qualifying language, in violation of the 2016 Order. The FTC also asserts that in 2017 Cooke falsely claimed in the company’s annual compliance report that the company had relabeled its glue products to reflect that they are made with globally sourced materials.

The terms of the 2020 proposed settlement agreement bar Chemence and Cooke from making country-of-origin claims they cannot substantiate. Moreover, the agreement bans them from making any unqualified “Made in USA” claims unless they can show that the product’s final assembly and all significant processing occur in the U.S. and that all or virtually all ingredients or components of the product are made and sourced in the United States. Qualified “Made in USA” claims must clearly and conspicuously disclose how much of a product contains foreign parts, ingredients or components, and/or processing. To support claims that a product is assembled in the U.S., they must demonstrate that the product is substantially transformed in the United States, that its principal assembly takes place in the United States, and that United States assembly operations are substantial. The company must also submit yearly compliance reports and inform all sellers that purchased the company’s glue products labeled as USA-made that the products contain imported materials.

The Commission vote to issue the complaint and accept the proposed consent order was 5-0. Commissioner Rohit Chopra issued a statement in which he applauded the Commission’s sanctions against Chemence and its president as “real consequences” and “another step forward in protecting the Made in USA brand and restoring the Commission’s law enforcement credibility.”

Many customers who want to support domestic industries look for “American Made” claims or symbols. Knowing this, the FTC continues to pursue “Made in USA” claims aggressively, and has a pending rulemaking on the topic. Those who are caught once making false U.S. origin claims and continue to make them face significant consequences, as the record-breaking penalty imposed on Chemence and Cooke demonstrates.

Many companies find the difference between country-of-origin rules imposed by U.S. Customs and Border Protection and the FTC’s “Made in USA” guidance to be confusing, and, as we have previously noted, the landscape is complicated still more by California’s law on U.S. origin claims.[2] But businesses wishing to advertise their products as American made would be well advised to familiarize themselves with the FTC’s Enforcement Policy Statement on U.S. Origin Claims to keep their “Made in USA” claims from coming unglued.

[1] See https://www.ftc.gov/news-events/blogs/business-blog/2016/02/ftc-challenges-companys-made-usa-claims.

[2] Cal. Bus. & Prof. Code § 17533.7