Consumer Protection Connection

UK ICO Finalizes Rules for Children’s Content

Posted in Children, Privacy

The UK Information Commissioner’s Office (ICO) recently finalized its Age-appropriate design: a code of practice for online services (the code). The code applies to any “relevant information society services which are likely to be accessed by children” (by which the ICO means minors under age 18), whether designed for kids or general audiences. The new version makes few significant changes from the consultation draft circulated in May 2019. The ICO added a 12-month transition period and issued industry-specific guidance for media companies; however, most of the substance of the code remains the same. It calls on companies to adopt a risk-based and proportionate approach to age verification and to determine whether their services are “likely to be accessed by children.” While the finalized code offers examples of how a business might ascertain age and whether minors are likely to visit a website or service, it fails to provide a specific, workable definition of “likely to be accessed by children” or technical guidance. The code is not a law, but “it sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services.”

The updated code still defines “children” as minors under 18, citing the UN Convention on the Rights of the Child. It requires that the best interests of the child be foremost when processing personal data of children. Companies must adhere to 15 new standards, starting with privacy-by-design. The code directs businesses to carry out data protection impact assessments, apply data minimization principles, and avoid “nudge” techniques. The initial draft described “nudge” techniques broadly, generating strong criticism that the ICO was straying into advertising issues outside its purview; the final version clarifies that the focus is on nudge techniques that encourage children to disclose unnecessary personal data or to weaken or turn off privacy controls. Default settings for services should be “high privacy,” and geolocation tracking and profiling should be given a default setting of “off.”

The notion that all minors should be treated like children is problematic, reflecting a lack of real understanding of the developmental differences between kids, tweens, and teens. Even more onerous from an implementation standpoint are the obligations to provide very different and specific types of notices depending on the age of the “child.” For digital services targeted to different age ranges, the operational burden will be significant, especially considering the small screen sizes of mobile devices. More worrying still, the code may force businesses to collect more, not less, data about a child, and specifically to collect and retain data about a user’s age in circumstances where doing so is discouraged or not permitted under other laws like the U.S. Children’s Online Privacy Protection Act (COPPA).

The code departs from existing, accepted definitions of a “child” reflected in privacy, advertising, and product safety laws. For example, COPPA applies to operators of websites or online services that are either directed to children under 13 or have actual knowledge that they are collecting personal information online from a child under 13. COPPA does not require operators to guess whether kids might visit a site not designed with them in mind. Such sites are expected to assume that visitors are under 13 rather than collect and retain birthdates. And COPPA does not obligate general audience sites, such as e-commerce sites, to seek out age information. Similarly, the U.S. Consumer Product Safety Improvement Act (CPSIA) defines a “children’s product” as one designed and intended primarily for children 12 and younger. Defining “children” to include all minors is likewise inconsistent with decades of child development research on advertising to children, which generally defines children as around age 12. Defining a child as anyone under 18 is also inconsistent with Article 8.1 of the EU General Data Protection Regulation (GDPR), which imposes a default age of 16 but allows member states to set the age of a child between 13 and 16. (Ironically, the UK set its GDPR age of consent at 13.) The International Chamber of Commerce Marketing and Advertising Reference Guide on Advertising to Children provides useful background on why it makes sense to distinguish between children and teens for advertising and privacy purposes.

While the code does not have the force of law, it is persuasive in ICO and court determinations and will be a key measure of compliance with the UK Privacy and Electronic Communications Regulations and the GDPR. And, as under the GDPR, penalties can reach £17 million or 4% of global turnover. Businesses that fail to comply with the code therefore could face added scrutiny from the ICO, leaving them potentially vulnerable to punitive fines. If approved by Parliament, the code is expected to take effect in 2021.

Unfortunately, despite statements about the code’s necessity and achievability, operationalizing its standards will be enormously difficult, and the extent to which it will actually enhance children’s privacy is questionable. Nevertheless, the Irish Data Protection Commission (DPC), which has been conducting its own consultation on children’s privacy, may consider similar approaches.

The code presents some conflicts for global businesses that have applied COPPA as the gold standard for children’s privacy protection. And while merely making a digital service available to UK or international visitors is likely not enough to trigger application of the code, businesses may choose to geo-gate their services and block UK visitors instead. As more countries adopt additional prescriptive requirements and guidance on privacy, the possibility of conflicts and inconsistencies is real, creating a confusing landscape for consumers and businesses alike.

FTC Seeks Comments on Revamping its Endorsement Guides

Posted in Advertising

At a time when influencers are making a living – and sometimes millions of dollars – for promoting everything from eye shadow to the latest smartphone, the Federal Trade Commission (FTC) is reassessing its Guides Concerning the Use of Endorsements and Testimonials in Advertising (the Guides). The Guides provide direction to businesses that use influencers and endorsers on when and how to make disclosures concerning a “material connection” or commercial relationship between the advertiser and influencer.

The Guides were first issued in 1980. The FTC amended the Guides in 2009 to include new requirements for influencers to disclose material connections – whether in the form of cash, free products, or other consideration – with companies whose products or services they recommend. But in 2009, the FTC could not have predicted the massive growth of global platforms such as YouTube and Instagram, where some influencers have millions of followers. The FTC is now seeking public comments on a range of issues, including:

  • whether the practices addressed by the Guides are prevalent in the marketplace and whether the Guides are effective at addressing those practices;
  • whether consumers have benefitted from the Guides and what impact, if any, the Guides have had on the flow of truthful information to consumers;
  • whether the FTC’s guidance document, The FTC’s Endorsement Guides: What People Are Asking, should be incorporated into the Guides;
  • how well advertisers and endorsers are disclosing unexpected material connections in social media;
  • whether children are capable of understanding disclosures of material connections and how those disclosures might affect children;
  • whether incentives like free or discounted products bias consumer reviews, even when a favorable review is not required to receive the incentive, and whether or how such incentives should be disclosed;
  • whether composite ratings that include reviews based on incentives are misleading, even when reviewers disclose incentives in the underlying reviews;
  • whether the Guides should address the use of affiliate links by endorsers; and
  • what disclosures, if any, advertisers or operators of review sites should make about the collection and publication of reviews to prevent them from being deceptive or unfair.

FTC Commissioner Rohit Chopra issued a separate statement in which he called for the FTC to perform a “self-critical analysis of the agency’s enforcement approach” and to focus on advertisers, not small influencers. He expressed a hope that after reviewing the comments, the Commission would consider going beyond the Guides by: (1) adopting requirements for technology platforms that facilitate and either directly or indirectly profit from influencer marketing; (2) codifying elements of the existing Guides into formal rules to allow for imposition of civil penalties; and (3) specifying the requirements that companies must adhere to in their contractual arrangements with influencers.

Interested parties should submit comments within 60 days of publication of the Request for Comments in the Federal Register, which is expected soon.

Boaz Green Authored Expert Spotlight Article, “CPSC Increases Focus on Regulatory Violations”

Posted in Product Safety

Keller and Heckman Counsel Boaz Green’s article, “CPSC Increases Focus on Regulatory Violations,” was featured in an Expert Spotlight published by Stericycle. The article discusses the Consumer Product Safety Commission’s (CPSC) growing focus on regulatory violations and the rising number of recalls of regulated products. Companies that import regulated products must also contend with CPSC inspections at the port, where the agency has been more aggressively demanding seizure and destruction of imported products it finds to be non-compliant. The article provides advice for improving regulatory compliance and reducing the risk of a recall or port detention.

To read the full article, click here.

Maker of Purell Draws FDA Warning and Lawsuit for Hand Sanitizer Disease Prevention Claims

Posted in Advertising

GOJO Industries, the maker of Purell hand sanitizer, needs to clean up its advertising act, according to the U.S. Food and Drug Administration (FDA). The FDA sent GOJO a warning letter on January 17, 2020, directing the company to stop making unsubstantiated claims about its hand sanitizers that give consumers the impression they are pharmaceutical products. According to the letter, GOJO’s statements about the products’ efficacy against Ebola, norovirus, influenza, absenteeism, and the common cold indicate that the products are being marketed as drugs without FDA approval. The agency cautions that failure to promptly correct the violations may result in legal action, including seizure and injunction.

On the heels of the FDA warning letter, a class action was filed against GOJO in New York federal court over claims that Purell products could prevent the spread of everything from the flu to Ebola. Even as the world braces for a possible pandemic of the Wuhan coronavirus, GOJO has assured consumers that Purell products can stem the spread of viral diseases, with marketing and packaging statements such as “Kills more than 99.99% of most common germs that may cause illness in a healthcare setting, including MRSA & VRE.” GOJO also stated that its products were “proven to reduce absenteeism” among students and teachers and that a “recent outcome study shows that providing the right products, in a customized solution, along with educational resources for athletes and staff can reduce MRSA and VRE by 100%.”

The plaintiffs charge that Purell’s marketing statements were misleading and unfair because they gave “the impression to consumers the products are effective at preventing colds, flu, absenteeism and promoting bodily health and increased academic achievement.” Both the FDA and the plaintiffs assert that GOJO cannot substantiate its health claims; in the words of the complaint, “no topical antiseptic products have ever been able to achieve the results defendant advertises.” The plaintiffs further assert that consumers relied on GOJO’s health claims when deciding to buy products to help protect their health.

Companies that either expressly state or imply that over-the-counter products can “reduce the risk of spread” of any disease or condition are likely to attract both regulatory scrutiny and the attention of class action attorneys. The lawsuit and FDA warning letter to GOJO serve as reminders that businesses should carefully craft advertising claims and back them up with competent and reliable scientific evidence.

NIST Solicits Comments on Revised Draft IoT Cybersecurity Device Guidance

Posted in Privacy

On January 7, 2020, the National Institute of Standards and Technology (NIST) released a draft of revised cybersecurity recommendations for IoT devices at both the pre-market and post-market stages. NISTIR 8259, Recommendations for IoT Device Manufacturers: Foundational Activities and Core Device Cybersecurity Capability Baseline, identifies six voluntary steps manufacturers should take to account for security throughout a connected device’s lifecycle. It builds on the agency’s initial IoT guidance released last June, NISTIR 8228, Considerations for Managing Internet of Things (IoT) Cybersecurity and Privacy Risks. Comments on the revised draft are due by February 7, 2020.

NIST explains that the IoT devices in scope for this publication have at least one transducer (sensor or actuator) for interacting directly with the physical world and at least one network interface (e.g., Ethernet, Wi-Fi, Bluetooth, Long Term Evolution [LTE], Zigbee, Ultra-Wideband [UWB]) for interfacing with the digital world.

The draft recommends that manufacturers take four pre-market steps:

  1. Identify expected customers and define expected use cases;
  2. Research customer cybersecurity goals, including device identification, device configuration, data protection, logical access to interfaces, and software and firmware updating;
  3. Determine how to address customer goals; and
  4. Plan for adequate support of customer goals.

NIST advises two additional post-market steps:

  1. Define approaches for communicating to customers; and
  2. Decide what to communicate and how to do it.

NIST recommends that manufacturers consider: cybersecurity risk-related assumptions made during design and development; support and lifespan expectations; the cybersecurity capabilities that a device or manufacturer provides; device composition and capabilities, such as information about the device’s software, firmware, hardware, services, functions, and data types; software and firmware updates; and end-of-life or retirement options. Many of NIST’s recommendations may also help IoT device manufacturers assess security measures related to the safety of a connected consumer product and its operation.

The EU Advocate General Opinion is Out: Standard Contractual Clauses are Valid

Posted in Privacy

Businesses that rely on standard contractual clauses (SCCs) to transfer personal data outside the European Economic Area (EEA) just got good news. The long-awaited opinion from the EU Advocate General (AG) is here: SCCs are valid. The AG’s opinion, although non-binding, is significant for the case brought by Austrian privacy activist Max Schrems against Facebook, currently before the Court of Justice of the European Union (CJEU), as the CJEU generally follows the AG’s reasoning in its decisions.

By way of background, in 2010 the European Commission issued Decision 2010/87, which adopted a model set of SCCs. In total, three sets of SCCs establish contractual terms intended to protect data transfers from the EEA to certain other countries, including the U.S.: two versions apply to transfers from the EEA to data controllers outside the EEA, and one applies to transfers from the EEA to data processors outside the EEA.

Under the General Data Protection Regulation (GDPR) (like Directive 95/46/EC, which preceded it), personal data may only be transferred out of the EEA to a third country if that country ensures an adequate level of data protection. Schrems previously challenged the former U.S.-EU Safe Harbor, resulting in a determination that it did not assure adequate protection. The Safe Harbor was then replaced by the current EU-U.S. Privacy Shield. SCCs, the Privacy Shield, and binding corporate rules (BCRs) are currently recognized as options to assure adequacy. In this latest challenge, Schrems argued that Facebook’s SCCs were inadequate and that SCCs in general offered insufficient protection for data transfers from the EEA to the U.S. Schrems requested that SCCs be suspended; the matter was then referred to the CJEU.

In evaluating Decision 2010/87, the AG concluded that the fact SCCs are not binding on authorities in third countries “does not in itself render that decision invalid.” The opinion goes on to state:

The compatibility of Decision 2010/87 with the Charter depends on whether there are sufficiently sound mechanisms to ensure that transfers based on the standard contractual clauses are suspended or prohibited where those clauses are breached or impossible to honour … that is the case in so far as there is an obligation — placed on the data controllers and, where the latter fail to act, on the supervisory authorities — to suspend or prohibit a transfer when, because of a conflict between the obligations arising under the standard clauses and those imposed by the law of the third country of destination, those clauses cannot be complied with … the analysis of the questions has disclosed nothing to affect the validity of Decision 2010/87.

As the AG noted, the current case does not require the CJEU to rule on the lawfulness of the EU-U.S. Privacy Shield framework, which is a separate mechanism for transferring data outside the EEA. Nonetheless, the AG expressed sympathy with a separate argument by Schrems that the Privacy Shield does not offer sufficient safeguards “in the light of the right to respect for private life and the right to an effective remedy.”

These EU decisions are relevant to the current discussion about what a possible framework for federal privacy legislation should look like. As debates about privacy continue, it will be important for policymakers to remember that requirements imposed on businesses to protect key individual privacy rights must be balanced by considering the extent of possible harm to consumers, economic efficiency, innovation, and burdens to all participants in the ecosystem.

FTC Gives Energy Labeling Rule a Facelift

Posted in Product Safety

The Federal Trade Commission (FTC)’s Energy Labeling Rule has a new look. Following a public comment period, the FTC issued amendments to the Energy Labeling Rule that reorganize the Rule’s product descriptions and categories to make them clearer and simpler for stakeholders to understand and apply. But the FTC’s changes are cosmetic – the agency made no substantive changes to the Rule.

The Rule requires manufacturers to attach yellow EnergyGuide labels to many home appliances and electrical products and prohibits retailers from removing these labels or rendering them illegible. It also directs sellers to post label information on websites and in paper catalogs.

The amendments divide the covered products list into four different groups organized by general product category to make it easier for stakeholders to identify relevant covered products, particularly for categories that contain different product types and exemptions, such as lighting. They also separate labeling requirements into seven sections: one for general layout and formatting requirements and six additional sections containing stand-alone label content requirements for refrigerator products, clothes washers, dishwashers, water heaters, room air conditioners, and pool heaters. Finally, the amendments remove obsolete references and correct minor errors.

The Commission approved publication of the final amendments by a vote of 4-1. Commissioner Christine S. Wilson issued a dissent in which she argued that some of the requirements in the amended Rule were unnecessarily exacting, and that the Commission should consider conducting “a comprehensive review of this Rule with a deregulatory mindset.”

The changes become effective on November 29, 2019.

FTC Publishes Practical Guidance for Influencers

Posted in Advertising

From beauty gurus on Instagram to product reviewers on YouTube, influencers are big business for brands. However, the intentions aren’t always clear when reading the advice of a celebrity fitness trainer who was paid for his endorsement or watching a video of a fashionista who just received a new wardrobe from the clothing company she is promoting. To help clarify when and how influencers need to make disclosures, the Federal Trade Commission (FTC) released Disclosures 101 for Social Media Influencers, a new guide intended to supplement the agency’s Endorsement and Testimonial Guides and 2017 Q&A on endorsements.

The guide and its accompanying video advise on disclosure language, how to disclose in different types of media, and avoiding dishonest claims. They also make important points for companies, such as recognizing that financial relationships are not limited to money and not assuming that social media followers are familiar with a company’s brand relationships.

The FTC has taken action against a number of companies over the last year for inadequate disclosures and posting false reviews, including snack box delivery service Urthbox and supplement manufacturer Nobetes. Just last month, the FTC brought complaints against cosmetics company Sunday Riley for posting fake reviews on Sephora.com and the now-defunct marketing company Devumi for creating fake social media followers.

The FTC continues to provide educational resources to influencers and brands about how to comply with the Endorsement Guides, but it does not hesitate to initiate enforcement action where undisclosed endorsements have the potential to deceive consumers. Companies should ensure that both they and the influencers they work with are familiar with the Endorsement Guides and Disclosures 101 before launching an advertising or marketing campaign.

FTC Says “Stalking” Apps Violate COPPA and the FTC Act

Posted in Privacy

You know that movie where a person thinks they’ve barricaded themselves in their house against a stalker, only to grasp the awful realization that the threat is “coming from inside the house”? Unbeknownst to you, that threat may, in fact, be coming from your smartphone, according to a complaint by the Federal Trade Commission (FTC). The FTC recently took action against developers of three mobile apps that were, according to the Complaint, “designed to run surreptitiously in the background” and “uniquely suited to illegal and dangerous uses.” The Complaint alleged violations of the FTC Act and Children’s Online Privacy Protection Act (COPPA).

The FTC Complaint

Marketed as tools for parents to monitor their children and for employers to monitor employees, three mobile apps operated by Retina-X Studios – MobileSpy, TeenShield, and PhoneSheriff – tracked location and mobile device use, often without the device user’s knowledge or consent. The apps collected text messages, call history, GPS locations, photos, contact lists, browser history, and other information. According to the FTC, the information collected was not properly secured, despite the company’s promises to the contrary. Even after hackers penetrated the company’s cloud storage account twice in a one-year period, exposing personal information, the company’s privacy policies insisted that “Your private information is safe with us.” The company also allegedly outsourced much of its product development and maintenance to third parties without sufficient oversight, such as conducting security testing on the apps.

Retina-X’s privacy protections were also allegedly lacking and, in some instances, allowed purchasers to circumvent safeguards designed to alert device users about tracking. By default, the apps displayed an icon informing users that they were being monitored, but the company provided purchasers with instructions on how to turn this feature off, leaving device users in the dark about the fact that they were being tracked. The FTC also claimed the company took no steps to validate that the apps were used only to monitor children and employees. Another serious concern prompting the FTC to act was the possibility that domestic abusers and other stalkers could access a device where the app was installed and emotionally and physically abuse an unwitting victim.

The Order

The proposed consent order requires Retina-X and its principal to delete all data collected from the “stalking apps,” prohibits them from misrepresenting their privacy and security practices, and bans them from selling, promoting, or distributing monitoring apps or services that require circumventing the manufacturer’s security protections. The homepage of any website advertising the apps must clearly and conspicuously state that the apps may only be used for legitimate and lawful purposes by authorized users, and the company must obtain express written confirmation from purchasers that they will only use the app for legitimate and lawful purposes, such as a parent monitoring a child, an employer monitoring an employee who has consented, or an adult monitoring another adult who has consented.

Similar to other FTC Orders, Retina-X is required to implement and maintain a comprehensive information security program and obtain third-party assessments of its security program every two years by an assessor the FTC may approve. The company must designate a senior corporate manager to administer the security program and certify compliance annually.

While these security obligations are now standard in FTC consent agreements, this is the first time the FTC has brought a case against monitoring apps. It comes on the heels of the FTC’s COPPA Rule workshop that explored possible updates to the COPPA Rule to address changes in technology. This action establishes that COPPA and Section 5 of the FTC Act give the FTC authority to take action against app developers that circumvent security measures. The FTC has made it clear that safeguarding consumers from potential emotional or physical threats made possible through the surreptitious installation of a stalking app is just as important as protecting them from risks of identity theft and similar harms associated with privacy and security failures.

Reevaluating the COPPA Rule

Posted in Privacy

In the two decades since the enactment of the Children’s Online Privacy Protection Act (COPPA), technological developments have changed the online landscape considerably. Recognizing this, the Federal Trade Commission (FTC) held a public workshop on October 7, 2019, to discuss whether, given the proliferation of smart devices, video games, online channels, and EdTech, the COPPA Rule, which was last updated in 2013, needs further revision.

The Rule requires certain website operators to obtain verifiable parental consent before collecting, using, or disclosing personal information of children under 13. It applies to operators that target children or that have actual knowledge they are collecting personal information from children. FTC Commissioner Christine Wilson, who opened the first session, made clear that the FTC is taking an expansive view of the responsibilities of online platform operators under COPPA. Referring to the recent $170 million fine against YouTube and Google, she noted that platform operators are now “on notice regarding their obligations under COPPA. Specifically, if those operators gain knowledge that user-generated content on their platforms is directed at children, they must comply with COPPA if they collect personal information from viewers of that content … even if the operator does not view its target demographic as children under 13.” Commissioner Noah Phillips emphasized the importance of making sure that any Rule changes are consistent with the statutory directives from Congress and of remaining mindful of the potential implications for competition.

The workshop presented an important opportunity for diverse stakeholders to respond to the questions in the FTC’s Request for Comments and to address the costs and benefits of legal and technical approaches to protecting children’s privacy online. Four panels made up of representatives from business, academia, government, and consumer groups discussed a broad range of topics, including behavioral advertising, EdTech, and third-party content. Business representatives raised the issue of conflicting privacy laws, noting that COPPA, the soon-to-be-effective California Consumer Privacy Act, and the GDPR vary on age limits and other requirements. Advocacy groups encouraged the FTC to use its authority under Section 6(b) of the FTC Act to obtain more information from companies about how they collect and use children’s data.

Changes to the COPPA Rule and its enforcement could have far-reaching implications for companies, even those that do not make children’s products or content. Given the importance of stakeholder input, and in response to requests, the FTC has extended the deadline for comments until December 9, 2019. Comments can be submitted here.
