Developments in Law and Policy from Venable’s eCommerce, Privacy, and Cybersecurity Group
In this issue, we highlight House Committee on Energy and Commerce Subcommittee Ranking Member McMorris Rodgers’ remarks about the need for federal privacy legislation. We outline two July Federal Trade Commission events, PrivacyCon 2020 and the GLBA Safeguards Workshop. We discuss the Supreme Court’s second take on the TCPA and the growing opposition to California’s data privacy ballot measure. Across the pond, we examine the Court of Justice of the European Union’s ruling on the EU-U.S. Privacy Shield, the United Kingdom Information Commissioner’s update on its regulatory approach, and the ICO’s recently published annual report.
Heard on the Hill
House Energy and Commerce Subcommittee Ranking Member McMorris Rodgers Emphasizes Need for Federal Privacy Legislation
On July 9, 2020, the House Committee on Energy and Commerce’s (Committee) Subcommittee on Consumer Protection and Commerce (Subcommittee) convened a hearing on “Consumers Beware: Increased Risks During the COVID-19 Pandemic.” During the hearing’s opening remarks, Subcommittee Ranking Member Cathy McMorris Rodgers (R-WA) called for federal privacy legislation. She noted that the need for a federal privacy framework is urgent, adding that federal privacy legislation must be passed before a patchwork of state privacy laws emerges.
Ranking Member McMorris Rodgers expressed support for the Committee’s Staff Discussion Draft that was circulated by Committee staff in December 2019. Among other provisions, the Discussion Draft would: (1) establish a Bureau of Privacy at the Federal Trade Commission; (2) require express, affirmative consent prior to the processing of covered data; (3) establish data use limitations; (4) establish certain provisions on data disclosure to third parties; (5) require clear and publicly available privacy policies; and (6) require certain data security measures. Ranking Member McMorris Rodgers noted that partisan proposals within the Committee are hindering the chances that a bipartisan federal privacy framework will pass the House.
Around the Agencies and Executive Branch
Federal Trade Commission Hosts GLBA Safeguards Workshop
On July 13, 2020, the Federal Trade Commission (FTC) held a workshop entitled, “Information Security and Financial Institutions: FTC Workshop to Examine Safeguards Rule.” The workshop related to the Notice of Proposed Rulemaking (NPRM proposal) issued by the FTC in April 2019, which was followed by the FTC’s March 2020 Request for Public Comment on modifications to the Gramm-Leach-Bliley Act Safeguards Rule. The workshop consisted of five panels moderated by FTC Staff, and participants included academics and representatives from industry and government.
Opening remarks by FTC Staff highlighted provisions in the NPRM proposal, emphasizing that the FTC seeks to maintain the flexibility of the current rule while providing more guidance regarding the content of an information security program. Staff noted that the NPRM proposal would require non-bank financial institutions to: (1) designate one qualified individual responsible for overseeing the institution’s information security program; (2) base the information security program on a written risk assessment; (3) periodically perform additional risk assessments; (4) regularly test or monitor vulnerabilities in the information security program; (5) provide security training to personnel designated with responsibility in the information security program; and (6) include encryption or multi-factor authentication (MFA), or a “reasonable equivalent,” within information security programs.
The five panels covered the following topics:
- The Costs and Benefits of Information Security Programs. Panelists discussed the role of data security in businesses and challenges related to the application of cybersecurity practices across business units. Panelists addressed risk assessment programs and highlighted the importance of preventing data breaches.
- Information Security Programs and Smaller Businesses. Panelists debated exemption standards for small businesses. Panelists also addressed the costs of implementing an information security program with an accountable individual to manage it, highlighting the cost implications for small businesses.
- Continuous Monitoring, Penetration, and Vulnerability Testing. Panelists discussed the role of monitoring logs in protecting against data breaches and emphasized the importance of logs and audit trails. Panelists discussed penetration testing to find vulnerabilities and vulnerability testing after cyber incidents.
- Accountability, Risk Management, and Governance of Information Security Programs. Panelists discussed the NPRM proposal’s requirement that companies have a single person to be held accountable for risk mitigation and incident resolution. Panelists highlighted the importance of information security awareness at the senior levels of a company. Panelists debated the costs and benefits of the NPRM proposal’s reporting requirement.
- Encryption and Multifactor Authentication. Panelists discussed the NPRM proposal’s requirements that data be encrypted when it is transferred externally and be accessed with MFA. Panelists debated different types of authentication that would be included in the MFA standards.
Federal Trade Commission Convenes PrivacyCon 2020
On July 21, 2020, the Federal Trade Commission (FTC) held its fifth annual PrivacyCon. The virtual event included six panels on the following topics: (1) Health Apps; (2) Bias in Artificial Intelligence (AI) Algorithms; (3) The Internet of Things (IoT); (4) Specific Technologies: Cameras/Smart Speakers/Apps; (5) International Privacy; and (6) Miscellaneous Privacy/Security.
Prior to the event, the FTC sought research presentations on topics related to consumer privacy and security. Opening remarks were made by Andrew Smith, the Director of the Bureau of Consumer Protection. He emphasized the FTC’s “vigorous” enforcement actions, focusing on the area of children’s privacy.
PrivacyCon was divided into six sessions, each featuring a moderator from the FTC. The first session focused on data disclosure and privacy considerations related to health apps. Panelists discussed the transfer of health data covered by the Health Insurance Portability and Accountability Act of 1996 (HIPAA) to entities that are not regulated by HIPAA and expressed support for legislation or regulation that addresses such scenarios. Professor Quinn Grundy from the University of Toronto stated that data privacy protections for health data must apply to “those who collect, control, and process user data” as well as app developers. In the second session, panelists discussed bias in AI algorithms, focusing on algorithmic bias in advertising and public health. Muhammad Ali from Northeastern University opined that delivery bias in advertising could cause issues under Section 230 of the Communications Decency Act. Panelist Ziad Obermeyer from the UC Berkeley School of Public Health expressed the need for regulators to define bias in order to hold companies accountable.
Session three addressed the Internet of Things. Topics in this session included information exposure risks from consumer IoT devices, privacy and security disclosures for such devices, and methods of tracking IoT device network traffic. The fourth panel addressed the privacy practices of apps, cameras, and smart speakers. Panelists discussed contact tracing apps, privacy considerations related to video camera technologies, and the “trustworthiness” of smart speaker devices.
The fifth session addressed international privacy, focusing on the European Union’s General Data Protection Regulation (GDPR) and its effect on the data industry. Panelists observed that enforcement of the GDPR to date has not been aggressive. The final session covered miscellaneous privacy and security topics. Hana Habib from Carnegie Mellon University stated that privacy policies lack clarity and expressed support for regulations establishing usability requirements. Panelists debated the merits of the “notice and consent” model for privacy, emphasizing that increased transparency is necessary for more meaningful consent.
In the Courts
Supreme Court Grants Writ of Certiorari to Determine the Definition of an ATDS Under the TCPA
On July 9, 2020, the United States Supreme Court granted certiorari in Facebook, Inc. v. Duguid to decide whether an “automatic telephone dialing system” (ATDS) under the Telephone Consumer Protection Act (TCPA) encompasses any device that can “store” and “automatically dial” telephone numbers, even if the device does not “us[e] a random or sequential number generator.”
As the TCPA prohibits the initiation of calls to certain numbers using an ATDS without the requisite level of prior consent, the definition of what technology qualifies as an ATDS is a threshold question in TCPA litigation. The plain language of the TCPA states that an ATDS is “equipment which has the capacity to store or produce telephone numbers to be called, using a random or sequential number generator; and to dial such numbers.” 47 U.S.C. § 227(a)(1). The Supreme Court’s decision will help resolve the circuit split that has emerged over the definition of an ATDS under the TCPA. While the Third, Seventh, and Eleventh Circuits have narrowly interpreted the definition of an ATDS to include only equipment that is capable of storing or producing numbers using a “random or sequential” number generator, the Second, Sixth, and Ninth Circuits have taken the more expansive view that a system need only have the capacity to “store numbers to be called” and “to dial such numbers automatically” to constitute an ATDS.
The Supreme Court’s next term opens on October 5, 2020, and recesses in late June or July 2021.
In the States
Opposition Voiced Against CPRA
Since the California Privacy Rights Act of 2020 (CPRA) ballot initiative qualified for the state’s November 3, 2020, general election, several interested parties have expressed concern about the measure. Advocates, businesses, and politicians have voiced opposition to the CPRA in press releases and during political gatherings on the grounds that the statute would weaken privacy rights for California consumers, impose excessive costs on small businesses, and strain the state’s economy. If approved by voters, the CPRA ballot initiative would amend the California Consumer Privacy Act of 2018 (CCPA), a broadly applicable privacy law that went into effect in January of this year and became enforceable by the California Attorney General on July 1, 2020.
The CPRA will appear to California voters as “Proposition 24” on the November ballot. A draft copy of the state’s Voter Information Guide provides the “Yes/No” statement voters will see in connection with Proposition 24 when they cast their ballots. The draft guide also offers a number of other materials describing the CPRA, including arguments for and against the initiative. Common Sense Media, the California NAACP, and Californians for Consumer Privacy—the political action committee formed by CCPA proponent Alastair Mactaggart—drafted the arguments in favor of the CPRA. The arguments state that consumers need “stronger” protections than those created by the CCPA, and they cite concerns about industry efforts “to weaken and limit enforcement of this law.”
However, opponents of the CPRA measure maintain that it would take privacy protections away from California consumers rather than enhance them, and would impose costs on small businesses and the state’s economy. Both the state’s Republican and Democratic parties rejected the CPRA at their annual conventions in July. A group of consumer, business, and privacy advocates has joined together to run an organized campaign against the measure called “Consumers Against Prop 24.” Members of the campaign include the ACLU of California, the Consumer Federation of California, Californians for Privacy Now, Color of Change, labor advocate Dolores Huerta, the California Alliance for Retired Americans, and the California Small Business Association. A subset of the members of that campaign—the Consumer Federation of California, Dolores Huerta, and Californians for Privacy Now—authored the counterarguments to Proposition 24 that appear in the draft Voter Information Guide. The counterarguments state that the CPRA would benefit large corporations at the expense of small businesses and reduce privacy protections for Californians. The counterarguments also note that the CPRA ballot measure would rewrite the CCPA before there is time to understand how that law is working, thereby “forcing businesses to absorb even more costs at a time [when] the economic slowdown has many businesses on the verge of closing their doors.”
Court of Justice of the European Union Rules on Adequacy of EU-U.S. Privacy Shield
On July 16, 2020, the Court of Justice of the European Union (CJEU) issued a decision in Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems (Schrems II), upholding the validity of standard contractual clauses (SCCs), but invalidating the EU-U.S. Privacy Shield (Privacy Shield) as a safeguard for the transfer of personal data from Europe to the United States (U.S.).
The case arose from a 2015 complaint filed by Maximillian Schrems, an Austrian privacy advocate, challenging a company’s reliance on SCCs for the transfer of data outside of the European Union (EU). The General Data Protection Regulation (GDPR) restricts the transfer of personal data to “third countries” unless such countries provide an adequate level of protection to that provided under the GDPR. Since 2016, both the Privacy Shield and SCCs have been used as a means to lawfully transfer data from Europe.
In its decision, the CJEU upheld the validity of SCCs as a mechanism for the transfer of data outside of Europe. However, the court imposed additional obligations on companies that use SCCs. Specifically, the CJEU stated that companies are required to evaluate, on a case-by-case basis, whether the law in the recipient country ensures adequate protection in compliance with the GDPR. Data exporters must consider the law and practice of the country to which data will be transferred. If a company determines that the recipient country does not ensure adequate protection of personal data, the company must provide additional safeguards beyond the SCCs or suspend the data transfer.
While the CJEU was not asked to evaluate the validity of the Privacy Shield, it nevertheless determined that the framework was invalid. The Privacy Shield is an agreement between the United States and the EU that authorized the transfer of EU personal data to the U.S. It went into effect in 2016 as a successor to the U.S.-EU Safe Harbor agreement, which the CJEU struck down in 2015.
In 2016, the European Commission (EC) determined that the Privacy Shield provided adequate protection under EU data protection law and required an annual review of the framework. As recently as October 2019, the EC determined that the Privacy Shield continued to ensure adequate levels of protection for personal data. However, the CJEU determined that the EC’s adequacy decision was invalid, citing concerns about U.S. government surveillance and the lack of a redress mechanism for EU residents to remedy data harms.
In a statement on July 21, 2020, the Federal Trade Commission noted that it continues “to expect companies to comply with their ongoing obligations with respect to transfers made under the Privacy Shield Framework.” Similarly, the U.S. Department of Commerce stated that it will continue to communicate with its European counterparts to limit the adverse consequences of the Schrems II decision on companies in the U.S.
However, in a set of “frequently asked questions” related to Schrems II, the European Data Protection Board (EDPB) stated that there is no grace period during which transfers to the U.S. can be made under the Privacy Shield. As a result, effective immediately, the EDPB considers transfers made under the Privacy Shield framework to be illegal.
United Kingdom’s Information Commissioner Issues Update on the Office’s Regulatory Approach
On July 13, 2020, the United Kingdom’s Information Commissioner’s Office (ICO) published updated guidance on its “empathetic and pragmatic” regulatory approach during the COVID-19 pandemic. The guidance gives regulated organizations grounds to seek extensions or deferrals of regulatory action for pandemic-related hardships, but organizations hoping for a blanket moratorium on enforcement during the pandemic may be disappointed.
While the ICO will consider hardships imposed by the pandemic when evaluating regulatory action and expects to conduct fewer investigations, it is only suspending regulatory action for organizations that have backlogged Freedom of Information requests. Regarding other regulatory action, the ICO stated that it would “continue to act proportionately, balancing the benefit to the public of taking regulatory action, against the potential detrimental effect of doing so, taking into account the particular challenges being faced at this time.”
The ICO provided the following guidance on its regulatory approach during COVID-19:
- Breach Reporting. Personal data breaches must still be reported without undue delay, and within 72 hours of the organization becoming aware of the breach. The ICO will take “an appropriately empathetic and proportionate approach” in assessing breach reports.
- Investigations. The ICO stated that it expects to conduct fewer investigations and to focus on “serious non compliance.” When conducting investigations, the ICO will “take into account” the impact of the pandemic on a given organization. It suggested that it may use its formal powers to request evidence less often and may allow longer periods to respond to such requests.
- Anti-Opportunism. The ICO will take a “strong regulatory approach” against organizations that violate data protection laws to capitalize on the current crisis.
- Offsite Audits. While the ICO will continue to conduct “some risk-based audit work,” it anticipates doing so on an offsite basis.
- Regulatory Action and Fines. In evaluating regulatory action and fines, the ICO will consider whether an organization’s difficulties or violations resulted from the COVID crisis, as well as whether an organization has plans to remediate any violations after the crisis. The ICO noted that this is likely to reduce the level of fines. The ICO will also consider giving organizations longer than usual to remediate violations, even those that predate COVID, where remediation efforts were impeded by the crisis.
- Information Requests. While the ICO will continue to handle Freedom of Information and similar requests, it is suspending formal regulatory action in connection with backlogs of outstanding information requests, and will work to limit the burdens such requests place on public authorities. As a result, requestors may have their responses further delayed or deferred.
- Data Protection Fee. The ICO may not enforce against organizations that delay payment of their data protection fee, if such organizations can demonstrate that this is due to COVID-19 economic circumstances and can provide “adequate” assurances as to future payment.
- Data Subject Rights Requests. The ICO recognizes that organizations’ resources may be stretched thin, which could affect their ability to respond to data subject rights requests. The ICO will consider this when deciding whether to take formal enforcement action.
United Kingdom’s Information Commissioner’s Office Releases 2019-2020 Annual Report
On July 20, 2020, the United Kingdom’s Information Commissioner’s Office (ICO) published its 2019/2020 Annual Report. The report covers the ICO’s activities between March 2019 and March 2020, a period the ICO refers to as “transformative” for privacy, data protection, and broader information rights. It outlines the number of complaints the ICO received during that period, highlights enforcement actions taken by the ICO, and addresses key initiatives. Information Commissioner Elizabeth Denham explained that the report demonstrates that the ICO has been at the center of privacy discussions ranging from “how facial recognition technology is used to how we protect children online.”
Highlights from the report include:
- The ICO received 38,514 data protection complaints and 6,367 freedom of information complaint cases, and closed 39,860 data protection cases—up from the 34,684 cases closed in the previous year.
- The ICO took 236 regulatory actions to enforce legislation the ICO is charged with regulating. These actions included 54 information notices, eight assessment notices, seven enforcement notices, four cautions, eight prosecutions, and 15 fines. In addition, the ICO conducted over 2,100 investigations.
- The report addresses the Age Appropriate Design Code, a set of 15 standards for online services that are designed to protect the privacy and safety of children online, which was published by the ICO in January 2020. In the foreword to the report, Information Commissioner Denham stated that the Age Appropriate Design Code is “the most important piece of work covered in this report[.]”
The report notes that the ICO “played an active role” in ensuring guidance and information was available to businesses and the public sector regarding Brexit. It explains that a “key area of work” for the ICO this year will be developing new mechanisms and approaches for the ICO’s relationship with the European Data Protection Board, the European Commission, and individual European data protection authorities now that the United Kingdom has left the European Union.