Blog

Submitted by Guest on August 25, 2021

The NAI is expanding its expertise and resources in data privacy, public policy, and advertising law with the recent hiring of Cathy Lee and Ryan Smith, each as Counsel for Compliance and Policy, and of Fatiha Tabibipour as Coordinator, Communications & Public Policy. Cathy and Ryan are former NAI legal interns familiar with the ad-tech industry and the NAI Code of Conduct. They will work with members on the compliance process and, as part of the NAI policy team, will keep members informed about legal and regulatory developments and help implement the NAI’s educational and advocacy efforts with policymakers.

An experienced writer who is knowledgeable about data privacy, Cathy Lee will help members and regulators understand and navigate the legal and regulatory landscape surrounding digital advertising, in addition to her role in the compliance review process. Cathy’s experience includes serving as a staff editor and student Editor-in-Chief of the American Intellectual Property Law Association’s Quarterly Journal. She also has experience in private practice and as a Data Privacy Law Fellow at Teach Privacy. Cathy received the Dean’s Recognition for Foundations and Practice while at George Washington Law, from which she graduated in 2020.

Ryan Smith will join Cathy in helping members understand and comply with the NAI Code, in addition to helping the NAI engage with federal and state regulators on data privacy, self-regulation, and the digital advertising industry. Ryan previously worked on Capitol Hill on data privacy law and policy. He received his J.D. from the George Washington University Law School; while in law school, he clerked with the United States Senate Homeland Security and Governmental Affairs Committee and its Permanent Subcommittee on Investigations, where his work focused on data privacy law and policy.

Fatiha Tabibipour joined the NAI in early spring and has quickly become an integral member of the communications and public policy teams, providing members with regular updates on state and federal privacy laws and regulations and on key industry news. She also coordinates the NAI’s events and is helping plan the 2021 NAI Summit in Seattle. Fatiha’s expertise in state legislative processes comes from her work with professors at the Schar School of Policy & Government, coding Virginia legislative bills to help build a Virginia Legislative Database for researchers, lawmakers, and the public. Fatiha also has experience as a Research and Communications intern at ABPA Institute.

Please join us in welcoming Cathy, Ryan, and Fatiha!

Submitted by Anthony Matyjas... on June 29, 2021

Two years have passed since the NAI published the 2020 Code of Conduct (Code), expanding the scope of the NAI self-regulatory program to the use of offline data for Tailored Advertising, and announcing the development of the industry’s first email-based consumer choice mechanism for Tailored Advertising.

The NAI’s email-based consumer choice mechanism is now live on the NAI site in a beta version. Four NAI members have already integrated with this opt out, and several more are currently finalizing their integration, with the goal of including all NAI members engaged in Audience-Matched Advertising by July 1, 2021. The NAI is working with member companies to help them meet this deadline, and will begin enforcing Code requirements tied to the opt out for those companies that have not begun integration.

Scope of the New Opt-Out Mechanism

When the NAI published its 2020 Code of Conduct and developed the first industry PII-based Opt-Out Mechanism for Tailored Advertising, the use of hashed PII in digital advertising was primarily focused on Audience-Matched Advertising, allowing brands to reach their own customers while they browse the web or use mobile apps. In these use cases, the hashed PII is not used as an addressability method, but rather as a link between existing cookie IDs and/or mobile advertising IDs. However, the past year has brought significant disruption into the digital advertising space, with the announced deprecation of third-party cookies in the Chrome browser, as well as changes to the way mobile advertising identifiers are accessed in iOS. These challenges to traditional addressability methods have spurred considerable innovation from advertising technology companies, with numerous efforts to provide shared industry identifiers that are able to provide some of the functionality previously provided by cookies and mobile advertising identifiers. Many of these solutions are still in development. Some depend on random identifiers, while others propose to leverage hashed email addresses as a means of distinguishing between various users and devices. 

Historically, the NAI’s self-regulatory framework has recognized data types as directly identifying an individual (Personally-Identified Information or PII), pseudonymously identifying a browser or device (Device-Identified Information or DII), or not being linked to a device or individual (De-Identified Information), with consumer notice and choice requirements based on these distinctions. Audience-Matched Advertising typically involves the use of hashed (one-way encrypted) PII as a “match point” between two audiences. Once the PII is hashed, it no longer directly identifies an individual, and so it is no longer treated as PII. On the other hand, it is pseudonymously linked to an individual rather than a browser or a device, so it is not DII. Effectively, hashed PII used for Tailored Advertising is a pseudonymous user identifier, which is not directly addressed in the NAI Code outside of its use for Audience-Matched Advertising.
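To make the “match point” concept concrete, the following is a minimal, illustrative sketch of how hashed PII can link two audiences. The normalization steps, the use of SHA-256, and the function and variable names are assumptions made for illustration only; they are not a method prescribed by the NAI Code or used by any particular member.

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email address and return its SHA-256 hex digest (illustrative only)."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# A brand's customer list and an ad platform's records, each keyed by
# hashed "match points" rather than raw email addresses.
brand_hashes = {hash_email(e) for e in ["alice@example.com", "bob@example.com"]}
platform_records = {
    hash_email("Alice@Example.com"): "cookie-123",
    hash_email("carol@example.com"): "cookie-456",
}

# Matching on the hash links the brand's audience to existing cookie or
# mobile advertising IDs without either party exchanging raw PII.
matched_ids = [cookie for h, cookie in platform_records.items() if h in brand_hashes]
print(matched_ids)  # ['cookie-123']
```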

Those member companies that leverage hashed email addresses as an addressability method for Tailored Advertising are expected to provide a consumer choice mechanism tied to those hashed email addresses, and are expected to integrate with the NAI’s email-based opt-out mechanism even if they do not engage in Audience-Matched Advertising. Thus, the scope of the NAI’s email-based Opt-Out Mechanism will expand from only Audience-Matched Advertising to all use of hashed email addresses for Tailored Advertising.
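As a rough illustration of what honoring such a choice signal might look like in practice, the sketch below checks a hashed email address against a set of opted-out hashes before permitting Tailored Advertising. The opt-out store, the hashing scheme, and the function names are hypothetical assumptions, not the NAI’s integration specification.

```python
import hashlib

def hash_email(email: str) -> str:
    # Illustrative normalization and SHA-256 hashing; the actual opt-out
    # specification may prescribe a different scheme.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical set of hashed emails collected from the email-based opt out.
opted_out_hashes = {hash_email("alice@example.com")}

def may_tailor_ads(hashed_email: str) -> bool:
    """Return False when this hashed email has opted out of Tailored Advertising."""
    return hashed_email not in opted_out_hashes

print(may_tailor_ads(hash_email("alice@example.com")))  # False: suppress tailoring
print(may_tailor_ads(hash_email("bob@example.com")))    # True: tailoring permitted
```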

Moving Forward and Considerations for Future Guidance

The NAI is working with its members and with some of the addressability solution providers to better understand these solutions and how they will be utilized by NAI members and the digital advertising industry more broadly, with the aim of issuing future member guidance on the use of shared addressability solutions, as well as the required notice, transparency, and choice. In the meantime, the NAI expects member companies to provide consumer choice for all addressability solutions used in Tailored Advertising, as per the NAI Code.

The NAI encourages its members to reach out to NAI staff with any questions about current NAI Code applicability to any addressability solutions, or the development of future NAI guidance. We strongly encourage all members to focus on consumer privacy when evaluating any potential replacements for cookies and mobile ad IDs. The NAI plans to issue updated specifications for the opt out, as well as target enforcement dates, in the coming weeks.

 

Submitted by Anthony Matyjas... on April 23, 2021

The Network Advertising Initiative (NAI) recently published its 2020 Annual Report, which provides an opportunity to reflect on an unprecedented year: the pandemic slowed many industries, while recently announced changes to the Chrome browser and iOS 14, coupled with new state legislation, increased the urgency for the digital media and advertising industries to rethink many current practices, particularly how they recognize users and devices.

In the midst of these significant developments, NAI member companies continued to demonstrate their commitment to the highest standards for privacy in digital advertising. The NAI has been enforcing its 2020 NAI Code of Conduct (Code), which introduced a host of additional notice and choice requirements for data collection and use more broadly, and extended the NAI’s coverage to offline data used to target digital advertising, also known as “onboarding.”

In spite of workforce challenges and resource shortages caused by the pandemic, and the uncertainty caused by evolving technology standards and new state laws, NAI members demonstrated that they understand the very real privacy concerns that are behind many of these changes, and that they are ready to address these challenges by devoting further resources and time to consumer privacy protection.

The report is based on findings from the 2020 compliance period in which 87 returning member companies, all leading third-party digital advertising companies, were reviewed during the year by NAI staff for compliance with the Code. Companies new to the NAI underwent an equally stringent review as part of their membership application process in 2020 and will join the annual review program in 2021.

The NAI would like to thank its members for all of the work they put into the 2020 compliance review process as they continued their record of strong compliance with the Code in a most difficult year. NAI staff found that in 2020, evaluated member companies once again overwhelmingly complied with the Code, and that to the extent any potential violations were identified, they were not material in nature.

The previous year, the NAI noted inconsistent compliance with the requirements of the NAI’s Connected TV Guidance, which was newly enforced in 2019. In 2020, these requirements were incorporated into the NAI Code.  After the NAI’s educational efforts and related compliance warnings in 2019, NAI staff observed significant improvement in evaluated member companies’ disclosures regarding connected televisions and information about privacy controls on those devices. 

Overall, NAI staff observed strong compliance from nearly all members, including the presence of many new disclosures required as of 2020, and significant work done to advance the technical development of an Opt-Out Mechanism for Audience-Matched Advertising and just-in-time notice for the collection of Precise Location Information. Evaluated member companies demonstrated that they remain vigorously committed to the NAI’s self-regulatory framework. 

The report also provided an update on the NAI’s other endeavors over the course of the prior year, which included the publication of two Guidance documents, on modeling health data and on non-marketing uses of advertising data, as well as fifteen all-member calls and webinars, including several addressing the pandemic and how the digital advertising industry can support public health efforts.

The NAI’s public policy work further expanded in 2020, with efforts to help shape CCPA implementing regulations, promote a national privacy framework, and engage with state policymakers to help avoid a patchwork of disparate state consumer privacy laws. The NAI was an active participant in the CCPA implementing regulations process, including multiple sets of comments. The NAI also provided its expertise in digital advertising through testimony before the Senate Commerce Committee and active participation as an executive committee member in the Privacy for America coalition.

The NAI is perhaps even more relevant now than ever before, as it helps the digital advertising and media industry speak with a united voice, while anticipating the consumer privacy challenges that come with new addressability methods and responding with clear and timely guidance, even as the technological and legislative landscape is in a state of rapid evolution. The NAI welcomed six new members in 2020, with several more applying in 2020 with the goal of joining the NAI in 2021. Over sixteen million users visited the NAI’s opt-out page in 2020, which is more than a 50% increase from 2018 when the page launched.

At a time when the nature of digital advertising is being questioned and reconsidered globally, it is even more important for self-regulatory efforts in the United States to clearly demonstrate that a thoughtful and flexible approach can provide robust consumer privacy protection, while also assessing if and how these efforts might evolve into a “co-regulatory” approach with state and federal regulators, allowing digital advertising technology, and the internet economy more broadly, to flourish. In 2020, as always, the NAI’s primary mission remained to enable responsible data-driven advertising that provides free and equal consumer access to robust digital media.
 

Submitted by Guest on March 29, 2021

By: Nathan Moore, a former legal intern for the NAI and a student at the University of Maine School of Law.

The future of privacy law and regulation in the United States could be shifting from the notice and consent regime to a duties-based framework. There is evidence that the notice and consent framework is not working for consumers, and with ever-growing technological capabilities, this problem will only grow. The basic concept of the duties-based framework is that consumers transfer data to companies that perform a service with it, and those companies must conform to the duties assigned to them by law. Multiple duties-based frameworks exist for data privacy and security, but three recently proposed frameworks are particularly relevant as Congress and the Biden Administration consider a national privacy framework: the Data Care Act of 2018, the Consumer Online Privacy Rights Act, and a Brookings Institution white paper on privacy legislation.

Woody Hartzog and the Data Care Act of 2018 

In his article Privacy’s Constitutional Moment and the Limits of Data Protection, Woody Hartzog analyzes Senator Brian Schatz’s (D-HI) Data Care Act of 2018 and argues that the U.S. should move its privacy paradigm away from the notice and consent framework to one of strict duties. He explains how the notice and consent framework has been failing consumers and argues that a duties-based system can address more issues than the notice and consent regime, while still offering a flexible approach to privacy that consumers can trust. He examines the European Union’s General Data Protection Regulation (“GDPR”) and U.S. sectoral laws and concludes that these laws have built-in limitations and are too focused on the individual and consent, and not enough on relationships and power, an inherent vulnerability that should lead to their disuse.

Hartzog uses the duties-based framework presented in the Data Care Act as a springboard for his idea of a successful federal law that would address: “(1) corporate matters; (2) trustworthy relationships; (3) data collection and processing; and (4) personal data’s externalities.”1  Senator Schatz defines the duties of care, loyalty, and confidentiality in his bill, which Hartzog incorporates into his framework. Senator Schatz defines the duty of care in privacy as requiring providers to reasonably secure individual identifying data and promptly inform users of data breaches that involve sensitive information.2 To meet the duty of care, an ad-tech company employing the National Institute of Standards and Technology (“NIST”) or International Organization for Standardization (“ISO”) security frameworks would satisfy the requirement for providing security, in addition to following state or federal data breach notification standards.

Hartzog describes the duty of loyalty as the hallmark of fiduciary duties, requiring a strict commitment to “refrain from self-dealing and a firm prioritization of the trustors’ interests over the interests of the entrustee.”3  Similarly, Hartzog refers to the Schatz bill, which defines the duty as “not [using] individual identifying data in ways that harm users.”4  Hartzog explains that the duty of loyalty would require that:

An online service provider may not use individual identifying data, or data derived from individual identifying data, in any way that--(A) will benefit the online service provider to the detriment of an end user; and (B) (i) will result in reasonably foreseeable and material physical or financial harm to an end user; or (ii) would be unexpected and highly offensive to a reasonable end user. 5

For an ad-tech company that does not come into direct contact with consumers, this duty is conceptually easy to satisfy, but depending on how its impacts are interpreted, it could pose real challenges for ad-tech businesses. The duty of loyalty does not bar the use of data outright; it requires smart use of consumer data. Data can be responsibly passed through the stack as long as it does not harm the user, is not used in an offensive way, and is not to the detriment of the consumer. Notice may need to be altered to account for consumer expectations, but the use of data for digital advertising could still be achievable: consumer data could be used as long as it is safeguarded and the consumer is aware of its use. In digital advertising, consumer data is used to the benefit of the consumer because it supports ad-funded free and low-cost content, so long as the data is not misused and the advertising practices do not harm the consumer.

Hartzog’s duties framework appears to impose stronger privacy regulation than current laws like the GDPR; however, the inherent ambiguity of duties could be a cause for concern for ad-tech companies. The ad-tech industry could likely survive under a duties-based regime, but the key to survival is the interpretation of any legal outcomes and responsible data stewardship. At the same time, new duties-based regulations could impose heavy compliance and legal burdens on the ad-tech industry and could therefore cause many smaller companies to exit the market.

Consumer Online Privacy Rights Act

Senator Maria Cantwell (D-WA) introduced the Consumer Online Privacy Rights Act (“COPRA”) in 2019 to try to address the privacy issues in American law. COPRA is divided into three parts: (1) privacy rights (§§ 101-110); (2) oversight (§§ 201-202); and (3) enforcement (§§ 301-304). The bill’s treatment of the duties of care and loyalty is ambiguous, and it contains no defined duty of care.

COPRA defines the duty of loyalty in two parts. First, a company should not “engage in a deceptive data practice or a harmful data practice.” Second, it should not “process or transfer covered data in a manner that violates any provision of this Act.”6 While similar to the Hartzog framework, the COPRA definition is slightly narrower, focusing on data practices. The bill narrows the scope by defining a deceptive data practice as “an act or practice involving the processing or transfer of covered data in a manner that constitutes a deceptive act or practice in violation of section 5(a)(1) of the [FTC] Act.”7 Likewise, the scope of harmful data practices is narrowed by defining such practices as those that are likely to cause “financial, physical, or reputational injury.”8 COPRA further defines harmful data practices as those that cause physical or offensive intrusion upon the solitude of an individual.9 That conception of harmful data practices is a way in which COPRA attempts to codify the tort of intrusion into privacy law. Finally, COPRA leaves the definition of harmful data practices quite vague by including “other substantial injuries” in the third prong of its definition.10 Since this uncertainty poses a high probability of litigation, it is impractical for businesses and would likely be a barrier to innovation.

Bridging the Gaps: A Path Forward to Federal Privacy Legislation

The Brookings Institution offers a middle ground in its privacy-law-oriented white paper and accompanying legislative text (“Brookings Framework”).11 The Brookings Framework seeks to take a more practical approach, balancing business innovation with the duties of loyalty and care.

The Brookings Framework proposes adding the duties of care and loyalty to any privacy law. It defines the duty of loyalty as: 

[R]equire covered entities to implement reasonable policies and practices to protect individual privacy “appropriate to the size and complexity of the covered entity and volume, nature, and intended use of the covered data processed,” limit data processing to “necessary [and] proportionate” purposes, consistent with COPRA … and communicate data practices “in a fair and transparent manner.” 12

The Brookings Framework defines the duty of care by modifying COPRA’s §101(b)(2) definition of “harmful data practices” by including in the definition, “discrimination in violation of federal anti-discrimination laws or anti-discrimination laws of any State or political subdivision thereof applicable to the covered entity.”13 The Brookings Framework would also include a prohibition of covered entities processing data in a way that could reasonably be foreseen as causing harm.14

The Brookings Framework includes a duty of loyalty that limits what companies can do with data, restricting activities to those that are “reasonably necessary, proportionate, and limited.”15 In an industry-conscious move, the Brookings Framework proposes that laws include duty of care language allowing companies to collect and use data to improve services, both for products requested and for those that would be reasonably anticipated within the context of the covered entity’s relationship with the individual.16 For the duty of loyalty, the authors propose that laws build upon the concept of “reasonably anticipated in the context of the covered entity’s relationship with the individual,” and the Brookings Framework suggests enlarging that conceptualization into a set of baseline duties toward individuals.17 Another element they would add to the duty of loyalty is an “obligation to communicate policies and practices for processing and transferring covered data ‘in a fair and transparent manner.’”18 Finally, adding to the duty of care, the Brookings Framework suggests a harmful data practices section that would encompass civil rights laws and would include the violation of state and federal anti-discrimination laws in its list of harms.19 In general, the Brookings Framework approach seeks to add much-needed specificity. For ad-tech companies, this approach would be beneficial compared to the other approaches discussed above, which would result in an open-ended legal framework subject to interpretation by regulators or the courts.

By pursuing the Brookings Framework, the U.S. can effectively balance consumer protections with business needs. 

1. Woodrow Hartzog & Neil Richards, Privacy’s Constitutional Moment and the Limits of Data Protection, 61 B.C. L. Rev. 1687, 40 (2020).
2. Press Release, Office of United States Senator Brian Schatz, Schatz Leads Group of 15 Senators in Introducing New Bill to Help Protect People’s Personal Data Online (Dec. 12, 2018), https://www.schatz.senate.gov/press-releases/schatz-leads-group-of-15-se...
3. Hartzog, supra note 1, at 47.
4. Schatz, supra note 2. 
5. Hartzog, supra note 1, at 49.
6. Consumer Online Privacy Rights Act, S.2968, 116th Cong. § 101(a) (2019). 
7. Id. at § 101(b)(1). 
8. Id. at § 101(b)(2)(A). 
9. Id. at § 101(b)(2)(B).
10. Id. at § 101(b)(2)(C). 
11. Cameron F. Kerry et al., Bridging the Gaps: A Path Forward to Federal Privacy Legislation, Governance Studies at Brookings (June 2020).
12. Id. at 6.
13. Id.
14. Id.
15. Id. at 28.
16. Id. at 29.
17. Id.
18. Id. at 30.
19. Id. at 31.