National Consumers League

Developing a pro-consumer approach towards privacy and data security—context of the transaction

By NCL Google Public Policy Fellow Pollyanna Turner-Ward

This blog post is the sixth and final installment in a series offering a consumer perspective on developing an approach towards consumer privacy and data security.

This commentary is the product of a deep dive into the National Telecommunications and Information Administration’s (NTIA) September Request for Comments (RFC), a key part of the process that informs the government’s approach towards consumer privacy. Stakeholder responses to the RFC provide a glimpse into where consensus and disagreement lie on key issues among major consumer and industry players.

Along with the risk of privacy harm, information sensitivity, and user expectations, the NTIA proposed another factor for organizations to consider when designing business models, products, and services, and when deciding which defaults and controls to offer consumers: the context in which the transaction takes place. This post explores this important consideration for organizations developing approaches towards privacy and data security.

Context of the transaction   

Organizations must provide appropriate mechanisms for individuals to exercise control over how their personal information is handled. There are contexts where consumers should have affirmative control over data uses, such as where personal information is sensitive, or where there is risk of harm. However, user control should not come at the expense of opportunities for beneficial uses of consumer data, such as research leading to innovation, new products and capabilities, customized services, and other services that consumers want. Consent may, therefore, be presumed in contexts where the privacy choice is insignificant or inconsequential, where the organization has legitimate interests, or where there are compelling social benefits.

That said, the Center for Democracy and Technology (CDT) noted that “context” has been difficult to operationalize. Policymakers must provide clarity about how organizational privacy obligations may be tied to the transactional context. To allow for flexibility as norms and technologies continue to develop, the specifics of consent could be provided in the form of regulatory guidance and codes of conduct rather than being enshrined in statutory language.

The EU’s General Data Protection Regulation (GDPR) contains the notion of “legitimate interests.” This offers a meaningful way to permit standard data uses that are consistent with individual interests while reserving express consent for situations where individuals should pause and consider their choice. The Internet & Television Association noted that individual control should not interfere with the ability of a company to engage in legitimate business activities consistent with the transaction or consumer relationship. Building on this, Access Now called for a more thorough description of which routine business practices may be considered “reasonable,” particularly regarding entities with no first-party consumer relationship.

As described by Federal Trade Commission staff, such practices may include: operational processing to provide requested services or goods (e.g., a retailer disclosing consumers’ contact information to delivery companies to ship their purchases); first-party marketing (e.g., a retailer recommending products based upon a consumer’s prior purchases and collecting data for loyalty programs); related operational processing such as first-party analytics, research, and marketing; conducting reasonable routine business operations (e.g., websites collecting click-through rates to improve navigation); compliance with laws (e.g., a search engine disclosing consumer data in response to legal process); and protection of consumers and/or third parties from harmful activities (e.g., scanning ordinary web server logs to detect fraud).

Consumers Union highlighted that most consumers do not want unrelated data collection and use. If a company wishes to engage in additional, non-contextual data collection or sharing, individuals should be given a meaningful choice after the company has made a clear and compelling case to them, separate from boilerplate language. Such requests should be rare to avoid consent fatigue and to align with consumer expectations. However, industry stakeholders such as the Internet Association (IA) advocated for broad carve-outs from consumer control. This would allow companies to handle all personal information deemed to be “necessary for the basic operation of the business.” Public Knowledge warned that this approach would permit any advertising-supported platform or data broker to handle any and all consumer data. To avoid such a result, PK recommended that implied consent be drawn more narrowly.

Secondary and tertiary data uses

The NTIA’s stated privacy goal of “user control” is undermined by a loophole that enables downstream third-party processing, secondary uses, and sales of consumer data. To overcome this deficiency, Consumers Union suggested that controls should extend to personal information obtained by third parties as well as through first-party interactions. Consumer data may be compiled, analyzed, and transformed to create new knowledge, classifications, and predictions. The World Privacy Forum noted that this presents a compelling value proposition if used to create new meaningful data sets, and ultimately public benefits. Of course, exceptions are needed to allow data to be used for valuable research purposes. However, unfair secondary and tertiary data uses also have the potential to create long-term consumer harm.

In the data broker industry, which has evolved to focus on detailed consumer analytics and predictive profiling, highly sensitive and identifiable consumer data is often sold and shared. Under a privacy self-management regime, harms may arise to individuals, groups, and communities that lie beyond any individual’s reach. Without one’s knowledge, consent, or participation, profiles and inferences may be made about individuals based on aggregated or anonymized datasets, or on data derived from consenting third-party individuals. As described by the World Privacy Forum, “consumer scores” influence meaningful market opportunities, covering consumer loyalty, employability, personality, medical risk, political affiliations, and more.

Data brokers should not be allowed to build and sell ever-more detailed consumer profiles without restriction. To protect consumers, it is necessary to limit certain commercial data practices regarding personal, inferred, aggregated, and de-identified information. According to the World Privacy Forum, at the heart of any comprehensive approach towards consumer privacy and data security must be the application of meaningful technical, procedural, and policy controls over secondary and tertiary consumer data uses and sales. These may include effective standards for de-identification, technical measures to audit data for inappropriate uses over its lifetime, and data tracking techniques.

Subject to various exceptions, the EU prohibits the processing of “sensitive categories” of data. Building on the Fair Information Practice Principles (FIPPs), CDT supported a federal law that presumptively prohibits unfair secondary uses of sensitive information. The Center on Privacy and Technology at Georgetown Law suggested that the right to object to data processing, use limitations, and purpose specifications should be applied to prevent discriminatory uses (such as the denial of access to, or awareness of, critical opportunities in housing, education, finance, employment, and healthcare). To promote accountability, clear purposes must be articulated, activities must be consistent with the stated purpose, and these obligations must also apply to downstream and third-party actors. To account for harmful business models, such as those used by certain data brokers, purpose limitations should accompany fairness obligations for all data practices. Echoing comments made by the Center for Digital Democracy, these measures should aim to safeguard equitable, fair, and just outcomes.

In conclusion, the Cambridge Analytica scandal demonstrated to consumers that they are helpless once their data is lost to third parties. Consumers are alarmed by intrusive and pervasive commercial surveillance, which threatens the American ideals of self-determination, fairness, and equal opportunity. This atmosphere has contributed to a crisis of confidence in the Internet. As Mozilla noted, an inflection point has been reached. The regulatory framework must be re-evaluated.

The author completed her undergraduate degree in law at Queen Mary University of London and her Master of Laws at William & Mary. She has focused her career on privacy and data security.