National Consumers League

Developing a pro-consumer approach towards privacy risks and harms


By NCL Google Public Policy Fellow Pollyanna Turner-Ward

This blog post is the second in a series offering a consumer perspective on developing an approach towards consumer privacy and data security.

This commentary is the product of a deep dive into the National Telecommunications and Information Administration’s (NTIA) September request for comments (RFC), a key part of the process that informs the government’s approach to consumer privacy. Stakeholder responses to the RFC provide a glimpse into where consensus and disagreement lie on key issues among major consumer and industry players.

In its RFC, NTIA states that the adoption of a risk- and outcome-based approach towards consumer privacy is a high-level goal for federal action. This aligns with the National Institute of Standards and Technology’s (NIST) work on developing a voluntary, risk-based Privacy Framework as an enterprise risk management tool for organizations. NTIA also states that access and correction rights of individuals should be “appropriate to the risk of privacy harm.” In decision-making about privacy practices, this approach entails balancing potential privacy risks and harms against other factors, such as business needs and the means available to mitigate those risks.

In its RFC, NTIA points to the success of risk management in cybersecurity as a model for privacy regulation. However, it may be ill-advised to import cybersecurity risk management into the privacy realm. When developing new products, services, and business models, it is reasonable to balance organizational agility against cybersecurity concerns because the relevant harms and mitigating controls are relatively uncontroversial. As highlighted by the Center for Democracy and Technology (CDT), the same cannot be said for privacy risks and harms. Within industry circles, there is no consensus on what constitutes a risk of privacy harm, and, as recent events demonstrate, users and industry disagree significantly about the value of privacy. If a risk-based approach towards privacy is to be advanced, clarification of “privacy risk” is therefore urgently needed.

It is evident that NTIA considers some privacy violations to be less harmful than others. Comments from Public Knowledge highlight the danger that such an approach may focus narrowly on legally cognizable and tangible risks such as financial loss and physical injury. An adequate definition of “privacy risk,” however, would encompass a wide array of risks and harms, including future ones, that may arise from privacy violations. A privacy violation is itself an individualized harm, even when secondary harms are not immediately obvious. The Center on Privacy and Technology at Georgetown Law emphasizes that once personal data is exposed, there is always a chance of later misuse that exceeds social norms or user expectations, violates user rights, and undermines trust in technology and in the organizations handling data.

Privacy risks go beyond actual or expected monetary, physical, or psychological harm. A multitude of additional harms can stem from privacy violations. An incomplete list includes identity theft, re-identification, sensitive inferences, exploitation, reputational harm, stigmatization, embarrassment, limitations on choice and awareness of opportunities, significant inconvenience and loss of time, unfair price discrimination, limited employment and social prospects, adverse eligibility determinations relating to housing and health care, and other forms of discrimination. CDT recommends adopting the list of risks compiled by NIST, which includes diminished capacity for autonomy and self-determination, discrimination, and lack of trust.

Public Knowledge notes that many privacy harms are difficult to prove, which makes it hard to accurately calculate damages associated with unauthorized access. This makes it easy for organizations to externalize the cost of consumer privacy protection, forcing the public to bear the burden of privacy violations. To force organizations to internalize those costs, “reasonable” harm must be defined according to consumers’ perception of reasonableness. Another way to overcome the externalization problem and provide redress for privacy violations is for legislation to avoid requiring a showing of monetary loss or other tangible harm. Liquidated damages, used in situations where damage is real but difficult to quantify, should be available; the Cable Communications Policy Act already makes them available for privacy violations. These measures would incentivize organizations to better account for and mitigate potential harms, encouraging the design of products, services, and business models that avoid harm to individuals and groups and leading to better protection of personal information.

Public Knowledge also draws attention to the fact that even when privacy violations do not result in tangible, measurable harm to specific individuals or groups, they may cause collective societal harms. Such harms may include the exacerbation of informational disparities, undermined public trust in democratic institutions, distortions of the public record, misinformation, and Cambridge Analytica-type psychographics. Leveraging individuals’ digital footprints, organizations develop psychographic profiles to infer personality traits and make predictions, which are then used to micro-target consumers with advertisements or to micro-target persuadable voters during elections.

To address societal privacy harms, and to promote the fair allocation of privacy benefits, Public Knowledge proposes that methodologies should be developed to assess the human rights, social, economic, and ethical impacts of data processing outputs. This broader view of risk management would consider the advancement of values such as equity, fairness, community, competition, efficiency, and innovation. These impact assessments should be required for companies that engage in high-risk data practices. Rather than allowing organizations to conduct such impact assessments internally, a data protection authority would be better placed to carry out these evaluations and provide oversight.

Consumers Union’s comments draw attention to the subjective nature of privacy risk assessments and the danger of privacy protections being contingent upon internal risk evaluations that are “necessarily biased towards [an organization’s] own interests.” Many other stakeholders also urged NTIA to abandon its risk-based approach to consumer privacy. As an alternative, a rights-based approach would offer better consumer protection. The concept dates back to the Fourth Amendment of the Constitution, underscoring the long history of providing strong privacy protections to citizens. A framework that conditions privacy obligations upon the outcome of a risk assessment is incompatible with the traditional U.S. approach of individual ownership and control. Baseline obligations should automatically attach when personal information is collected, retained, used, or shared. Consistent with existing federal legislation such as the Wiretap Act, protections ought to apply per se rather than depend on the outcome of an evaluation of potential harms and benefits. By offering strong privacy protections and clear, predictable rules, this approach would also advance NTIA’s high-level goal of promoting legal clarity while maintaining flexibility to innovate.

In addition to risk management, NTIA identifies transparency and control as desired privacy outcomes. Transparency about data practices is a challenge in the age of big data, and for consumers to feel comfortable adopting and using online services, the growing atmosphere of privacy concern must be addressed seriously. In the next blog of this series, we will explore ways in which organizations may effectively (and not so effectively) approach user notice and control over the collection and use of personal data online.


The author completed her undergraduate degree in law at Queen Mary University of London and her Master of Laws at William & Mary. She has focused her career on privacy and data security.