National Consumers League

Developing a pro-consumer approach towards privacy and data security—user expectations


By NCL Google Public Policy Fellow Pollyanna Turner-Ward

This blog post is the fifth in a series of blogs offering a consumer perspective on developing an approach towards consumer privacy and data security.

This commentary is the product of a deep dive into the National Telecommunications and Information Administration’s (NTIA) September Request for Comments (RFC), a key part of the process that informs the government’s approach towards consumer privacy. Stakeholder responses to the RFC provide a glimpse into where consensus and disagreement lie on key issues among major consumer and industry players.

Today, data originating from individuals powers the internet, fueling new services and innovation. However, the information leaks that accompany this activity also put consumers at unprecedented risk of identity theft and fraud. Previous blogs in this series have explored how organizations may meaningfully offer controls to consumers, as well as the factors the NTIA proposes organizations consider when designing business models, products, and services and when deciding which defaults and controls to offer consumers. Those factors include the risk of privacy harm and information sensitivity, along with user expectations and the context of the data flow. The next two blogs will, therefore, explore the latter two factors in more depth.

User expectations 

Policymakers need to better explain how an organization’s privacy obligations can be tied to user expectations. Put simply, consumers want their data to be used in ways that benefit them, not in ways that harm them, and they want that data to be protected. As described by the Center for Democracy and Technology (CDT), user expectations rest on several subjective variables, such as the level of trust and the perceived value received from the use of information. Data security and privacy risks exist only so long as information is stored. Consumers should be able to use online products and services without fear that their information will be excessively collected, processed, or shared. Rather than leaning too heavily on the principle of user control, Consumers Union suggested that the onus of good data practices should be shifted onto organizations by strengthening the principle of data minimization. This would be more consistent with user expectations, and it would also go some way toward increasing consumer trust.

Companies currently face very few data practice limitations. The NTIA embraces “reasonable” data minimization, but it declines to specifically state how personal data may or may not be used. This affords organizations flexibility in deciding which control measures and techniques to employ, depending on factors such as information sensitivity and contextual data usage. However, the Center on Privacy and Technology at Georgetown Law urged NTIA to add purpose specification to its list of desired privacy outcomes. Most can agree that certain data uses—such as discriminatory uses or purposes with overly vague descriptions—should simply not be allowed.  

To bring about real change in corporate practices, the Center for Digital Democracy noted that accountability and equitable, fair, and just uses of data should be advanced. Organizations should be encouraged to process personal information with the aim of providing value for individuals and society while minimizing individual and collective risks of harm. They should proactively analyze and document their data requirements when designing methods to enhance consumer experiences, and such evaluations should specifically outline what the organization hopes to accomplish by collecting and processing personal data. Because it is difficult to define which practices may be considered “reasonable,” the Internet Association stated that this area may benefit from the development of industry guidelines.

Organizations need flexibility in order to develop engaging and innovative products and services. Technologies such as machine learning (ML), from which transformative and important insights may be gleaned, often require large data sets. To reduce potential biases and to increase the utility of ML applications, large and relevant training data sets are often necessary. As a society, our collective aim must be to advance emergent technologies for meaningful, rigorous, and ethical purposes without handcuffing companies. Innovative solutions that ease the tension between these data-hungry technologies and data minimization as a control on collection and use should therefore be rewarded. The question is how and when the technology can be used safely to advance beneficial purposes.

Rather than thinking of data minimization merely as a limit on the amount of data a company can collect and use, companies can draw on a variety of techniques, such as decentralizing the storage and use of information, additional security safeguards, aggregation, privacy-enhancing technologies (PETs), the use of synthetic data, and effective de-identification. As highlighted by Public Knowledge, the use of differential privacy, federated learning, and machine learning techniques that do not require such large data sets ought also to be encouraged.
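To make one of these techniques concrete, the sketch below illustrates the basic idea behind differential privacy, one of the approaches Public Knowledge highlights. It is a minimal, hypothetical Python example (the data, threshold, and epsilon value are illustrative only, not drawn from any organization’s practice): noise calibrated to a query’s sensitivity is added before a statistic is released, so an aggregate insight can be shared without exposing any individual’s record.

```python
import numpy as np

def dp_count(values, threshold, epsilon=1.0):
    """Release a differentially private count of values above a threshold.

    A counting query changes by at most 1 when any single person's record
    is added or removed (sensitivity = 1), so Laplace noise with scale
    1/epsilon is enough to satisfy epsilon-differential privacy.
    """
    true_count = sum(v > threshold for v in values)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: an aggregate statistic can be shared without
# revealing whether any particular individual appears in the data.
ages = [23, 35, 41, 29, 52, 60, 19, 44]
print(dp_count(ages, threshold=40, epsilon=0.5))
```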

User expectations and power differentials  

No conversation about user expectations would be complete without consideration of how online user experiences are shaped by industry. Online platforms and other providers establish the means and methods to access their products and services. Power asymmetries exist between consumers and those who build, design and commercialize the digital world. Consumers have little choice but to use services provided by a handful of dominant providers (such as Google, Facebook, and Amazon) that have become integrated with everyday communications.   

Due to the power of network effects, many consumers waive their privacy rights by simply ticking “I agree” to consent requests. This undermines the requirement of consent prior to data collection or processing under the General Data Protection Regulation (GDPR). The California Consumer Privacy Act (CCPA) will give consumers the more granular option to restrict the sale of their information to third parties. While transparency and user control are important aspects of a comprehensive privacy framework, Microsoft noted that the sheer volume of processing decisions and lack of information make it difficult for individuals to protect themselves alone. Modern data practices combined with a myriad of settings and opt-outs undermine individuals’ capacity to exercise control via privacy self-management.   

The Federal Trade Commission (FTC) has previously noted that unfair practices exist online where activities prey on vulnerable consumers, involve coercive conduct, and create significant information deficits. The NTIA should consider whether and how consumer rights can be waived, especially where there is unequal bargaining power between consumers and businesses. CDT described how unfair design defaults and aggressive notifications pervade the digital ecosystem, manipulating consumers and undermining their privacy. Consent is often coerced through manipulative design choices or as a condition of service for platform access. As highlighted by the Electronic Frontier Foundation, companies are becoming skilled at steering users to share personal data.

To tackle these problems, companies should be encouraged to protect consumers by deploying tools that empower them to manage their data throughout their online experience. The FTC recently brought an enforcement action against Venmo for offering deceptive privacy settings. CDT argued that more enforcement against unfair design is necessary, along with further exploration of user experience, user interface, backend design, and how to cabin “dark patterns.” In addition, Public Knowledge argued that services should not be contingent upon the acquisition of data that is not necessary to render the service. This would better enable consumers to exercise privacy controls without fear of discriminatory treatment. To promote meaningful control, Consumers Union opposed take-it-or-leave-it terms of service, differential pricing, pay-for-privacy, and other unfair terms that exacerbate the unbalanced relationship between consumers and companies. It pointed to the discriminatory effects of requiring individuals to pay more, or providing them with lower-quality goods or services, if they do not agree to waive their privacy rights. Pay-for-privacy can also make product costs less transparent by frustrating consumer efforts to comparison shop.

Unfair secondary and tertiary data uses are another area of questionable online practices. Often stemming from the involvement of third parties with whom individuals have no direct relationship, many consumers feel that this is the most objectionable part of the new data-driven economy. The next blog of this series will look at how the context of the transaction or data flow may be considered by organizations when deciding which defaults and controls to offer consumers and when designing business models, products and services.

Click the following link to read the sixth, and final, entry of this series.

The author completed her undergraduate degree in law at Queen Mary University of London and her Master of Laws at William & Mary. She has focused her career on privacy and data security.