By NCL Google Public Policy Fellow Pollyanna Sanderson
This blog post is the third in a series offering a consumer perspective on developing an approach to consumer privacy and data security.
This commentary is the product of a deep dive into the National Telecommunications and Information Administration's (NTIA) September Request for Comments (RFC), a key part of the process that informs the government's approach to consumer privacy. Stakeholder responses to the RFC provide a glimpse into where consensus and disagreement lie on key issues among major consumer and industry players.
Data practice transparency is a challenge in the age of big data analytics. For consumers to feel comfortable adopting and using online services, the growing atmosphere of privacy concern must be addressed seriously. Although transparency and consent alone are not enough to protect consumers, they are nevertheless important components of a comprehensive approach to privacy and data security. In its RFC, NTIA notes that organizations should make adequate disclosures and controls available in ways that allow users to exercise informed decision-making. To empower consumers to effectively assess risk and express privacy preferences, they must be reasonably informed about how organizations collect, use, share, and secure their personal information. This raises two questions: What information should organizations disclose, and how?
What information should organizations disclose?
Making the right information available is essential to fostering and maintaining consumer trust. It would also go some way toward empowering consumers to make individual determinations of the benefits of choosing to use services and technologies. The California Consumer Privacy Act (CCPA) grants individuals the "right to know." This includes a requirement for organizations to provide notice to California residents prior to the collection and use of personal information. California residents are also granted the right to request disclosure of the categories of, and the specific, data collected about them. However, transparency would be more effective if, rather than requiring consumers to request disclosures, organizations were required to proactively provide consumers with clear and comprehensive information in a more granular manner.
The National Consumers League (NCL) believes that organizations should make disclosures about what data they retain, process, and share, and for what purposes. When organizations share information with third parties, those third parties should be specifically identified, along with the nature of the relationship. Organizations should effectively notify consumers of any updates to data practices and of the rights they may exercise. As Public Knowledge has commented, other disclosures are also important, such as whether the data is identified, identifiable, or pseudonymous; whether the information is protected with industry-recognized best practices and standards; and whether the data is subject to automated decision-making. Microsoft also commented on the importance of providing meaningful disclosure of the logic involved in, and the significance and envisaged consequences of, automated decision-making.
How should organizations disclose this information?
American consumers desire greater control over their data, as demonstrated by a recent survey in which 71 percent of consumers reported downloading software to protect their data privacy or otherwise help them control their web experience. For American consumers to have greater control over their data, they first need to know and understand what is being done with it. To promote the goal of informed consumer consent, the U.S. may take inspiration from the European Union's (EU) General Data Protection Regulation (GDPR), which requires organizations to provide information "in a concise, transparent, intelligible and easily accessible form, using clear and plain language." This may include simplified language, clear headings, visualizations, videos, examples, and language translation software to support the many Americans whose first language is not English.
An excellent suggestion by the Center for Democracy & Technology (CDT) is for the NTIA to explore "nutrition label" style approaches to privacy policies. This approach would entail conveying salient information to consumers in plain language on a single cell-phone screen. Design choices are critical to promoting meaningful and effective notice and control. To avoid overloading users, the FTC is a proponent of context-specific disclosures made at the point at which consumers make decisions about their data. Microsoft's comments describe how organizations must design user experiences so that individuals receive relevant notifications at the right times and in the most effective and actionable way, without unnecessary disruption to their online experience.
To facilitate easy-to-understand privacy choices, many stakeholders agree with NTIA that organizations should continually consider how the average user interacts with a product or service and maximize the intuitiveness of how it conveys information. Intuitive navigation and settings would better enable consumers to make informed decisions about their accounts as they learn about an organization's data practices. In addition to context, the design of choice mechanisms should also reflect user expectations. For instance, controls used to withdraw consent should be as readily accessible and usable as those used to permit the activity in the first place.
The Internet Association (IA), a trade association that represents leading global internet companies, commented that the NTIA should provide lawmakers and regulators with guidance and examples of ways to achieve transparency and control aside from privacy policies. As highlighted in the FTC Staff Report on the Internet of Things, these may include just-in-time notices, set-up wizards, user command centers, dynamic privacy dashboards with contextual explanations and choice-management functionality, and in-product settings. For example, almost 2 billion people have visited their Google Account Controls, where they can find their privacy and security settings in a single place.
Organizations should be held accountable for their design choices. This would give them an incentive to continually improve their tools, making them more robust and intuitive and leading to choices that better reflect consumer preferences and risk. The FTC noted that, under an improved disclosure system, it could promote accountability by exercising its authority to challenge deceptive disclosures. To demonstrate a valid and credible decision-making process in designing online experiences, Microsoft suggested that organizations should be obliged to conduct and document proactive risk assessments during the design stage. These assessments should demonstrate consideration of the context of data processing, the parties affected, and the purpose or desired effect of disclosure.
So, where does all this leave privacy policies?
For legal and accountability purposes, organizations should still make detailed disclosures at the point of interaction. Public Knowledge highlighted how privacy policies not only help organizations understand their own information practices, but also give consumers, advocacy groups, watchdogs, the press, and regulators key information with which to assess and scrutinize those data practices. However, Consumers Union warned that accountability is currently undermined by the vague and expansive nature of privacy policies, which provide little reliable, concrete information about actual practices. To help consumers make more informed decisions in the marketplace, Consumer Reports created the Digital Standard to test products for privacy and security. The success of this effort, however, depends on the transparency of privacy policies and user interfaces. To promote accountability, organizations must enable outsiders to understand and assess their data practices through transparent, explicit disclosures that are full and complete statements of practice.
The next blog of this series will unpack issues associated with organizations distinguishing between sensitive and non-sensitive information in the context of building privacy protections into products and services. Problems that arise are twofold. First, the distinction is increasingly questionable due to the impact of big data analytics and machine learning (ML) technologies. And second, the distinction is controversial because information sensitivity is a highly personal, subjective, and arbitrary concept. The next blog will examine these problems and solutions proposed by stakeholders.
The author completed her undergraduate degree in law at Queen Mary University of London and her Master of Laws at William & Mary. She has focused her career on privacy and data security.