Privacy and the Dark Side of Control

Is control over our data enough to ensure privacy?

To hear some in industry and government tell it, the answer to our modern privacy dilemma is simple: give users more control. There is seemingly no privacy-relevant arena, from social media to big data to biometrics, that cannot be remedied with a heaping dose of personal control. Facebook founder and CEO Mark Zuckerberg said, “What people want isn’t complete privacy. It isn’t that they want secrecy. It’s that they want control over what they share and what they don’t.”[1] Microsoft CEO Satya Nadella summarized his company’s focus on user control by stating, “Microsoft experiences will be unique as they will…keep a user in control of their privacy,” adding that the company will “[a]dvocate laws and legal processes that keep people in control.”[2] Google asserts that it “builds simple, powerful privacy and security tools that keep your information safe and put you in control of it.”[3]

Privacy regulators love the concept of control, too. The Federal Trade Commission, one of the chief privacy enforcers in the US, adopted it as a regulatory beacon.[4] “Consent”—one form of effectuating control—is the centerpiece of the European Union’s entire General Data Protection Regulation. It legitimizes all kinds of personal data practices.[5] A few foundational privacy theories hold that the essence of privacy is control over personal information. Privacy scholar Alan Westin defined privacy as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.”

It’s time to take a step back. It’s a mistake to assume control alone can solve the modern privacy dilemma. While control is vital, people can only do so much in a day. We are being stretched too thin, and it is making us vulnerable. Companies use the appeal of personal control over data as a way to shift the risk of data collection and processing back onto data subjects. This is dangerous because people are not in the best position to understand the risk of harm from sharing personal information. Nor can people exert meaningful control over the vast, byzantine data ecosystem that can require thousands of ongoing decisions about how their data is collected, used, and shared. Control in excess of our abilities is broken. It’s time we treated an endless buffet of control over personal information as too much of a good thing.

The Bandwidth Problem

There are many jokes about whether anyone reads privacy policies, but these jokes rest on the truth that meaningful control over information is almost impossible to scale. Researchers have estimated that if an ordinary Internet user were to quickly read every privacy policy they encountered over the course of a year, it would take them 76 working days to finish up.[6] From the second we boot up any device, we’re gifted with “control” over information in the form of privacy policies to read, terms of use to agree to, and pop-up banners to click—for each and every website or app we visit or use. Eventually our eyes gloss over and that control does us little good. Our critical faculties just crash, and we give up.



"The weight of too much control might crush us. It could leave us bewildered and hopeless and agreeable to anything.."


I am not trying to belittle the idea of control. Quite the opposite. Control is essential. It furthers foundational human values like autonomy and dignity. People crave it.[7] They need it. The problem is industry and governments treat control as if it were an infinite resource. It’s a little like the problem of trying to remember all of your passwords. You may be able to remember a few, but it is almost impossible to remember them all. Control is far too precious for companies and lawmakers to over-leverage it. But this is what happens when the pursuit of control becomes the main or only way companies address privacy.

Simply giving people choices without context, supplemental protections, and substantive promises is likely to mislead or endanger consumers. People might feel falsely empowered by opportunities to restrict the collection and use of their personal information without any meaningful reductions of risk or a clear sense of how information is being collected and used. Industry attempts to improve privacy notices are admirable but for naught if people are still buried in a pile of options from all the different technologies they use.

“Here, You Take the Risk”

Even when control is effective, it can act to shift the burden of responsibility for protecting privacy to people who are less equipped to handle it. Control over personal information might sound attractive in isolation. Who doesn’t want more power over things that affect our lives? But with this power often comes a practical obligation. If you do not exercise that control, you are at risk. Companies can take your inaction as acquiescence. So while you might remember to adjust your privacy settings on Facebook, what about Instagram, Twitter, Google, Amazon, Netflix, Snapchat, Siri, Cortana, Fitbit, Candy Crush, your smart TV, your robot vacuum cleaner, your WiFi-connected car, and your child’s Hello Barbie?

A recent Pew study found that mobile apps can seek over 235 (!) different types of permissions from smartphone users, with the average app asking for around five different permissions to access and use data.[8] Will anyone even be able to figure out what all of the settings mean or where they are, much less remember to look for them? Is there even a remote possibility people will be able to be consistent with their choices? Or will this all just devolve into blindly poking at buttons or just accepting the default? And all this work is just for the initial setup. When the settings and permission options change, as they often do, people might need to log back in and change them again.

In the aggregate, the weight of too much control might crush us. It could leave us bewildered and hopeless and agreeable to anything. Privacy policies become “anti-Privacy Policies” because companies know that we will never read them. The default settings for privacy controls are permissive, because companies know that we do not usually change them. Control is a vital resource. But it is also a scarce one that is easily diluted. Prioritizing control hides the power imbalances inherent in our modern mediated lives. Instead, our privacy rules should seek to preserve and maximize control for when it can be the most effective and hold data processors accountable for the rest.


This essay is adapted from a longer article titled “The Inadequate, Invaluable Fair Information Practices,” 76 Maryland Law Review 952 (2017), which can be accessed here.[9] A more detailed discussion on control and the design of information technologies can be found in Privacy’s Blueprint: The Battle to Control the Design of New Technologies, forthcoming 2018 from Harvard University Press.

[1] Michael Zimmer, Mark Zuckerberg’s Theory of Privacy, Wash. Post (Feb. 3, 2014), (quoting a 2010 interview by Time Magazine with Mark Zuckerberg).

[2] Data Privacy Day 2015—Putting People in Control, Microsoft Corp. Blogs (Jan. 28, 2015), (quoting an e-mail from Satya Nadella, CEO Microsoft, to Microsoft employees).

[3] Guemmy Kim, Keeping Your Personal Information Private and Safe—And Putting You in Control, Official Google Blog (June 1, 2015),

[4] Fed. Trade Comm’n, Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers 20 (2010),


[5] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation),

[6] Alexis C. Madrigal, Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days, Atlantic (Mar. 1, 2012); see also Aleecia M. McDonald & Lorrie Faith Cranor, The Cost of Reading Privacy Policies, 4 ISJLP 543 (2008).

[7] Mary Madden and Lee Rainie, “Americans’ Attitudes About Privacy, Security and Surveillance,” Pew Research Center (May 20, 2015), (“When they are asked to think about all of their daily interactions – both online and offline – and the extent to which certain privacy-related values are important to them, clear majorities say these dimensions are at least ‘somewhat important’ and many express the view that these aspects of personal information control are ‘very important.’”).

[8] Kenneth Olmstead and Michelle Atkinson, “Apps Permissions in the Google Play Store,” Pew Research Center (Nov. 10, 2015),
