VI. CONCLUSION: REFRAMING PRIVACY’S MEANING
The last question we asked focus group participants concerned the future of privacy. We asked each group to describe, in just a few words, what they expected from this future. While the responses varied, the word most frequently associated with privacy’s future was “meaningless.” It is noteworthy that this word was used not only by Defeatists dismayed by a future without privacy but also by Futurists who championed the benefits of this new world. Across the Resignation Curve, nearly every person cast doubt on the
111 Focus Group 1 (Seniors), Rutgers University (Sept. 13, 2019) (on file with author).
112 Focus Group 5 (Young Adults), Rutgers University (Sept. 23, 2019) (on file with author).
113 Focus Group 6 (Middle Age), Rutgers University (Oct. 6, 2019) (on file with author).
114 Focus Group 6 (Middle Age), Rutgers University (Oct. 6, 2019) (on file with author).
meaning or purpose that privacy might serve in an increasingly digital society.
We have already discussed literary, philosophical, and legal conceptions of privacy at length.115 Reflecting on the totality of the data we gathered, however, it is worth emphasizing a phenomenon also discussed by Solove in Nothing to Hide: the lack of a prevailing consensus around any single conception of privacy or its alleged values.116 This may seem strange given the final responses of our focus group participants—how could individuals lament (or even celebrate) the loss of privacy’s meaning when that meaning was never entirely clear in the first place?117
The notion that privacy conceptions are ephemeral and amorphous in practice is supported by our focus group participants, who often struggled to give coherent responses when asked what the term “privacy” meant to them.118 Only when further prompted could these individuals attempt to outline any values placed on privacy, and they did so primarily by identifying the types of information they sought to keep private. Even then, the value respondents assigned to these types of data was purpose-specific and was not generally associated with higher ideals involving privacy itself. For instance, for the most commonly cited categories—financial and health data—respondents explicitly sought to keep the information private out of fear of the financial loss its exposure might cause.
Among the variety of responses on this topic, however, focus group respondents consistently raised one theme, if not a clear definition. In every focus group, the theme of control over one’s personal information—or more commonly, the lack thereof—was cited in discussions of respondents’ conceptions of privacy.119 Although these discussions often boiled down to a simple “feeling,” that feeling was undoubtedly the sense of being in control over one’s personal information; irrespective of whether a respondent was accepting of his or her information being shared, he or she wished to have a say in that decision. Upon reviewing these responses and their implications within the larger context of privacy in the United States and abroad, it became clear
115 See supra Part II.
116 See SOLOVE, supra note 42, at 24–26.
117 See generally Daniel J. Solove, Conceptualizing Privacy, 90 CALIF. L. REV. 1087 (2002) (discussing changing conceptions of privacy over time).
118 See infra APPENDIX A.
119 See Auxier et al., supra note 49. Similar to our findings here, data from the Pew Privacy Study showed that respondents’ conception of privacy is heavily skewed toward the idea of control over their personal information. Id.
that reconceptualizing privacy around a more nuanced notion of control may be a worthwhile thought experiment to conclude our exploration of privacy in the public eye.
A. Condition vs. Choice: The Privacy Paradox

Reconceptualizing privacy around the concept of personal choice offers a new resolution to a paradox surrounding the left half of the Resignation Curve.120 Defeatists lament their loss of privacy while simultaneously sharing their information with Google, Facebook, Amazon, and the like. Many Defeatists themselves attribute this inconsistency to an overwhelming feeling of resignation: with privacy having “lost its meaning,” many respondents have acquiesced to exchanging privacy for a variety of benefits. Although nearly all respondents agreed that these tradeoffs had immense value, many—especially the Defeatists—felt as though they did not always retain control over which tradeoffs to make and the extent to which their own privacy should be exchanged for the corresponding benefits. It is quite possible that when respondents lamented their loss of privacy, they were actually lamenting their diminished control over the decision to be private (or not) rather than the actual state of being private itself.
One could argue that individuals still retain complete control over whether to share personal information. Our TOS agreement discussion,121 however, serves as a counterargument. On paper, the free market appears to grant individuals limitless choices as to which data-collecting services to use, if any at all. Indeed, many would likely point to the ability of privacy-concerned individuals to abstain entirely from these services as proof that people still retain some level of control over their private information. Irrespective of the realistic feasibility of total abstention, the perception our focus group respondents held was clear: they felt they had no choice but to use certain products—such as Google’s search engine—and to agree to the accompanying TOS contracts. Respondents articulated that abstaining from internet services altogether would preclude them from participating in society as the average person does. It is this feeling—the belief that one must agree to TOS contracts or face societal ostracism—that is central to privacy’s loss of meaning in the public eye.
120 See supra p. 1443.
121 See supra Part IV.B.
As became apparent at the beginning of each focus group, respondents’ relationship to privacy was defined not by privacy itself but by its competing interests. In the eyes of many respondents, the compelling nature of these tradeoffs has essentially forced their hand in a variety of situations, thereby eliminating any feeling of control over their information. Although technology has provided a new impetus for this exchange in the twenty-first century, individuals’ desire to trade privacy for a competing interest is by no means a new phenomenon. In None of Your Damn Business, Lawrence Capello provides evidence that Americans were willing to exchange privacy for competing interests as early as the Gilded Age.122 Capello argues that the current state of privacy was not the inevitable result of technological progress, outlining several key moments throughout American history in which privacy was pitted against a competing interest—and lost.123 This analysis appears to reveal an unspoken truth: perhaps individuals never truly cared about the actual state of being private or anonymous.
While it may be difficult to gauge public sentiment in the past, the loss of privacy’s meaning today may well be attributable to the romanticization of a privacy-devoted world that never existed, a world in which everyone chose anonymity without the fear of missing out. In actuality, there were simply fewer opportunities to exchange privacy for competing interests in the past than exist today, due in large part to the advancement of consumer technology. Previous conceptions of privacy therefore had no impetus to distinguish the state of being private from the decision to be private, for this was once a distinction without a difference. Today, however, shifting emphasis to the decision to be private could provide a privacy framework that accounts for individuals’ desire for agency over their personal information while also acknowledging their desire to occasionally share that information.
To speak in terms of our frog metaphor, who could blame people for wanting warm water? As a Futurist might claim, society has now been ushered from a technological ice age into a paradise of information enlightenment. Older conceptions of privacy, such as Westin’s, which rely heavily on promoting the benefits of anonymity as a principal component of privacy, may seem somewhat tone-deaf in a world where over two billion individuals have Facebook accounts.124 These previous
122 CAPELLO, supra note 95, at 5–6.
123 Id. at 3–4.
124 J. Clement, Number of Monthly Active Facebook Users Worldwide as of 4th Quarter 2019, STATISTA (Jan. 30, 2020), https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide.
conceptions, which focus more on the condition of being private than on the choice to enjoy that condition, are perhaps incompatible with a world defined not by what is withheld but by what is shared.125 From this perspective, it is unsurprising that respondents were unable to describe what exactly privacy meant to them: traditional conceptions of privacy, which emphasize anonymity, are not easily reconciled with individuals’ willingness, and in some instances desire, to share their information.
This inability to reconcile contemporary norms with amorphous, anonymity-based conceptions of privacy may be one reason individuals feel a lack of control. Because traditional definitions of privacy emphasize the actual state of being private rather than the decision to retain that state (to whatever extent one chooses), individuals may tend to conceptualize sharing-abstinence as a more legitimate form of privacy instead of seeking out responsible ways to share information.126 This feeling of not having control is likely furthered by the need to seek out these responsible means rather than having them implemented as a legislative standard.127
B. What’s the Point? The Purpose of Privacy
Advocates who champion privacy as an important part of human dignity may criticize a conception of privacy that de-emphasizes the actual condition of being private. Political theorists such as Westin, for instance, have specifically cited the anonymity granted by privacy as a contributing factor in securing a person’s dignity.128 But fears that a model of privacy based on sharing, rather than withholding, would undermine individuals’ ability to maintain their dignity are unfounded for two reasons: the weak association between privacy and dignity in respondents’ minds, and the dignity that is still maintained through choice.129
125 But see Julie E. Cohen, What Privacy Is For, 126 HARV. L. REV. 1904, 1906 (2013) (providing an alternative analysis that argues in favor of older conceptions of privacy).
126 For example, limiting which smartphone applications have access to certain kinds of data (i.e., location, Bluetooth, etc.), or restricting the ability of software to access data altogether.
127 For example, the European Union and California have implemented measures that may assist in granting users greater control over their data. These measures are further explored in a later section. See infra Section VI.B.
128 WESTIN, supra note 11.
129 See Auxier et al., supra note 49. A majority of participants in the Pew Privacy Study responded that the development of new tools allowing individuals greater control over their personal information would be a more effective way to protect that information. Id.
Throughout the focus groups, the purpose of privacy was discussed at great length. Many, but not all, respondents cited the benefits privacy offered in protecting against potential harms,130 such as identity theft or other forms of financial loss. Hardly any respondents reported believing that privacy was an end in and of itself. No respondents offered “human dignity” as a value of privacy that could compete against other interests such as convenience or security.131 The closest the discussion came to this topic were the instances in which individuals expressed concerns about government eavesdropping, but even then, these concerns were met with “I have nothing to hide” claims from many other respondents. Almost all respondents acknowledged, and even accepted, that today’s society is defined more by cost/benefit calculation than by an insistence on the primacy of human dignity.
This is not to claim that rhetoric surrounding human dignity has no place in a new conception of privacy. Rather, the lack of salience of the human dignity justification helps explain why privacy has been so heavily eroded in the United States. One reason privacy lost the battles outlined by Capello132 may be that the human dignity element of remaining anonymous was never that compelling to Americans; given the advent of new technology and limitless information sharing, the human dignity argument may now be less compelling than ever. This is especially true given the ties between traditional conceptions of privacy, human dignity, and anonymity, or “the right to be let alone.”133 To the extent that people wish to share more information than ever before, it is perhaps unsurprising that these arguments have failed to sustain privacy throughout the history of the United States.
Instead of associating human dignity with anonymity, as older conceptions of privacy do, dignity ought to be tied to choice. It is the ability to decide whether or not to be private, and to what extent, that provides individuals with a sense of self-respect and worth, rather
130 An interesting point of contention from the focus groups was the extent to which these harms were actually realized. While many focus group respondents used services that have, at one point, been electronically compromised in some way, only a few cited cases in which they felt personally victimized by a violation of their privacy due to a company being hacked or otherwise storing data in an insecure fashion. The extent to which these harms go unrealized may contribute to claims that privacy concerns are often blown out of proportion.
131 See generally James Q. Whitman, The Two Western Cultures of Privacy: Dignity Versus Liberty, 113 YALE L.J. 1151, 1164–70 (2004) (comparing the United States’ emphasis on liberty to European emphasis on dignity for issues involving privacy).
132 CAPELLO, supra note 95, at 6.
133 Warren & Brandeis, supra note 7, at 139.
than the content of the decision itself. For instance, a person who elects to enjoy all of the interests that compete with privacy, at the cost of sharing personal information, retains no more or less dignity than a person who chooses to share nothing—so long as both individuals had a choice in the matter.
Of course, creating and maintaining this choice is much easier said than done. It may be tempting to identify the Pragmatist group of the Resignation Curve134 as the set of individuals who best exemplify this new conception of privacy. After all, these were the respondents who had already begun taking measures135 in the hope of gaining better control over access to their private information. Despite this observation, we would caution against turning Pragmatists into normative role models for the rest of society if a robust conception of privacy is to be preserved. The true takeaway from the Pragmatist section is that this type of behavior flourishes due, in large part, to the absence of other privacy protections. Perhaps these individuals would be less inclined to engage with technologies that grant them greater control over their privacy if they believed that this control could be exerted through other means, such as legislative provisions compelling companies to build such controls into their services.
Certain jurisdictions have sought to implement legislation designed to grant their constituents greater control over their personal data. The European Union’s General Data Protection Regulation (GDPR), which took effect in May 2018, aims to protect all residents of the EU, meaning anyone living in the region falls under the same protective umbrella as citizens.136 To achieve this goal, all companies with an internet presence in the EU must comply with its regulations, including American businesses that operate European websites.137 A second fundamental change resulting from this
134 See supra p. 1443.
135 See supra p. 1448.
136 Juliana De Groot, What Is the General Data Protection Regulation? Understanding & Complying with GDPR Requirements in 2019, DIGITAL GUARDIAN (Sept. 30, 2020), https://digitalguardian.com/blog/what-gdpr-general-data-protection-regulation-understanding-and-complying-gdpr-data-protection.
137 While companies may have implemented certain measures worldwide, GDPR provisions only protect EU residents. Aarti Shahani, 3 Things You Should Know About Europe’s Sweeping New Data Privacy Law, NPR (May 24, 2018), https://www.npr.org/sections/alltechconsidered/2018/05/24/613983268/a-cheat-sheet-on-europe-s-sweeping-privacy-law (stating that U.S. citizens are not necessarily entitled to the same protections afforded to EU residents: “[i]n Europe, Facebook has to get permission to do facial recognition—and it’s not the default setting. But in the U.S., it is. American users have to click through screens to opt out”).
legislation is an alteration of the definition of “personal data” and, accordingly, of what data fall within these protections.138 Some examples of categories of data not previously included are social media posts, electronic medical records, mailing addresses, IP addresses, and GPS locations—all of which are now protected.139 These foundational alterations are crucial to understanding the legislation’s greater implications, as they upend previously cast-in-stone beliefs about what “should” or “should not” be considered private.140
The GDPR contains several provisions designed to grant internet users greater control over their privacy. For instance, to comply with the GDPR, companies must adopt an opt-in style of data collection for any online services that track users’ information, with the goal of increasing users’ awareness of, and transparency regarding, the information being collected.141 Furthermore, the GDPR empowers EU users to request that companies delete the personal data they collect “without undue delay” or face potential penalties under the law.142 Other critical components of the legislation include: “[r]equiring the consent of individuals for data processing[;] [a]nonymizing collected data to protect privacy[;] [p]roviding data breach notifications[;] [s]afely handling the transfer of data across borders[;] [and] [r]equiring certain companies to appoint a data protection officer to oversee GDPR compliance.”143 Together, these provisions give consumers an enormous amount of control over their data compared to the “wild west” of the internet as it previously existed in Europe, and as it continues to exist throughout much of the United States, with some notable exceptions.144
138 Id.
139 Id.
140 See Lior Jacob Strahilevitz, Toward a Positive Theory of Privacy Law, 126 HARV. L. REV. 2010, 2033–39 (2013) (discussing the potential implications of these new privacy classifications).
141 Shahani, supra note 137.
142 Id.
143 De Groot, supra note 136.
144 See generally Paul M. Schwartz, Global Data Privacy: The EU Way, 94 N.Y.U. L. REV. 771, 786–87 (2019) (comparing the GDPR with other legislative approaches taken around the world, primarily in Japan and the United States).
California presents what is likely the most notable of these exceptions. As of January 1, 2020, the California Consumer Privacy Act (CCPA) provides California residents145 with enhanced knowledge of, and control over, their personal data. Inspired by the GDPR,146 the CCPA grants California residents the following rights: (1) to know what personal data is being collected about them, and by whom; (2) to know whether their personal data is being sold or otherwise disclosed, and to whom; (3) to refuse the sale of their personal data; (4) to regain and curb access to their personal data; (5) to request that businesses delete any personal data they may have collected; and (6) to not face discrimination for exercising their right to privacy.147 The CCPA also provides California residents with legal standing to sue any qualifying company that violates these provisions.148
Importantly, both the GDPR and the CCPA comport with the new conception of privacy described above, as they do not seek to restrict the quantity of information that companies can collect but instead aim to give individuals greater awareness and control of their personal information.149 Given the laws’ recency, it remains to be seen what effect, if any, these pieces of legislation will have on individuals’ behavior or general attitudes toward privacy. At the very least, the GDPR and CCPA illustrate the potential influence that governments can wield in safeguarding their citizens’ capacity to control private information.150 This influence, however, can work both ways.
145 The provisions of the CCPA apply to all residents of California and restrict any business, nonprofit or for-profit, that collects personal data of consumers, conducts business in the state of California, and meets at least one of three “thresholds.” CAL. CIV. CODE § 1798.140 (2018). These thresholds include having a gross annual revenue of $25 million or more, buying or selling the information of 50,000 or more individuals or separate households, and/or earning 50% or more of gross annual revenue from data selling. Id.
146 Although they share many similarities, there are several differences between the two pieces of legislation. The most notable is that the CCPA protects data that originated from the consumer directly, while the GDPR extends protections to cover data purchased by third parties as well. Nicholas Moline, 2019 Changed the Internet Forever, JUSTIA (Jan. 3, 2020), https://onward.justia.com/2020/01/03/2019-changed-everything.
147 See CAL. CIV. CODE § 1798.100 (2018).
148 For further discussion of the origins of the CCPA and GDPR, see Anupam Chander et al., Catalyzing Privacy Law, 105 MINN. L. REV. (forthcoming 2019).
149 Moline, supra note 146.
150 Contra Woodrow Hartzog & Neil Richards, Privacy’s Constitutional Moment and the Limits of Data Protection, 61 B.C. L. REV. 1687, 1696–97 (2020) (providing an alternative analysis of the GDPR, especially its potential shortcomings if similar legislation is applied in the United States).
The most glaring case of government influence working against privacy deserves special mention: China. Today, technology is being deployed on the Chinese mainland to increase control over the population under the guise of keeping civil order and promoting moral norms, with the most common systems coupling facial recognition with large-scale data collection.151 Approximately 300 million facial recognition cameras are being installed in train stations, at crosswalks, and on light fixtures, traffic signals, and buildings.152 Furthermore, pilot testing for a social credit system is already taking place, in which the government assigns credit scores to citizens based on their personal habits and the scores of their associates; the government treats these factors as indicators of traits the Chinese Communist Party finds desirable in its citizens.153 Currently, these pilots face technical barriers due to the sheer amount of data that must be processed,154 but this is a limitation that may not exist one day—possibly soon.
The aforementioned developments do not stop at China’s borders. Already, Chinese firms are working with foreign governments to spread facial recognition technology. Eighteen countries155—including Zimbabwe, Uzbekistan, Pakistan, Kenya, the United Arab Emirates, and Germany—are using Chinese-made monitoring systems.156 As Chinese technology expands past Chinese borders,157 Americans face the question of whether such a system could exist here and what it might look like. The answer may boil down to cultural questions and national tolerance for privacy invasions, but these are questions that people must ask while they still can.
An old adage states that “knowledge is power.” We now live in an information age with nearly limitless information available—but not all information possesses equal value. Whether the end goals of companies or governments are commercial gain or societal power, the
151 China Invents the Digital Totalitarian State, ECONOMIST (Dec. 17, 2016).
152 Id.
153 Mareike Ohlberg, Shazeda Ahmed, & Bertram Lang, Central Planning, Local Experiments: The Complex Implementation of China’s Social Credit System, MERCATOR INSTITUTE FOR CHINA STUDIES (Dec. 12, 2017).
154 Id.
155 Paul Mozur, Jonah M. Kessel & Melissa Chan, Made in China, Exported to the World: The Surveillance State, N.Y. TIMES (Apr. 24, 2019). Thirty-six countries have received training in topics such as “public opinion guidance . . . which is typically a euphemism for censorship.” Id.
156 Id.
157 Amy Hawkins, Beijing’s Big Brother Tech Needs African Faces, FOREIGN POL’Y (July 24, 2018), https://foreignpolicy.com/2018/07/24/beijings-big-brother-tech-needs-african-faces.
collection of personal data has proven to be among the most valuable means of achieving those ends. Understanding the value of this information and deciding for oneself what to do with it—whether to enjoy a state of privacy, to enjoy the seemingly endless benefits of exchanging information, or to exist somewhere in between—is among the greatest challenges for humanity in the age of information.
To refer once again to our frog metaphor, the amphibians perhaps need not make the water cooler again—convincing others to do so would be difficult, and even those who might once have preferred colder water now enjoy the warmth. Rather, the solution is to ensure that they themselves keep their hands on the faucet, ever vigilant of reaching the boiling point.