Balkinization  

Tuesday, December 10, 2024

Can Private Law Protect Privacy in Today’s Economy?

Guest Blogger

For the Balkinization Symposium on Ignacio Cofone, The Privacy Fallacy: Harm and Power in the Information Economy (Cambridge University Press, 2023).

Elettra Bietti

A few weeks ago, Carrie Goldberg, an online victims’ rights lawyer, visited my classroom. Students were attentive as she recounted her clients’ cases. Nude pictures of a victim disclosed to her work colleagues by a former boyfriend, child abuse on the site Omegle, several youth who died after buying suicide kits suggested to them on Amazon Marketplace: these were clear situations where data and privacy interferences caused extremely significant losses that courts could hardly turn a blind eye to. Many, perhaps most, of Goldberg’s cases are fought on tortious grounds. Most of them form the tip of a much larger iceberg that Ignacio Cofone, in his book, calls “privacy harms.”

Cofone wants courts to recognize and deter an entire iceberg of privacy harms that encompasses these very obvious forms of harm, but also routine instances of digitally-mediated identity theft, impersonation, behavioral modification and micro-targeting that have less clear sources and less obvious effects. His aspiration is to translate situations where privacy has purely intangible consequences into actionable torts. Take the way Google constantly monitors the online browsing behavior of Chrome users, who have no way of knowing who is tracking them and for what purposes. Each user, at some point or other, will click “I consent” to Google Chrome’s privacy policy. And yet, now that Google has shielded itself from liability, most users remain in the dark about how their browsing information is collected and used. Such opacity, according to Cofone, should give rise to tort liability. If a user later suffers from discrimination on an online marketplace or is impersonated on social media because of the information that was tracked, Google should have to compensate users despite their privacy policy and, sometimes, even in the absence of tangible loss.

Cofone’s argument that probabilistic and intangible privacy harms should give rise to compensation when these harms are connected to exploitation, that is, to private gain on the wrongdoer’s part, is novel and useful for thinking about harm in today’s complex intermediated economy. Cofone echoes much of the existing privacy literature suggesting we should move past individual and contract-based opt-ins and opt-outs, informed consent and individual ex ante choices about our privacy. He embraces the turn toward harms and fiduciary law, rejecting contract-based data governance while retaining a private law grounding. Cofone thinks we should approach privacy from the mass torts perspective: privacy generates diffuse harms and correspondingly diffuse responsibilities, and it should be possible to sue tech companies collectively and obtain compensation notwithstanding the difficulty of proving damage to each class member.

Cofone’s new book is ambitious and a very good read. My main reaction to his account is skepticism that private law and torts can help protect our privacy in today’s context. In what follows, I express some skepticism about three implications of his torts-based approach: (a) the conceptual and distributive effects of an ex post case-by-case approach as opposed to an ex ante regulatory framework, (b) the risks of delegating privacy standards to courts, and (c) the limits of a private law approach to power and domination in the surveillance economy.

First, the emphasis on ex post outcomes-based liability for privacy violations as opposed to ex ante privacy frameworks is conceptually confusing and has some counterintuitive distributive effects. The book seems to layer an ex ante/ex post dichotomy onto a different contrast between deontology and consequentialism. Let’s remember that wait-and-see ex post enforcement has been the default approach in digital settings since the birth of the internet. From “move fast and break things” to “permissionless innovation” in networks, for thirty years the question for Anglo-American lawyers was whether to intervene ex ante in a space where case-by-case ex post tortious liability was the default. Early cases such as Stratton Oakmont v. Prodigy demonstrate that tort liability was front and center in addressing digital consumer harm before section 230 and FTC privacy enforcement. Statutes and ex ante immunities began to emerge in the 1990s, and, alas, it is hard to disagree that not all of these ex ante strategies favored a more interventionist approach. In the privacy space, the FTC began to prevent privacy violations somewhat systematically only starting in the late 1990s.

Despite what Cofone describes as attempts at ex ante privacy governance, US consumers today lack meaningful ex ante privacy protections. Cofone’s argument that we must move beyond ex ante frameworks and toward an approach more focused on ex post harms therefore seems to dismiss the potential of more audacious forms of ex ante regulation and intervention: redlines on activities such as behavioral advertising or police use of facial recognition technologies, or pro-competitive regulation requiring companies to act more fairly toward competitors and users. Absent an acknowledgment of such potential, the dismissal of privacy legislation and regulation in favor of ex post privacy litigation appears fragile. It seems to favor slow, costly, case-by-case and ad hoc private litigation over robust sectoral frameworks that could more consistently protect consumers against harm. Dismissing ex ante regulation thus risks leaving the most vulnerable and least legally savvy privacy victims without a remedy. A better way to understand the move toward ex post privacy governance views it as (1) a preference for evidence-based privacy policy over purely procedural deontological ideals, and (2) a defense and expansion of tort law’s role in tackling digital harms. This denotes a preference for replacing predominant deontological or proceduralist accounts of privacy with consequentialist approaches to harm. I share this preference, without agreeing that it entails a move toward ex post case-by-case enforcement.

Second, Cofone’s emphasis on the role of standards leaves an undue amount of interpretive power in the hands of courts. Tort law is heavily based on standards such as the duty of care in negligence law. Cofone suggests that a few special standards may be needed to operationalize his idea of privacy harms. Mirroring Ronald Dworkin’s famous distinction between rules and “principles, policies, and other sorts of standards,” Cofone defines standards as “broad[] principles used to evaluate whether someone acted wrongly.” He puts forward privacy-specific standards such as data minimization, privacy-by-design and duties of loyalty. Why are standards of fairness, transparency, purpose limitation and lawfulness excluded, since these are also data protection–specific standards embedded in the GDPR? The reason may be that Cofone wants privacy to be about after-the-fact accountability mechanisms and less dependent on procedural ideals. Still, courts are left to guess what data minimization, privacy-by-design and loyalty mean in particular cases. Is Google’s failure to give its users options to opt out of online tracking a violation of data minimization or privacy-by-design standards? Is it a violation of Google’s loyalty to its users? Or is it motivated by necessity or an overriding legitimate interest? Are courts, as non-expert and notoriously conservative institutions, best placed to adjudicate on such open-ended standards in a changing economy? Do they have sufficient tools to understand tech companies’ strategies and motivations, including what happens behind their proprietary walls? The alternative, once again, could be a regulatory framework setting out relevant privacy standards whose enforcement would be supervised by a well-funded and expert-led regulatory agency with more significant investigatory powers.

Third, and perhaps most importantly, the book’s private law focus forgoes a serious investigation of power and domination in surveillance markets. Private law is the law that governs relations between private individuals. Tort law is the law of wrongs (or the law of accidents). To focus an analysis of privacy on tort law frames privacy as a risk or cost that can be allocated in ways that efficiently settle the relation between two parties, or between a class of people and one or a few other parties. Yet surveillance is a more pervasive societal phenomenon whose costs and benefits can’t be settled, apportioned, allocated or even translated into compensatory language. Arguing that Google ought to compensate me, and several other users, for surreptitious tracking does not address the fact that, under current material, economic and social conditions, Google continues to be allowed to engage in tracking and no one can meaningfully prevent it from doing so other than by threatening it with monetary liability. Instead of dealing with diffuse surveillance harms by expanding private litigation, it may be useful to start thinking about surveillance as an infrastructural phenomenon.

Three responses could be offered to this critique. The first is that Cofone introduces the notion of exploitation to address some of surveillance’s systemic aspects: the idea that compensation is available because tech companies like Google are gaining from eroding people’s privacy (e.g. through tracking). The main role of exploitation in Cofone’s account is, however, to act as a limiting principle. Reducing surveillance to something more tractable through private litigation won’t make it actually tractable. The second response emphasizes mass torts, arguing that they really do help address the systemic and collective dimensions of surveillance. This is persuasive. Yet a mass torts approach remains premised on the binary idea of compensation for harm, and therefore does not escape the critique I just raised. Third, one could suggest that tortious liability does have systemic effects in that, over time, it could start deterring companies like Google from tracking their users. This, I think, is the strongest response to my critique. It is an important reason to support and carry forward the expansive tort liability approach to privacy outlined by Ignacio Cofone in his highly recommended new book.

Elettra Bietti is Assistant Professor of Law and Computer Science at Northeastern University. You can reach her by e-mail at e.bietti@northeastern.edu.


