For the Balkinization Symposium on Ignacio Cofone, The Privacy Fallacy: Harm and Power in the Information Economy (Cambridge University Press, 2023).
Yan Shvartzshnaider
Inappropriate information sharing can lead to privacy violations and cause real harm. Nevertheless, these "[harms remain] invisible, [are exploited] in the information economy, [and continue to] proliferate," because "the courts and regulators perceive privacy interferences solely through the lens of monetary losses" (Cofone 2023).
As we become increasingly dependent on online services, we frequently ask, “Is this service/app safe, privacy-preserving, and secure?” Unfortunately, for the average consumer, it is difficult to find definitive answers. Modern services generate, collect, share, and trade vast amounts of information as part of a complex digital ecosystem of third-party services and actors. What makes the situation even more complex is that their information-handling practices often go beyond the immediate needs of their service. This is especially true of mobile apps, which often build their business models around data collection, rather than the information services they provide.
Law and regulation offer little solace. "Privacy law places the onus on those whom it protects. It unreasonably expects people to foresee the consequences that may arise from data practices outside their control - and beyond their ability to predict" (Cofone 2023). A growing body of work shows that it is impractical to expect consumers to make an informed decision while facing such an information overload. The current "informed consent" model places the burden on the user to comprehend and consent to all the practices across all components: users need to a) be familiar with the company's privacy policy, b) be aware of existing relevant laws and regulations, c) check the app's granted permissions, and d) finally, analyze the traffic generated by the service. Furthermore, these components are often misaligned.
An average consumer will find it difficult to understand and account for the possible side effects when deciding whether a service is safe to use. Law and regulation often lag behind technological innovation; privacy expectations and norms shift, and app behavior and permissions may change with successive updates. As Cofone notes:
the failure of individual control mechanisms such as consent provisions defeats the expectation that people will anticipate these harms and agree only to data practices that don't harm them. Procedural rules are also woefully insufficient for protecting personal information because they're insufficiently related to preventing harm (Cofone 2023).
To determine the most effective basis for liability, it is crucial to consider who can minimize the likelihood and magnitude of harm (Cofone 2023).
To introduce privacy liability, we can draw parallels with prescription drugs in the highly regulated pharmaceutical industry, which uses provisions to mitigate potential harm to patients. Under the learned intermediary rule, drug manufacturers must inform prescribing physicians about any potential risks and harms a drug can cause.
The learned intermediary rule is based on the principle that healthcare professionals, as intermediaries, can assess the risks and benefits of a treatment, ensuring that patients receive the necessary information through their physicians.[1]
Pharmaceutical companies provide a "package insert" (PI) with each drug, containing detailed, unbiased information about the risks and benefits associated with the drug (Watson and Barash 2009). Importantly, PIs are designed to inform medical professionals such as physicians and pharmacists, not patients. Physicians, as "learned intermediaries," interpret the dangers of a particular drug for a given patient based on the information the pharmaceutical companies provide. A separate "patient package insert" (PPI), however, is voluntarily included with the majority of drugs.
A redesign of privacy policies could support both the "learned intermediary" and the consumer. Consumers are often overwhelmed with information that they don't understand, while privacy experts ("learned practitioners") often find privacy policies ambiguous and incomplete (Reidenberg et al. 2016). Privacy inserts could provide clarity for both parties in two separate documents. The first, like the PI, would include an exhaustive list of information-handling practices for a privacy expert to investigate in order to determine the benefit for the consumer.
This first, comprehensive privacy insert could include a description of the information flows resulting from third-party integrations, a list of supply-chain business-associate services and automated, ML-driven components, and scenarios of potential "side effects." This part of the document could include annotations based on the Contextual Integrity (CI) framework. CI can serve as a framework for privacy inserts to address existing structural faults in privacy policies, which often make them ambiguous and cognitively burdensome to comprehend. The goal is to provide the information that researchers, lawyers, and policymakers need to perform empirical privacy assessments.
The theory of contextual integrity (CI) (Nissenbaum 2009) defines privacy as the accommodation of appropriate information flows in accordance with governing contextual norms. To perform a privacy analysis using CI, we need to capture, for both the information flow and the governing norms, the values of five parameters: the sender of the information, its subject, the receiver, the type of information, and the transmission principle, which specifies the constraints and conditions under which the information is shared. Stating all five parameters is essential; without them, the analysis would be inconclusive and ambiguous. A deviation of the parameter values from those of the established norms is grounds for examining a potential privacy violation. This examination is performed using the CI heuristic, which requires several levels of analysis to investigate the moral, ethical, political, and social implications of the breached flow (Nissenbaum 2014).
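To make this concrete, the sketch below (in Python, with hypothetical parameter values chosen purely for illustration; nothing here is prescribed by CI or by the book) shows how a CI-annotated flow from a privacy insert could be compared against a governing norm:

```python
from dataclasses import dataclass

# A CI information flow is described by exactly five parameters.
@dataclass(frozen=True)
class Flow:
    sender: str                  # who transmits the information
    subject: str                 # whom the information is about
    recipient: str               # who receives it
    attribute: str               # the type of information
    transmission_principle: str  # constraint under which it flows

# A hypothetical contextual norm stating which parameter values are
# appropriate. Real norms would be elicited from the context (e.g., health).
NORM = Flow(
    sender="patient",
    subject="patient",
    recipient="physician",
    attribute="medical history",
    transmission_principle="with explicit consent",
)

def deviations(flow: Flow, norm: Flow) -> list[str]:
    """Return the CI parameters on which a flow departs from the norm.

    A deviation is grounds for examining a potential privacy violation
    using the CI heuristic, not proof of one.
    """
    return [
        name
        for name in Flow.__dataclass_fields__
        if getattr(flow, name) != getattr(norm, name)
    ]

# A flow extracted from a (hypothetical) privacy insert: the app shares
# the subject's medical history with an advertising network.
observed = Flow(
    sender="health app",
    subject="patient",
    recipient="ad network",
    attribute="medical history",
    transmission_principle="for targeted advertising",
)

print(deviations(observed, NORM))
# ['sender', 'recipient', 'transmission_principle']
```

A flagged deviation would only trigger the CI heuristic's deeper, multi-level analysis; it is a prompt for expert examination, not a verdict.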
The second, like the PPI, is a shorter version aimed at the consumer. It would concisely present the privacy risks associated with any of the practices. This part would build on recent work on designing "Privacy Nutrition Labels" to give the consumer a concise view of information-handling practices without overwhelming them with details. In some situations, a PPI-type summary would be enough. In other contexts, such as health or education, it would be preferable for an "intermediary" expert to read through a comprehensive, unbiased list of information-handling practices to make the privacy assessment.
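As a toy illustration of this two-tier design (all practice entries below are hypothetical), a PPI-style summary could be derived mechanically from the same records that populate the expert-facing insert:

```python
# Derive a concise, consumer-facing summary (PPI-style) from the full
# list of information-handling practices in the expert-facing insert.
# All entries are hypothetical, for illustration only.
practices = [
    {"attribute": "location", "recipient": "analytics provider",
     "purpose": "app diagnostics"},
    {"attribute": "contacts", "recipient": "ad network",
     "purpose": "targeted advertising"},
]

def consumer_label(practices: list[dict]) -> str:
    """Render a short, nutrition-label-like view for consumers."""
    lines = ["Data shared with third parties:"]
    for p in practices:
        lines.append(f"- {p['attribute']} to {p['recipient']} ({p['purpose']})")
    return "\n".join(lines)

print(consumer_label(practices))
```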
In the conclusion of The Privacy Fallacy, Cofone warns about the emergence of AI in our daily lives, though the warning can apply to any sociotechnical system:
In a society where our information can be used to exploit us and where our wellbeing is influenced by how our information is turned into credit scores, risk assessments, and employability, developing functional protection against privacy harm is urgent. (Cofone 2023)
To heed this call, we need to recognize that violations of privacy can pose significant and real risks to our society, and we need to develop novel regulatory and governance mechanisms to protect consumers. While the technological advances are unprecedented, the harms resemble those in other industries. The burden of informing and mitigating harm should not fall on the consumer. Rather, as in other industries, "effective liability regimes deter harm by placing the responsibility on those who can prevent and mitigate it" (Cofone 2023). Introducing new ways to inform and empower experts can help them provide the right advice to assist consumers in navigating a treacherous and data-hungry landscape. Learning from existing regulatory mechanisms, such as those based on "primum non nocere" in the medical domain, can provide a platform for addressing many of these issues.
Yan Shvartzshnaider is Assistant Professor in the Department of Electrical Engineering and Computer Science, Lassonde School of Engineering at York University. Email: yansh@yorku.ca
References
Cofone, Ignacio. 2023. The Privacy Fallacy: Harm and Power in the Information Economy. Cambridge University Press.
Nissenbaum, Helen. 2009. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press.
———. 2014. "Respect for Context as a Benchmark for Privacy Online: What It Is and Isn't." Cahier de Prospective 19.
Reidenberg, Joel R., Jaspreet Bhatia, Travis D. Breaux, and Thomas B. Norton. 2016. "Ambiguity in Privacy Policies and the Impact of Regulation." The Journal of Legal Studies 45 (S2): S163–90.
Watson, Kelley Teed, and Paul G. Barash. 2009. "The New Food and Drug Administration Drug Package Insert: Implications for Patient Safety and Clinical Care." Anesthesia & Analgesia 108 (1): 211–18.
[1] https://www.americanbar.org/groups/litigation/resources/newsletters/mass-torts/learned-intermediary-rule-and-rhode-island/