Access to Error
Guest Blogger
Michal Shur-Ofry
For the
Innovation Law Beyond IP 2 conference, March 28-29 at Yale Law School
What are your immediate associations with the word "innovation"? The responses I
received while running this question through friends and acquaintances ranged
from the general ("progress", "future", "modernization",
"technology", "patents", "curiosity", "thinking
outside the box", "fresh", and "intriguing") to the
specific ("artificial intelligence", "spaceships", "cherry
tomatoes", "Einstein", "Da Vinci" and "Apple").
No one, however, mentioned errors, failures or negative findings.
This is
hardly surprising—we tend to associate innovation with success (or other
positive things) and errors with defeat. But errors and innovation are actually
tightly linked. My purpose in this project is to focus on errors as drivers of
innovation. I argue that the
current incentive structure in our innovation ecosystem, both within and beyond
IP, does not provide sufficient incentives for the dissemination of errors and other
negative information. I further hope to start a conversation about
access-to-error as an important (and largely overlooked) goal for innovation
policy.
First, what do errors have to do with innovation? To clarify, I use "errors" here in a very broad sense that includes mistakes, failures, falsifications, blind alleys, negative findings, and other types of negative information. The main answer is almost obvious, and was suggested by philosophers of
science long ago: errors provide us with important negative knowledge, the
knowledge of what doesn't work, which brings us closer to understanding what
does. In the words of
Karl Popper: "we learn from our mistakes."
But there is more to it: errors are especially important for triggering paradigm shifts, a particular type of innovation that opens up new fields of research and can completely change scientific domains. Thomas Kuhn, in his influential work on scientific revolutions, recognized that paradigm shifts are often preceded by the detection of mistakes and inconsistencies under existing paradigms. Indeed, actual shifts in physics, the life sciences, and even behavioral economics provide ample examples of the power of errors to push innovation beyond the state of the art. More recent
research in the field of complexity highlights another angle: due to the networked and interdependent nature of many innovation ecosystems, small errors that accumulate undetected can eventually cause large-scale catastrophes; famous failures of aircraft, buildings, and nuclear plants provide powerful examples. The detection of errors in complex innovation ecosystems may therefore be especially significant. Finally, although my focus is on the scientific and
technological domains, errors and mistakes are also drivers of artistic
creativity.
Of course, the potential of errors to
promote innovation depends very much on their disclosure and dissemination, or
in other words, on access-to-error. But a quick look at the main legal and
social institutions that promote the diffusion of knowledge reveals that these
institutions hardly incentivize the diffusion of errors. First, intellectual property, a primary legal mechanism for incentivizing the dissemination of knowledge goods, is largely dysfunctional when it comes to negative information. This is because IP rights are based on exclusion (i.e., on the ability to control the use of the protected subject matter and prevent unauthorized uses), while negative information is generally difficult to exclude—as articulated by Amy Kapczynski and Talha Syed, it is located at the lower end of the continuum of excludability. It is often impossible to identify the people who actually used a given piece of negative information, and to distinguish them from those who avoided the wrong path for other reasons.
What is more surprising is that the major alternatives to IP—the scientific establishment with its reputational rewards, and state-supported grant and funding schemes—also fail to incentivize the disclosure and dissemination of negative information. In fact, scientists face substantial difficulties when they try to publish negative findings. Accumulating evidence from different fields suggests that in many high-impact journals "negative results are not accepted", and often, negative findings never see daylight (see, for example, recent evidence by Daniele Fanelli that "negative results are disappearing from most disciplines and countries"). And even when published, negative information attracts fewer citations and less public attention than positive data.
Similarly, many state-supported grants and funds strongly prefer projects that seek positive findings over projects that focus on falsification or replication. Leaders of prominent funds quoted in the Economist explain that the latter "in all likelihood would be turned down" (one encouraging exception may be a forthcoming change in NIH policies, announced by the NIH's Francis Collins and Lawrence Tabak in a recent comment in Nature).
This incentive structure, combined with common cultural perceptions of error as shameful and humiliating, produces a well-documented "file-drawer effect": positive information is disclosed and disseminated, while negative information is shelved in a vast metaphorical file drawer. And
the social costs are significant: distortions of the "big picture" of science and technology; troubling evidence that much of what we think of as proven scientific knowledge is actually wrong (John Ioannidis even maintains that "most published research findings are false"); innovators who are steered away from potentially ground-breaking projects toward "safer" and more conservative enterprises; and valuable R&D resources that are wastefully poured into blind alleys. In some cases, the difficulty of disseminating and accessing negative knowledge may even cost lives: flawed scientific hypotheses "live on" and create health hazards; drugs of dubious safety may find their way to the market; and undetected errors in complex systems accumulate "under the surface" until they abruptly cause disastrous outcomes.
How can innovation law and policy
facilitate access to error? This project does not provide a comprehensive
solution, but sketches in rather broad strokes three principal directions for policy
intervention. First, I propose a few adjustments to IP doctrine (mainly to patent law's disclosure requirements and to IP's exceptions and limitations) whose essential purpose is to lower access barriers and allow errors in IP-protected technologies to be exposed without risking IP infringement.
Second, I explore the option of direct, top-down regulation mandating the disclosure of negative information. While this type of state intervention has significant drawbacks as a general solution, it may be suitable in certain settings, especially where disclosure pertains to safety and efficacy and regulatory schemes are already in place (an amendment to FDA regulations that mandates the disclosure of "adverse events" reported in clinical trial results is one such example). More generally, highlighting the links between error and innovation can
alert regulators to the significance of negative information, and help them
calibrate their regulatory efforts accordingly.
Finally, the most
promising direction for promoting the diffusion of errors may be locating
access-to-error within the larger paradigm of access-to-knowledge. Building on the extensive scholarly work of recent decades, I suggest that the norms of sharing cultivated by the access-to-knowledge movement can serve as a useful
model for commons-based schemes designed to improve access-to-error. Emerging
enterprises in various domains (such as the
Journal of Negative Results in
Biomedicine or
the Reproducibility Initiative)
indicate that this could be a productive path.
Such schemes do not lend themselves to top-down state regulation, and their success largely depends on the emergence of supportive sociocultural patterns. But the state can still play an important role here, by adopting a variety of measures to facilitate commons-based projects and nudge bottom-up access-to-error initiatives. Concrete examples of such measures, which I explore in the paper, include:
- initiating and facilitating the formation of error repositories
- calibrating state funding schemes to support the access-to-error paradigm
- adopting and supporting voluntary "near-miss" schemes for reporting errors
- promoting a broader educational agenda that embraces error rather than conceals it.
It is time for access to error to become a prominent item on the innovation policy agenda. This project, I hope, is a step in that direction.
Michal Shur-Ofry is Senior Lecturer, Hebrew University of Jerusalem, Faculty of Law. She can be reached at michalshur at mail.huji.ac.il.
Posted 3:52 PM by Guest Blogger