Coordination: A prerequisite for an effective fight against misinformation
Guest Blogger
From the Workshop on “News and Information Disorder in the 2020 US Presidential Election.”
Valerie Belair-Gagnon, Oscar Westlund, and Bente Kalsnes
At the beginning of the 2020 U.S. election, Twitter marked false or misleading information. It then changed its strategic orientation, hiding fewer false or misleading posts and instead contextualizing them with fact-checks, such as by linking to the source of the information. This example illustrates how platform companies have become arbiters of truth, while news organizations and fact-checking companies seek to regain gatekeeping power from Big Tech and Silicon Valley.
This particular arbitration of “what is truth” stands in stark contrast with what journalists have sought to achieve. Journalists have been described as civic gatekeepers, meaning the “civic and moral roles of journalistic institutions, and their enactment of cultural codes that give shape to, and help to protect, a society’s normative values.” Platform companies, meanwhile, have taken on the role of gatekeepers of democracy, a role traditionally held by formal political actors working through formal political channels and exercising opinion power, as Helberger argues. These two forms of gatekeeping show a detachment between media, tech, and platform companies, the latter having demonstrably been developing policies on the fly. At the same time, governments are increasingly trying to regulate and limit platform power and, as Meese argues, they may fail to address the interconnectedness of platforms and news publishers and, ultimately, to serve the public interest.
These power dynamics between fact-checkers, journalists, and platform companies prompt a set of questions in the fight against misinformation. Who are the winners and losers? Who does what, for what gain? How can people find truthful information? In other words, what does the digital labor of fighting misinformation tell us about the technologically driven practices through which actors seek to gain legitimacy with their audiences? And “what are the limits of what fact-checking can accomplish without greater support from platform companies for the researchers, journalists, and fact-checkers seeking to understand and limit the spread of harmful misinformation?”
As we proposed in our project, Source Criticism and Mediated Disinformation (SCAM), advances in image, video, and audio manipulation technology are being used both to misinform and manipulate and to determine the trustworthiness of sources and content. Diverse emerging digital technologies can be used by actors intent on manipulation, but also by journalists, fact-checkers, technologists, and other stakeholders working to detect and counter information manipulation. Ananny reports on evolving platform-press collaborations in the U.S. between Facebook and major news and fact-checking organizations. Graves and Anderson have studied the development of other collaborations around structured journalism and fact-checking widgets. Ultimately, research suggests that while publishers and platform companies compete for attention, engagement, and revenues, a form of codependence exists between them when it comes to combating misinformation. This fight extends beyond human actors to include the digital technologies they use, as well as more or less active audiences who may contribute to networked forms of fact-checking.
Tensions thus emerge around how to “reconcile” external collaborations for reducing misinformation with business logic: To reduce misinformation, key institutions need to recognize that they are part of a larger informational system and to work in concert with other institutions towards shared goals, integrating their diverse specialized knowledge and technological affordances. But any such coordination may run counter to the incentives of major companies and upstarts, and is often difficult across industries or sectors.
Journalists, fact-checkers, and platform companies each have their own ways of socially and technologically constructing truth. Fact-checkers, for example, pride themselves on transparency, showing the steps they have taken to reach their conclusions. Journalists, on the other hand, may have to uphold a set of traditional journalistic values and epistemological presumptions. It is worth noting, though, that these values, norms, and practices may vary across cultures, as the Worlds of Journalism Study has repeatedly demonstrated.
In our grant-funded research, SCAM, we are asking a series of questions that may shed light on the visible and less visible technological tools and systems deployed to fight misinformation, from fact-checkers and journalists to tech companies. The problem in the fight against misinformation may lie in divergent epistemological points of departure and a lack of coordination.
Organizations have their own interests and missions, which depend on time, resources, and capabilities and shape the choices they make. In our interviews, a fact-checker emphasized how effective fact-checking relied on local understandings. A media organization may decide to develop its database only in a particular locale so as to avoid having to comply with privacy laws in other locales. Journalists also shared the sentiment that they have to fight for those who agree with them, an idea supported by research, and seem to long for the possibility of convincing those who hold different opinions.
These fragmented practices, rooted in differing missions, interests, and capabilities, point to a larger issue: How can truth gain legitimacy if institutions are fragmented and there is limited coordination among actors (noting that right-wing extremists are now moving to like-minded networking platforms such as Parler)? One thing is certain: Tech solutionism will not fix or solve the problem; rather, it may exacerbate it. The larger institutional questions and power dynamics, which the study of sociotechnical systems can help unpack, are ones that tech companies as arbiters of democracy, and journalists and fact-checkers as civic gatekeepers, will have to reckon with in concert with each other.
Valerie Belair-Gagnon is an assistant professor, media sociologist, and director of the Minnesota Journalism Center at the University of Minnesota-Twin Cities. Oscar Westlund is a professor at Oslo Metropolitan University, adjunct professor at Volda University College, and associate professor at the University of Gothenburg. Bente Kalsnes is an associate professor at Kristiania University College.
This post was made possible thanks to a Source Criticism and Mediated Disinformation grant funded by KULMEDIA, the Research Programme on the Culture and Media Sector at the Norwegian Research Council.
Cross-posted at the Knight Foundation