Wednesday, December 23, 2020

The 2020 election integrity partnership

Guest Blogger

From the Workshop on “News and Information Disorder in the 2020 US Presidential Election.”

Jevin West

They gave Trump voters sharpies and now their votes are being invalidated! WTF!

An unverified user posted this tweet shortly after the Arizona polls closed. Within a few hours, the tweet and others like it went viral. The official (verified) Pima County account tweeted a response to Sharpiegate, stating that felt-tipped pens could be used to vote and would not invalidate ballots. That was not enough to stop the surge. Hundreds of thousands of tweets followed, pushing the narrative of voter fraud.

Sharpiegate teaches us several lessons. Domestic disinformation was far more potent and prevalent in 2020 than disinformation from inauthentic foreign actors. Political operatives and social influencers in the U.S. didn’t need to create stories of voter fraud. They just needed to amplify existing ones, like this one in Arizona, started by an authentic, domestic user.

Sharpiegate is one of hundreds of conspiracy theories that my colleagues and I have tracked over the last year. On Dec. 3, 2019, roughly one year ago, we launched the Center for an Informed Public (CIP) at the University of Washington with support from the Knight Foundation. Since then, we have spent day and night monitoring misinformation surrounding COVID-19, vaccine hesitancy, the West Coast fires, the social justice movement, and now the U.S. election. Like other researchers, we have seen the rise of Plandemic, 5G, and Sharpiegate. We have seen the actors and tactics from one conspiracy theory emerge within the next. We have seen social media platforms experiment with banners, tags, and takedowns. And we have seen policy makers begin to take notice, writing laws limiting the use of synthetic media and other forms of deceptive technology. The year 2020 has delivered a decade’s worth of research material.

But research is not enough. Misinformation is a topic of urgency. It cannot wait for a five-year publication cycle. We are committed to public engagement and to working with policy makers, journalists, and educators. A recent collaborative effort illustrates practically how this can be done. In July, together with Stanford’s Internet Observatory, Graphika, and the DFRLab, the CIP announced the Election Integrity Partnership (EIP). The goal of this nonpartisan project was to monitor, in real time, misinformation around voter integrity and efforts to deter voting. It included more than 120 researchers, staff, postdocs, and students. We focused specifically on procedural interference, participation interference, and fraud. We also monitored delegitimization efforts that continue today at both the federal and local levels. In fact, here in the state of Washington, the gubernatorial candidate, Loren Culp, refuses to concede, claiming massive voter fraud, despite little to no evidence to support his claims and despite losing by more than half a million votes.

The 2020 election, in some ways, was surprisingly unsurprising. A week before the election, my colleagues and I wrote an EIP post on what to expect on election night and in the days after. The goal was to prepare journalists and the public for uncertainty: not knowing the result on election night, and likely shifts in the vote tallies from red to blue. During this time of uncertainty, claims of voter fraud were going to be numerous, backed by “evidence” such as lost and found ballots and videos showing poll station irregularities. There would be premature declarations of winners, affidavits filed, and “statistical evidence” of voter malfeasance. Much of this occurred and continues today. In retrospect, it is not surprising. Voters had been primed prior to the election — and so had researchers — for “massive” voter fraud. Narratives had been assembled months before, and some continue today: mail-dumping, ballot harvesting, the color revolution, electronic voting machine errors, dead voters, and others.

When it comes to policy, two questions stand out: What to do with repeat offenders? And what to do with livestreamed video?

The EIP team investigated more than a thousand reports of misleading information about the voting process. These reports included hundreds of thousands of Twitter users pushing this content. However, a large proportion of the disinformation was being promulgated by a small percentage of users. These users or news organizations tended to have large followings (although not always). These repeat offenders often reframed or decontextualized “findings” to support their favorite narrative. They often exploited local news, as with Sharpiegate. Policies at Twitter have led to labels on individual tweets, but rarely on individual accounts. Misinformation that goes viral is rarely stopped with a label on the content; a label and policies directed at repeat offenders may be more effective.
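The repeat-offender pattern described above can be illustrated with a simple concentration measurement. The sketch below uses entirely hypothetical account names and counts (none drawn from EIP data) to show how one might compute the share of misleading tweets produced by the top one percent of accounts:

```python
# Minimal sketch with hypothetical data: measuring how concentrated
# misleading content is among a few repeat-offender accounts.
from collections import Counter

# One entry per misleading tweet observed; account names are invented.
tweets_by_account = (
    ["influencer_a"] * 500
    + ["influencer_b"] * 300
    + [f"small_account_{i}" for i in range(200)]
)

counts = Counter(tweets_by_account)
total = sum(counts.values())

# Share of all misleading tweets produced by the top 1% of accounts.
top_n = max(1, len(counts) // 100)
top_share = sum(c for _, c in counts.most_common(top_n)) / total
print(f"Top {top_n} account(s) produced {top_share:.0%} of the tweets")
# → Top 2 account(s) produced 80% of the tweets
```

In this toy example, roughly one percent of accounts produce the large majority of the content, which is the kind of skew that makes account-level policies more promising than tweet-level labels.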

The EIP team also investigated the many ways that livestreaming spread misinformation and disinformation during the election cycle. Much of this livestreamed content lacks context. These context-free videos can be clipped, edited, and used as evidence to support selected narratives. This is an effective medium for pushing disinformation because of its manipulative potential, but also because it is difficult for fact-checkers, journalists, and researchers to identify the source and mitigate its spread. EIP found that social media policies varied widely and lacked specificity when it came to this form of communication.

The 2020 election presented new challenges for researchers and the public. But among these challenges were positive outcomes as well. Voter turnout was high. Decentralized control of the voting process was resilient. There was no eleventh-hour deepfake video. Most poll stations ran smoothly. Thankfully, there was little violence. And foreign interference, and claims of foreign interference, were not much of an issue, at least from our observations. But delegitimization efforts continue, and will continue. Democracy pays the price. Our hope is that the efforts in this workshop, and in the broader research and policy community, will outline ways to reverse this concerning trend of distrust.

Jevin West is an associate professor in the Information School at the University of Washington and co-author of the book, “Calling Bullshit: The Art of Skepticism in a Data-Driven World.” 

Cross-posted at the Knight Foundation
