Facebook could have stopped 10 billion impressions from "repeat misinformers", but didn't: report

A study raises questions about why Facebook did not stop the spread of misinformation in the run-up to the 2020 election

By Matthew Rozsa

Staff Writer

Published April 12, 2021 6:20PM (EDT)

Donald Trump | Facebook (Photo illustration by Salon/Getty Images)

Facebook does not need any more bad publicity. The company is currently facing public scorn after the personal information of more than 500 million users was leaked. Since last year it has also faced an antitrust suit backed by more than 40 states, with reports alleging that CEO Mark Zuckerberg intimidated potential competitors.

Now the big blue social media titan has some more bad press: a new report claims that it failed miserably in its promise to stop misinformation during the 2020 presidential election. Indeed, the report accuses Facebook of being so lax that the top 100 "repeat misinformers" on the site received millions more interactions than the top 100 traditional U.S. media pages combined.

Released by the online advocacy group Avaaz, the report argues that if Facebook had not waited until October (roughly one month before Election Day) to alter its algorithm to reduce the visibility of inaccurate and hateful content, it could have prevented roughly 10.1 billion views from accumulating on 100 pages that frequently disseminated misinformation in the eight months before the 2020 election. You read that right: 10.1 billion impressions of misinformation.

"Failure to downgrade the reach of these pages and to limit their ability to advertise in the year before the election meant Facebook allowed them to almost triple their monthly interactions, from 97 million interactions in October 2019 to 277.9 million interactions in October 2020 — catching up with the top 100 US media pages 2 (ex. CNN, MSNBC, Fox News) on Facebook," Avaaz reported.

The report noted that an October 2020 poll found that 44% of registered voters (or roughly 91 million people) saw false claims about mail-in voter fraud on Facebook, with 35% of registered voters (or roughly 72 million people) believing them.

The organization also noted that Facebook has rolled back many of the changes it made before the election, allowing right-wing conspiracy theories like QAnon and Stop the Steal to thrive on the site. Avaaz says it has identified 267 pages and groups, including many "Stop the Steal" groups, with a combined 32 million followers that spread "violence-glorifying content" related to the 2020 presidential election. More than two-thirds of these groups are in some way connected to QAnon, Boogaloo, militia-aligned or other violent far-right groups. Avaaz says that 118 of those pages and groups remain active despite violating Facebook's policies.

Facebook denied the report's conclusions. As Facebook spokesperson Andy Stone told Time Magazine, "This report distorts the serious work we've been doing to fight violent extremism and misinformation on our platform. Avaaz uses a flawed methodology to make people think that just because a Page shares a piece of fact-checked content, all the content on that Page is problematic."



Matthew Rozsa is a staff writer at Salon. He received a Master's Degree in History from Rutgers-Newark in 2012 and was awarded a science journalism fellowship from the Metcalf Institute in 2022.


