Struggling to tame the spread of misinformation on its platform, Facebook could better reduce the extent to which its users fall for and spread fake news by deploying a better-designed fake news flag.
Researchers from the University of Texas at Austin found that although Facebook users may be too quick to believe or share misinformation, flagging fake news can also make a significant difference.
A new paper, published in Information Systems Research, studied which tools would help Facebook users spot fake news.
Lead author Tricia Moravec, an assistant professor in the McCombs School of Business, and co-authors Antino Kim and Alan R. Dennis of Indiana University found that two simple interventions, especially when combined, had a strong effect on helping people distinguish real news from fake.
The first intervention the researchers tested was a stop sign icon.
The second was a strong statement: "Declared Fake by 3rd Party Fact-Checkers."
Each intervention was effective, but when combined, they were almost twice as powerful.
“Ideally, we would see Twitter and especially Facebook use some type of flag for misinformation with a brief statement to nudge people to think more critically,” Moravec said.
Twitter has begun using labels and warning messages.
Facebook uses technology and fact-checkers to identify false information and moves the information lower in the News Feed so it’s less likely to be seen.
Facebook says that people who repeatedly share false news will see their distribution reduced and their ability to advertise removed.
“Twitter has been doing a much better job than Facebook at managing misinformation, since they actively flag misleading information,” Moravec said.
"It is a good step that Facebook is taking to at least demote misinformation and punish repeat offenders, but based on the misinformation I have seen about COVID-19 on Facebook, I do not think their efforts are effective in managing misinformation on their platform."
The proliferation of fake news on social media worsened during the 2016 election and has accelerated during the COVID-19 pandemic, feeding confusion about matters that can have life-and-death consequences.
The researchers conducted several studies to see what would have the biggest impact on getting people to avoid spreading fake news.
They found that the combination of the stop sign, the statement, and awareness training had the biggest impact.
Although Facebook began flagging fake news in December 2016 with an icon combined with a brief warning statement, it stopped doing so about a year later.
Moravec said Facebook abandoned the fake news flag too soon and needs to try again.