The Covid-19 vaccine misinformation policies of Facebook, the world’s largest social media platform, were not effective in combating misinformation, and the platform’s overall design is more to blame than its algorithms alone, a new study has revealed.
The study, led by researchers at the George Washington University in the US and published in the journal Science Advances, found that Facebook’s efforts were undermined by the core design features of the platform itself.
“To effectively tackle misinformation and other online harms, we need to move beyond content and algorithms to also focus on design and architecture,” said David Broniatowski, lead study author and an associate professor of engineering management and systems engineering.
The results show that removing content or changing algorithms can be ineffective if those measures do not change what the platform is designed to do — enabling community members to connect over common interests, in this case vaccine hesitancy, and find information that they are motivated to seek out, he explained.
The researchers found that while Facebook expended significant effort to remove a lot of anti-vaccine content during the Covid-19 pandemic, overall engagement with anti-vaccine content did not decrease beyond prior trends — and, in some cases, even increased.
“This finding… is incredibly concerning. It shows the difficulty that we face as a society in removing health misinformation from public spaces,” said Lorien Abroms, study author and a professor of public health.
In the content that was not removed, the researchers observed an increase in links to off-platform, low-credibility sites and to misinformation on “alternative” social media platforms such as Gab and Rumble, especially in anti-vaccine groups.
In addition, remaining anti-vaccine content on Facebook became more — not less — misinformative, containing sensationalist false claims about vaccine side effects that were often too new to be fact-checked in real time.
There was also “collateral damage,” say the researchers, as pro-vaccine content may have also been removed as a result of the platform’s policies and, overall, vaccine-related content became more politically polarised.
Furthermore, anti-vaccine content producers used the platform more effectively than pro-vaccine content producers, the authors wrote.
Although both had large page networks, anti-vaccine content producers more effectively coordinated content delivery across pages, groups, and users’ news feeds.
Even when Facebook tweaked its algorithms and removed content and accounts to combat vaccine misinformation, the researchers say the architecture of the platform pushed back.
Social media platform designers could promote public health and safety by working collaboratively to develop a set of “building codes” for their platforms — informed by scientific evidence — to reduce online harms, the study suggested.