In damage-control mode since the Cambridge Analytica scandal shocked the world, Facebook has decided to take yet another step to prevent “interference and misinformation in elections”. The social networking giant will now establish an independent election research commission that will “solicit research on the effects of social media on elections and democracy”.

CEO Mark Zuckerberg announced the development in a post on his social networking site on Monday.

He said the objective was to draw on the ideas of leading academics on how to address these issues, as well as to hold Facebook accountable for making sure “we protect the integrity of these elections on Facebook”.

“To do this, we’re working with foundations across the US to set up a committee of academic experts who will come up with research topics and select — through a peer review process — independent researchers to study them,” said Zuckerberg.

These researchers will be given access to Facebook resources so they can draw “unbiased conclusions” about Facebook’s role in elections, including how it is handling the risks and what steps are needed in the future. “They would share their work publicly…” said Zuckerberg.

Facebook claims its goal is also to understand its impact on upcoming elections in a few countries, including India.

Zuckerberg said the initiative was going to be a new model of collaboration between researchers and companies, and “it’s part of our commitment to protect the integrity of elections around the world”.

He once again admitted that Facebook was “too slow” in identifying election interference in 2016, and that the social networking giant now needed to do better.


In a separate post, Elliot Schrage, Vice-President of Communications and Public Policy, and David Ginsberg, Director of Research, Facebook, detailed how Facebook’s new initiative would help provide independent and credible research “about the role of social media in elections, as well as democracy more generally”.

As part of the initiative, the group of scholars will define the research agenda; solicit proposals for independent research on a range of topics; and manage a peer review process to select scholars “who will receive funding for their research, as well as access to privacy-protected datasets from Facebook which they can analyze”.

Facebook asserts that it will not have any right to review or approve their research findings prior to publication.

“We have made real progress since Brexit and the 2016 US presidential election in fighting fake news, as well as combating foreign interference, in elections in France, Germany, Alabama and Italy. But there is much more to do — and we don’t have all the answers. This initiative will enable Facebook to learn from the advice and analysis of outside experts so we can make better decisions — and faster progress,” said the post by Schrage and Ginsberg.

The initial term of the independent election research commission will be one year, and Facebook will determine the membership in the coming weeks. Facebook plans to have on board experts with “different political outlooks, expertise and life experiences, gender, ethnicity and from a broad range of countries”.

According to Schrage and Ginsberg, Facebook is building a dedicated team to work with the commission and academic researchers who will develop privacy-protected datasets. These datasets, they claim, will be kept exclusively on Facebook’s global network of secure servers and there will be “continuous audit”. The commission will ensure only aggregated, anonymised results are reported.

Schrage and Ginsberg said Gary King of Harvard University and Nate Persily of Stanford Law School were instrumental in developing this innovative model for academic collaboration.

The initiative to establish an independent election research commission will be funded by the Laura and John Arnold Foundation, Democracy Fund, the William and Flora Hewlett Foundation, the John S. and James L. Knight Foundation, the Charles Koch Foundation, the Omidyar Network, and the Alfred P. Sloan Foundation.

Facebook has been in damage-control mode since allegations emerged that data-mining firm Cambridge Analytica might have used ill-gotten user data to try to influence the 2016 US Presidential election.

Zuckerberg has acknowledged that he made a “huge mistake” as he failed to take a broad view of what Facebook’s responsibility should be towards the community it has created.

The Facebook CEO has been writing about a number of the steps being taken to correct that. Facebook has already announced that it is building new artificial intelligence (AI) tools and taking down thousands of fake accounts, besides verifying every political advertiser and large-page admin and launching ads transparency tools.