Facebook says it removed or flagged 1.9 million pieces of terror-related content in Q1 2018

The figure is double that of the last quarter of 2017. Facebook claims it found 99% of this content on its own, thanks to advances in its technology

Facebook “took action” on 1.9 million pieces of content related to the Islamic State (IS) and Al Qaeda in the first quarter of 2018, double the figure from the last quarter of 2017. Facebook claims it found 99 per cent of this content on its own.

“In Q1 2018, 99% of the ISIS and al-Qaeda content we took action on was not user reported. In most cases, we found this material due to advances in our technology, but this also includes detection by our internal reviewers,” Monika Bickert, Vice-President of Global Policy Management at Facebook, says in the latest post shared as part of Facebook’s Hard Questions series.

The post, written by Bickert and Brian Fishman, Global Head of Counterterrorism Policy, details how Facebook uses technology to keep terrorists off the social networking site, and how effective those efforts have been.

By “taking action”, Facebook means it removed the vast majority of these 1.9 million pieces of content and added a warning to the small portion that was shared for informational or counter-speech purposes.

“This number likely understates the total volume, because when we remove a profile, Page or Group for violating our policies, all of the corresponding content becomes inaccessible,” says the post.

Facebook, however, says it does not go back and classify or label every individual piece of content found to have supported terrorism.

The social network giant defines terrorism as “any non-governmental organization that engages in premeditated acts of violence against persons or property to intimidate a civilian population, government, or international organization in order to achieve a political, religious, or ideological aim”.

“Our newest detection technology focuses on ISIS, al-Qaeda, and their affiliates — the groups that currently pose the broadest global threat,” says the post.

Facebook now has a counter-terrorism team of 200 people, up from 150 in June 2017.

“We remove not just new content but old material too. We have built specialised techniques to surface and remove older content. Of the terrorism-related content we removed in Q1 2018, more than 600,000 pieces were identified through these mechanisms,” says the post.

In Q1 2018, Facebook says its “historically focused technology” found content that had been on the platform for a median of 970 days.

“We’re under no illusion that the job is done or that the progress we have made is enough. Terrorist groups are always trying to circumvent our systems, so we must constantly improve,” Bickert and Fishman say in the post.
