In its latest announcement, Facebook CEO Mark Zuckerberg said the social network giant has taken a step towards developing a “more democratic and independent system” to determine Facebook’s Community Standards, creating full transparency about its current rules and how they are enforced. In a related development, Facebook has also established a new appeals process. Starting this week, users can appeal adverse decisions on individual posts if they think there was a mistake on Facebook’s part.

In May, the company will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the US and other countries where it will get people’s feedback directly.

“Starting today we’re making transparent our internal guidelines for how exactly we define hate speech, violence, nudity, terrorism, and other content we don’t allow. These standards are a work in progress, and sharing them openly will help us get feedback from you to make them better,” Zuckerberg said in a Facebook post.

Facebook’s Community Standards page has more details on what is allowed and what is not on the platform. It says the rules apply around the world to all types of content, and claims they are designed to be comprehensive — “content that might not be considered hate speech may still be removed for breaching our Bullying Policies”.

Facebook says the consequences of breaching its Community Standards vary depending on the severity of the breach and a person’s history on Facebook. “For instance, we may warn someone for a first breach, but if they continue to breach our policies, we may restrict their ability to post on Facebook or disable their profile.”


Facebook may also notify law enforcement if it believes there is a genuine risk of physical harm or a direct threat to public safety.

In a separate blog post, Facebook has published its internal enforcement guidelines and given details about its appeals process.

“…for the first time we’re giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we’ve made a mistake,” Monika Bickert, Vice President of Global Policy Management, said in the post.

Facebook is launching appeals for posts that were removed for nudity or sexual activity, hate speech, or graphic violence.

Bickert’s post details how the appeals process works. If a photo, video or post has been removed because Facebook finds it violated the Community Standards, the user will be notified and given the option to request additional review. A team will review the appeal, “typically within 24 hours”. “If we’ve made a mistake, we will notify you, and your post, photo or video will be restored,” says the post.

Facebook says it is working to extend this process further, “by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up”.