Facebook sought Monday to clarify what posts, images and other content it allows on its site and why.
In an update to its community standards page, the world’s largest online social network gave users more guidance on why, for example, it might take down a post featuring sexual violence and exploitation, hate speech, criminal activity or bullying.
It also explained why it not only bans terrorist and organized-crime groups but also removes content supporting them.
The Menlo Park, Calif., company said it isn’t changing how it regulates the content of posts, and that while some of the guidance for users is new, “It is consistent with how we’ve applied our standards in the past.”
In a blog post Monday, Facebook, with 1.39 billion active users worldwide, said it is a challenge to maintain one set of standards that meets the needs of its entire community. More than 80 percent of Facebook users are outside the U.S. and Canada.
“People from different backgrounds may have different ideas about what’s appropriate to share — a video posted as a joke by one person might be upsetting to someone else, but it may not violate our standard,” Monika Bickert, head of global policy management, and Chris Sonderby, deputy general counsel, wrote in the post.
Facebook users who believe that a particular page or content violates the site’s standards can click a “report” link to notify Facebook. The company then considers whether to take it down.
Some content is removed only in some countries. Facebook restricts content in countries where it violates local laws, even if that content does not violate its community standards.
Separately, Facebook also released its latest report on requests it gets from governments worldwide, covering the second half of 2014. The report shows that requests from governments for data and to restrict information are both increasing.
Republished with permission of the Associated Press.