Facebook's Internal Rulebook on Adult Content, Self-Harm, Hate Speech Leaked
Amid increasing political pressures from the EU and the US government, Facebook has had its internal rulebook leaked, fueling new controversy about the social network's policies on questionable user content.
Internal training manuals, spreadsheets and flowcharts obtained by The Guardian show what appear to be conflicting rules on the appropriateness of posts – including Facebook Live posts – involving child or animal abuse, revenge porn, war and terrorism, hate speech, and more.
In one example, the newspaper reveals that Facebook views live-stream attempts at self-harm as okay to leave online because it "doesn't want to censor or punish people in distress." Earlier this year, it faced a backlash for failing to pull videos of murders in the US and Thailand posted on the social network. In contrast, the company removed an iconic Vietnam war photo simply because there was a naked girl in the picture, prompting public outcry.
The leak further reveals that Facebook believes remarks such as "Someone shoot Trump" should be deleted. Facebook's explanation? Trump falls in a protected category as head of state. At the same time, the rulebook allows things like "F*** off and die" because Facebook regards these as non-credible threats.
"Keeping people on Facebook safe is the most important thing we do," Facebook's Head of Global Policy Management Monica Bickert said in a statement. "We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."
Facebook is reportedly hiring some 3,000 additional moderators, as existing moderators say they are drowning in requests. Often they are left with "just 10 seconds" to make a decision. One source told the UK publication: "Facebook cannot keep control of its content. It has grown too big, too quickly." Moderators are said to be at odds with inconsistent and peculiar policies – chiefly on sexual content.
"It's one thing when you're a small online community with a group of people who share principles and values, but when you have a large percentage of the world's population and say 'share yourself', you are going to be in quite a muddle. Then when you monetise that practice you are entering a disaster situation," said Sarah T Roberts, an expert on content moderation.
To Facebook's credit, equally disturbing posts have helped uncover police killings and other abuses that would have otherwise remained unpunished. But when it comes to appropriateness, a lot of content falls in a gray area for context, culture, ethnicity and even age. And, while Facebook promises to do everything in its power to keep users safe, the social network has a ways to go to meet this goal.