
Inside ‘Facebook Jail’: The secret rules that put users in the doghouse

The site’s Oversight Board is poised to rule on Donald Trump’s case. Breaking Facebook’s rules can mean blocked privileges, but many guidelines aren’t made public.

Facebook’s closely watched decision on whether Donald Trump can use the platform is expected late on Wednesday or Thursday (AEST). Picture: AFP

In Facebook Jail, many users are serving time for infractions they don’t understand.

Colton Oakley was restricted after ranting about student debt. The recent graduate of the State University of New York at New Paltz posted that anyone who was mad about loan cancellation was “sad and selfish.” His sentence: three days without posting on Facebook.

Alex Gendler, a freelance writer in Brooklyn, got a similar ban after sharing a link to a story in Smithsonian magazine about tribal New Guinea. Nick Barksdale, a history teacher in Oklahoma, served 30 days recently after jokingly telling a friend “man, you’re spewing crazy now!” None of the three quite understand what they did wrong.

“If you use the term ‘crazy,’ does that automatically get you banned?” asked Mr Barksdale.

The plight of baffled users caught in Facebook’s impenetrable system for adjudicating content has reinforced the company’s reputation for heavy-handed and inept policing of its online platforms. The problem, which has been mounting for years, is increasingly acute as legislators and the public focus on the vast power social-media companies hold over the flow of information.

The company’s newly formed Oversight Board — a group of 20 lawyers, professors and other independent experts who consider appeals to decisions made by Facebook — has been charged with interpreting Facebook’s numerous detailed rules governing everything from the depiction of graffiti to swearing at newsworthy figures.

The board’s most closely watched decision is expected late on Wednesday or Thursday (AEST) — whether Facebook appropriately applied its rules when it booted former US president Donald Trump indefinitely from the platform.

Users who run afoul of the secret rules can spend time in what many now call “Facebook Jail”. Picture: AFP

Legislators have repeatedly grilled Mark Zuckerberg about the issue, prompting the CEO to repeat his mantra that such delicate work shouldn’t be handled by a private company alone. Still, the challenge remains: how best to police a platform that sees billions of posts, comments and photos every day.

In its earlier decisions, the board has zeroed in on Facebook’s legion of rules, calling them unclear and “difficult for users to understand”. Facebook says it is taking steps to address the Oversight Board’s recommendations and “build out better customer support for our products”. “While we’re transparent about our policies, we understand that people can still be frustrated by our decisions, which is why we’re committing to doing more,” a Facebook spokeswoman, Emily Cain, said in a written statement.

Since it began taking cases in October, the Oversight Board has received more than 220,000 appeals from users, and issued eight rulings — six of them overturning Facebook’s initial decision.

John Taylor, the board’s spokesman, says the intention was never “to offer the hot takes on any particular issue of the day. The point of the board is to render a decision on the most difficult content decisions facing the company”.

Facebook has introduced many new rules in recent years, often in response to specific complaints from legislators and various interest groups. The rules are designed to protect users and to guide the army of outside contractors who work on content moderation. These internal guidelines sit alongside the company’s Community Standards listed on its website, but are not themselves made public.

Some of the guidelines include detailed examples to illustrate fine distinctions.

Moderators were instructed in documents viewed by The Wall Street Journal to remove this statement: “If you vote by mail, you will get Covid!” The documents also said this statement was acceptable: “If you vote by mail, be careful, you might catch COVID-19!”

Around voting and the 2020 election, moderators were instructed to remove this sentence: “My friends and I will be doing our own monitoring of the polls to make sure only the right people vote.” But not this one: “I heard people are disrupting going to the polls today. I’m not going to the polling station.”
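
These paired examples read like labeled test data. A minimal sketch of how such remove/allow guidance might be encoded for reviewers or for automated regression tests appears below; the rule names and the test-case structure are assumptions for illustration, not Facebook’s actual tooling.

```python
# Illustrative only: the remove/allow pairs reported by the Journal, encoded
# as labeled test cases for a hypothetical policy checker. The rule names
# and structure are assumptions, not Facebook's real system.

POLICY_EXAMPLES = [
    # (rule, text, expected_action)
    ("voting_misinfo",
     "If you vote by mail, you will get Covid!", "remove"),
    ("voting_misinfo",
     "If you vote by mail, be careful, you might catch COVID-19!", "allow"),
    ("voter_intimidation",
     "My friends and I will be doing our own monitoring of the polls "
     "to make sure only the right people vote.", "remove"),
    ("voter_intimidation",
     "I heard people are disrupting going to the polls today. "
     "I'm not going to the polling station.", "allow"),
]

for rule, text, expected in POLICY_EXAMPLES:
    print(f"[{rule}] {expected}: {text}")
```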

Facebook founder and CEO Mark Zuckerberg. Picture: Getty Images
Facebook founder and CEO Mark Zuckerberg. Picture: Getty Images

The Facebook spokeswoman says the differences are small but “enough of a distinction that we called them out, and why we have this detailed documentation for our content reviewers”. The spokeswoman says Facebook reviews two million pieces of content a day. Mr Zuckerberg has said the company makes the wrong call in more than 10 per cent of cases — meaning about 200,000 decisions could be wrong each day.
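
That estimate follows directly from the two figures quoted; a quick back-of-the-envelope check:

```python
# Reproducing the arithmetic quoted above: two million daily reviews at a
# roughly 10 per cent error rate.
reviews_per_day = 2_000_000
error_rate = 0.10  # Mr Zuckerberg: wrong call in "more than 10 per cent" of cases

print(f"{reviews_per_day * error_rate:,.0f} potentially wrong decisions a day")  # 200,000
```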

Users who run afoul of Facebook’s rules can spend time in what many now call “Facebook Jail”, losing commenting and posting privileges for anywhere from 24 hours to 30 days or, in more serious cases, losing their accounts indefinitely. A user typically has to rack up multiple strikes before facing a ban, but Facebook doesn’t tell users how many it takes, saying in a 2018 blog post that “we don’t want people to game the system”.
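
Facebook hasn’t published how its strike thresholds escalate, so any concrete model is guesswork. A minimal sketch of one way such a penalty ladder could work, with every threshold and duration an assumption drawn only from the range of penalties described in this article:

```python
from dataclasses import dataclass

# Hypothetical escalating-penalty ladder. Facebook doesn't disclose how many
# strikes trigger a ban, so the thresholds and durations here are assumptions.
PENALTY_LADDER_HOURS = [24, 72, 7 * 24, 30 * 24]  # 1, 3, 7 and 30 days

@dataclass
class UserRecord:
    strikes: int = 0

    def add_strike(self) -> str:
        """Record one violation and return the resulting penalty."""
        self.strikes += 1
        if self.strikes > len(PENALTY_LADDER_HOURS):
            return "account restricted indefinitely"
        hours = PENALTY_LADDER_HOURS[self.strikes - 1]
        return f"posting and commenting blocked for {hours} hours"

user = UserRecord()
for _ in range(5):
    print(user.add_strike())
```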

Facebook doesn’t release the number of accounts it restricts. When content is removed or users are blocked, users usually receive a notice saying they have violated Community Standards. The notice typically indicates which broad category has been violated. Separately from the Oversight Board’s decisions, Facebook restores thousands of pieces of previously removed content each quarter after user appeals.

Facebook recently has turned more toward automation to help guide its decisions, relying on artificial intelligence and algorithms to take down content and also decide on user appeals, according to people familiar with the company and more than two dozen users interviewed by the Journal. The result is more frustration, with some users wondering how Facebook could have made a decision on their content in only a matter of seconds.
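
A common design for that kind of automated pipeline routes a classifier’s confidence score to one of a few outcomes, which would explain decisions arriving in seconds. The sketch below shows the general pattern only; the thresholds are purely illustrative and there is no claim this matches Facebook’s implementation.

```python
# Illustrative triage of a violation-classifier score. The thresholds and the
# three-way routing are assumptions; Facebook's actual pipeline isn't public.

AUTO_REMOVE_THRESHOLD = 0.95   # high confidence: act without human review
HUMAN_REVIEW_THRESHOLD = 0.60  # uncertain: queue for a human moderator

def route(violation_score: float) -> str:
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return "auto-remove"  # decided in seconds, as frustrated users describe
    if violation_score >= HUMAN_REVIEW_THRESHOLD:
        return "human review queue"
    return "leave up"

for score in (0.99, 0.75, 0.30):
    print(f"score {score:.2f} -> {route(score)}")
```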

A research paper from New York University last northern summer called the company’s approach to content moderation “grossly inadequate” and implored Facebook to stop outsourcing most of the work and to double the overall number of moderators.

“This is what you get when you build a system as big as this,” says Olivier Sylvain, a professor of law at Fordham University, who has researched Facebook and content moderation generally. “I don’t think it’s unhealthy or wrong for us to wonder if the benefits that flow from such a big service outweigh the confusion and harm.”

Mr Barksdale, a history teacher in Newcastle, Okla., has been banned from his Facebook page several times since last northern autumn, each time for reasons he says he doesn’t fully understand.

One time he was restricted from posting and commenting for three days after sharing a World War II-era photo of Nazi officials in front of the Eiffel Tower as part of a history discussion, with a brief description of the photo. He got a 30-day ban for trying to explain the term pseudoscience to one of his followers.

In March, after he joked with another history aficionado during a debate that “you’re spewing crazy now”, Facebook alerted him that he had been restricted for seven days. When Mr Barksdale clicked a button to appeal, Facebook disagreed and lengthened his ban to 30 days, saying six of his past posts had gone against the company’s Community Standards.

Mr Barksdale says he tries to follow Facebook’s Community Standards but hadn’t been aware he had committed so many infractions.

The Facebook spokeswoman says the company mistakenly removed Mr Barksdale’s comment. She says, in general, users can find their violations in the “support inbox” attached to their profile.

A giant digital sign at Facebook’s corporate headquarters campus in Menlo Park, California. Picture: AFP

Facebook’s Community Standards, the public rules, have expanded in recent years to include six major categories and 27 subcategories ranging from “violent and graphic content” to “false news”. Facebook’s policy on hate speech forbids direct attacks on people based on race, religion and other demographics. However, it allows such attacks if the words are used to raise awareness, or “in an empowering way”, according to the Community Standards. “But we require people to clearly indicate their intent. If intention is unclear, we may remove content.”

Internally, Facebook works from a far more specific and complicated set of guidelines, a more-than-10,000-word document called its “Implementation Standards”, which its more than 15,000 content moderators rely on to make decisions. Still more documents — known internally as “Operational Guidelines” and “Known Questions” — further explain the company’s rationale for its rules.

When a piece of content is flagged — either by a user or by Facebook’s algorithms — the post, photo or video is usually reviewed by moderators, whose job is to apply the rules Facebook has devised.

In one of the documents viewed by the Journal, the company forbade use of a “degrading physical description”, which it defined as “calling an individual’s appearance ugly, disgusting, repulsive, etc.” It gave as an example: “It’s disgusting and repulsive how fat and ugly John Smith is.” But, the document continued: “We do not remove content like ‘frizzy hair’, ‘lanky arms’, ‘broad shoulders’, etc. since ‘frizzy’, ‘lanky’, and ‘broad’ are not deficient or inferior, and therefore not degrading.”
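
The distinction turns on whether an adjective asserts deficiency. A deliberately naive sketch of how that line might be drawn, using only the words from the document’s own examples and making no claim about how Facebook’s systems actually work:

```python
# Naive sketch of the "degrading physical description" distinction quoted
# above. The word lists come only from the document's own examples; a real
# moderation system would need far more context than token matching.

DEGRADING = {"ugly", "disgusting", "repulsive", "fat"}  # assert deficiency -> remove
NEUTRAL = {"frizzy", "lanky", "broad"}                  # descriptive only -> keep

def is_degrading_description(text: str) -> bool:
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return bool(words & DEGRADING)

print(is_degrading_description("It's disgusting and repulsive how fat and ugly John Smith is."))  # True
print(is_degrading_description("Frizzy hair, lanky arms, broad shoulders."))                      # False
```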

Many users aren’t aware of the internal documents that help moderators interpret the public rules. In contrast, Alphabet’s Google search engine publishes the full set of 175 pages of rules that its 10,000 “search quality raters” use to evaluate search results.

“We want people to understand where we draw the line on different kinds of speech, and we want to invite discussion of our policies, but we don’t want to muddy the waters by inundating people with too much information,” the Facebook spokeswoman says.

In several rulings this year, the Oversight Board has urged Facebook to make its Community Standards clearer, with specificity closer to that of its internal documents.

The Wall Street Journal
