• Facebook has a system protecting elite users from being reprimanded for breaking content rules, the Wall Street Journal reports.
  • The company’s “XCheck” system has protected Donald Trump, Doug the Pug, and other “influential” figures.
  • But Facebook employees have expressed disapproval of the special treatment given to these users, the WSJ reports.

Facebook has a secret internal system that exempts 5.8 million users from having to follow the rules on its platform, according to the Wall Street Journal.

The paper on Monday published an investigation detailing how high-profile users of its services who are “newsworthy,” “influential or popular,” or “PR risky” don’t face the same enforcement action as ordinary users, citing company documents it had viewed.

A former Facebook employee said in a memo that the company “routinely makes exceptions for powerful actors,” per the Journal.


Figures like former President Donald Trump, soccer star Neymar da Silva Santos Júnior, Sen. Elizabeth Warren, and even Doug the Pug are covered by the system, nicknamed “XCheck” or “cross check.” The system was created in response to the shortcomings of Facebook’s dual human and AI moderation processes.

But as the Journal reported, XCheck has led to a bevy of other problems.


Once users are added to the list, it is harder for moderators to take action against them. That was the case with Neymar, who posted to his Facebook and Instagram accounts his WhatsApp exchanges with a woman who had accused him of rape. The screenshots showed her name and nude photos of her.

Neymar’s sharing of “nonconsensual intimate imagery” would have prompted Facebook to delete the post, but since Neymar was covered by XCheck, moderators were blocked from removing the content, which was then seen by 56 million online users.

Less than 10% of the content that XCheck flagged to the company as needing attention was actually reviewed, per a document reported by the paper. Facebook spokesperson Andy Stone told the Journal that the review rate grew in 2020 but did not provide evidence to support that assertion.

Most Facebook employees have the power to add users to the XCheck system for “whitelisting” status, a term used to describe high-profile accounts that don’t have to follow the rules. But the Journal viewed a 2019 audit that found Facebook doesn’t always keep a record of whom it whitelists and why, which poses “numerous legal, compliance, and legitimacy risks for the company and harm to our community.”

Facebook employees, including an executive who led its civic team, expressed disapproval of the company’s practice of doling out special treatment to some users and said it did not align with Facebook’s values, the paper reported.


“Having different rules on speech for different people is very troubling to me,” one wrote in a memo viewed by the Journal. Another employee also said Facebook is “influenced by political considerations” when making content moderation decisions, the paper reported.

Facebook acknowledged XCheck and its shortcomings years ago and told the Journal that it’s trying to end its whitelisting practice. Company documents also show Facebook’s intention to wind down the system: a product manager proposed barring employees from adding new users to XCheck as a solution.

Some of the company documents will be handed over to the Securities and Exchange Commission and Congress by a person who is requesting federal whistleblower protection, per the WSJ.

In a series of tweets, Facebook communications director Andy Stone stressed that the company had already disclosed its “cross-check” system publicly in 2018, after Channel 4 News posed questions about the practice.


“In the end, at the center of this story is Facebook’s own analysis that we need to improve the program,” Stone tweeted. “We know our enforcement is not perfect and there are tradeoffs between speed and accuracy.”

Facebook CEO Mark Zuckerberg has long touted one of his signature taglines: that Facebook’s leaders don’t want the platform to be the “arbiters of truth,” deciding what is true or false and then leaving up or removing content accordingly.

But that hands-off approach has put Facebook in a thorny position, especially in recent years, as critics say misinformation runs rampant on the site and some Republicans crusade against the company for allegedly serving a liberal agenda and discriminating against conservatives online.


Facebook has made several moves in response to that scrutiny: reports surfaced in June that the company would stop exempting politicians from enforcement of its content rules.

Facebook did not immediately respond to Insider’s request for comment.

Read the full report from The Wall Street Journal here.
