Meta’s internal oversight board slammed the company’s policy of giving VIP users — including celebrities, politicians and business associates — preferential treatment on Facebook and Instagram.
If you’re a regular user of either platform, your speech will fall under the tech giant’s often controversial content moderation policies. However, if your name is Donald Trump or Kim Kardashian, or if you just have a really high follower count, you have more leeway to share and say things that are against the rules.
The internal program on Facebook and Instagram, known as cross-check, shields celebrities and other high-profile users from having their content automatically removed by the company’s algorithms.
Revelations from whistleblower Frances Haugen, who testified in detail about the program before Congress, appear to have informed the oversight board’s assessment. Haugen has said the company puts “profit over safety.”
“The board is concerned about how Meta has prioritized business interests when moderating content,” the report said. The program, it said, “provides additional protection for the expression of certain users.”
When the oversight board began its investigation into the cross-check program, Meta was making an astonishing 100 million content enforcement attempts every day.
So even if the company could make such decisions with 99 percent accuracy, an impossible standard, it would still be making a million mistakes a day.
One of the board’s key findings, detailed in a 57-page report, is that content that violates Meta’s own rules often stays up for more than five days when the user posting it is a VIP.
The company led by Mark Zuckerberg doesn’t currently offer much transparency to the public about how cross-checking works.
“Currently, Meta does not inform users that they are on cross-check lists and does not publicly share its procedures for creating and reviewing these lists,” the board wrote in a summary of its work, which began in October 2021.
“For example, it is unclear whether entities that consistently post violating content are kept on cross-check lists based on their profile.”
The board recommends that Meta publicly mark the pages and accounts of all entities receiving list-based protection in the following categories: “including all government actors and political candidates, all business partners, all media actors, and all other public figures due to the commercial benefit to the company.”
The board also wrote that Meta’s cross-checking systems operate with a “consistent backlog of cases.”
“Meta told the board that it can take more than five days on average to make a decision about the content of users on its cross-check lists,” the oversight group noted. “This means that, due to cross-checking, content found to violate Meta’s rules will be left on Facebook and Instagram when it is most viral and can cause harm.”
In practice, this delay has had damaging consequences.
For example, Brazilian football star Neymar posted a video in 2019 with nude photos of a woman who had accused him of sexual assault.
Due to the cross-check program, the post remained up for more than a day and was viewed over 100 million times before it was finally removed.
In its report, the board asks why the athlete was not suspended, and notes that the incident came to light only as a result of Haugen’s revelations.
In total, the board issued 32 recommendations and gave Meta 90 days to respond. Because the board is advisory, however, the company is under no obligation to implement its suggestions.