Facebook revamps controversial content moderation process for VIPs




New York (CNN) —

Facebook-parent Meta on Friday announced a revamp of its “cross-check” moderation system after facing criticism that the program gave VIPs special treatment by subjecting their posts to a different review process than the one applied to regular users.

But Meta stopped short of adopting all the recommended changes that had previously been put forward by its own Oversight Board, including a suggestion to publicly identify which high-profile accounts qualify for the program.

The cross-check program came under fire in November 2021 after a report from the Wall Street Journal indicated that the system shielded some VIP users — such as politicians, celebrities, journalists and Meta business partners like advertisers — from the company’s normal content moderation process, in some cases allowing them to post rule-violating content without consequences.

As of 2020, the program had ballooned to include 5.8 million users, the Journal reported. Meta’s Oversight Board said in the wake of the report that Facebook had failed to provide it with crucial details about the system. At the time, Meta said that criticism of the system was fair, but that cross-check was created in order to improve the accuracy of moderation on content that “could require more understanding.”

Meta’s Oversight Board in a December policy recommendation called out the program for being set up to “satisfy business concerns” and said it risked doing harm to everyday users. The board — an entity financed by Meta but which says it operates independently — urged the company to “radically increase transparency” about the cross-check system and how it works.

On Friday, Meta said it would implement in part or in full many of the more than two dozen recommendations the Oversight Board made for improving the program.

Among the changes it has committed to making, Meta says it will aim to distinguish between accounts included in the enhanced review program for business reasons versus human rights reasons, and detail those distinctions to the board and in the company’s transparency center. Meta will also refine its process for temporarily removing or hiding potentially harmful content while it is pending additional review. The company also said it would work to ensure that cross-check content reviewers have the appropriate language and regional expertise “whenever possible.”

The company, however, declined to implement some recommendations, such as publicly marking the pages of state actors and political candidates, business partners, media actors and other public figures included in the cross-check program. The company said that such public identifiers could make those accounts “potential targets for bad actors.”

“We are committed to maintaining transparency with the board and the public as we continue to execute on the commitments we are making,” Meta said in a policy statement regarding the cross-check program.

The Oversight Board said in a tweet Friday that the company’s proposed changes to the cross-check program “could render Meta’s approach to mistake prevention more fair, credible and legitimate, addressing the core critiques” in its December policy recommendation.

