It’s clear that Facebook’s content moderation systems need to be overhauled.
On Thursday, Meta’s Oversight Board announced that it had reversed two of Facebook’s decisions to remove content from its platform. The independent group’s conclusions point to major flaws in Facebook’s content moderation protocols in two main areas: the platform’s use of automated systems to remove content and the removal of newsworthy content by human moderators.
The first Oversight Board case concerns a Facebook user in Colombia who, in September 2020, posted a cartoon depicting police brutality by Colombia’s National Police. Facebook removed the user’s post 16 months later, when the company’s automated systems matched the cartoon to an image stored in a Media Matching Service bank.
The Oversight Board decided that Facebook was wrong to remove the user’s post, because the image does not violate Facebook’s rules and should never have been added to the Media Matching Service bank.
And according to the Oversight Board, this user wasn’t the only one affected. In total, 215 users appealed the removal of posts that included this image, and 98 percent of those appeals against Meta succeeded. Yet the cartoon remained in the bank, continuing to trigger automated detections and subsequent removals. Meta removed the image from the Media Matching Service bank only when the Oversight Board decided to take up this particular case.
In the second case, the Oversight Board determined that Meta erroneously removed a news report about the Taliban. In January 2022, an India-based newspaper posted a link to an article on its website about the Taliban’s announcement that it would reopen schools for women and girls. Meta determined that the post violated its Dangerous Individuals and Organizations policy because it interpreted the post as “praising” the Taliban.
As a result, Meta removed the post and restricted the Indian newspaper’s access to some Facebook features, such as Facebook Live. The newspaper attempted to appeal the decision, but the appeal was never reviewed due to a shortage of Urdu-speaking reviewers at the company.
Once again, when the Oversight Board decided to take up the case, Meta reversed its decision, reinstated the content, and lifted the restrictions on the Facebook page. The Oversight Board determined that simply reporting on newsworthy events does not violate Facebook’s policies.
While the users affected in these specific cases may be fairly small in number or reach, the Oversight Board has taken the opportunity to recommend broader changes to Facebook’s content moderation systems, both automated and human-reviewed.
Founded in 2018, the Oversight Board was created to serve as a kind of higher court for Meta’s content moderation decisions. The organization issued decisions on its first cases in January 2021. One of those early rulings was heavily criticized for calling for the restoration of a deleted post that Muslim activist groups had deemed hate speech. But the Oversight Board’s most notable matter to this point was easily its decision to uphold Donald Trump’s suspension from Facebook. The former president was suspended from the platform after the violent riot at the Capitol on January 6.
But the Oversight Board’s decision forced Meta to set a time frame for Trump’s suspension. Shortly after the board’s 2021 decision, Meta said it would consider allowing Trump back on its platforms in January 2023. That may have seemed far off in June 2021, but now only a few months remain. If and when Trump returns to Facebook next year, don’t be surprised to see his name in one or two Oversight Board cases…or twenty.