Oversight Board Finds Meta's Removal of Two Israel-Hamas Videos Unjustified

Meta's automated tools mistakenly removed two videos related to the Israel-Hamas conflict, hindering users' access to content depicting human suffering on both sides. The Meta Oversight Board highlighted the unnecessary moderation and its potential impact on users' understanding of the conflict.

Two videos related to the Israel-Hamas war were mistakenly removed by Meta's automated content moderation tools, according to the Meta Oversight Board, and the removals may have prevented users from accessing content related to human suffering on both sides of the conflict. The findings come from the Oversight Board's first "expedited review" and reflect the increased scrutiny social media companies face over their handling of conflict-related content.

The board reversed Meta's initial decision to delete two pieces of content, urging Meta to uphold users' rights to "freedom of expression and their ability to communicate during this crisis."

Michael McConnell, a co-chair of the board, stated, "The Board's focus is to safeguard the freedom of expression for all individuals regarding these tragic events, while ensuring that none of the content incites violence or hate. These posts are crucial not only for the individuals sharing them, but also for users globally seeking accurate and varied information about significant events."

Meta responded to the board's decision by stating that, because it had already reinstated the two pieces of content before the board ruled, it would not be taking any additional action. "Both expression and safety are important to us and the people who use our services," the company noted in a blog post.

Meta's Oversight Board, composed of specialists in freedom of expression and human rights, reviewed the company's handling of content related to the Israel-Hamas war. Often likened to a Supreme Court for Meta, the board gives users the opportunity to appeal content decisions on the company's platforms and issues recommendations to Meta on content moderation and broader policy matters.

Earlier this month, the board announced that it would conduct a faster review in this case due to the potential urgent real-world impact of content decisions related to the war. Following the outbreak of the Israel-Hamas conflict, the board reported a significant increase in daily user appeals regarding content linked to the Middle East and North Africa region. In October, Meta informed CNN that it had set up a special operations center with a team of experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to the rapidly evolving situation. It also stated that it was collaborating with third-party fact checkers in the region.

The Oversight Board announced on Tuesday that Meta had implemented temporary measures in response to the outbreak of the conflict, including lowering the thresholds at which its automated systems remove content that potentially violates its hate speech, violence and incitement, and bullying and harassment policies.

In effect, the board said, Meta used its automated tools more aggressively to remove prohibited content, an approach that prioritized safety but also raised the risk of mistakenly removing non-violating content related to the conflict. As of December 11, Meta had yet to return the content moderation thresholds for its automated systems to their normal levels.

The board's review examined two pieces of content: a video shared on Instagram depicting the aftermath of a strike near Al-Shifa Hospital in Gaza City, and a video shared on Facebook showing two hostages being abducted by Hamas militants.

The first video depicted "individuals, including children, injured or deceased, on the ground and/or in distress." According to the board, the video's caption, in Arabic and English, made reference to the Israeli army and indicated that the hospital had been "targeted by the usurping occupation."

Meta's automated systems initially removed the post for violating its rules on graphic and violent content. When a user appealed the decision to restore the video, Meta's systems automatically rejected the appeal, citing a high confidence level in their determination that the content violated its rules. However, after the Oversight Board took up the case, Meta made the video viewable with a warning about its disturbing content. The warning restricts the video from being viewed by minors and from being recommended to adult users.

On Tuesday, the Oversight Board stated that the video should not have been removed in the first place and criticized Meta for limiting the video's circulation, stating that it did not align with the company's responsibility to respect freedom of expression.

The board reviewed a second video showing a woman on a motorbike and a man being taken away by Hamas militants, with a caption prompting viewers to gain a "deeper understanding" of the October 7 attack on Israel. Meta initially took down the post for violating its policy against dangerous organizations and individuals, which prohibits sharing imagery of terrorist attacks on visible victims. (Meta considers Hamas a dangerous organization and classified the October 7 attack as a terrorist attack.)

Following the board's decision to take up the case, the company reinstated the video with a warning screen, part of a broader effort to make limited exceptions to the dangerous organizations and individuals policy for content that denounces, raises awareness of, or reports on kidnappings, or that advocates for the release of hostages. As with the first video, the warning screen prevents minors from viewing the video and restricts it from being recommended to other Facebook users.

The board has stated that, like the first video, the content should not have been removed and that preventing the video from being recommended is not in line with Meta's human rights obligations.

Meta said Tuesday that it would not change how it handles recommendations for the videos reviewed by the board. The board disagreed with the existing limitations but stopped short of issuing a formal recommendation on how they should be addressed. According to the board's decision, excluding content that raises awareness of potential human rights abuses, conflicts, or acts of terrorism from recommendations is neither necessary nor proportionate, given the significant public interest in such content.