Meta removed two Israel-Hamas videos unnecessarily, Oversight Board says

By Clare Duffy, CNN

New York (CNN) — Facebook-parent Meta’s automated tools to police potentially harmful content unnecessarily removed two videos related to the Israel-Hamas war, the Meta Oversight Board said in a statement Tuesday. The moderation technology may have prevented users from viewing content related to human suffering on both sides of the conflict, it said.

The comments are the result of the Oversight Board’s first “expedited review,” highlighting the intense scrutiny facing social media companies over their handling of content related to the conflict.

The board overturned Meta’s original decisions to remove the two pieces of content. As part of its ruling, the group urged Meta to respect users’ rights to “freedom of expression … and their ability to communicate in this crisis.”

“The Board focused on protecting the right to the freedom of expression of people on all sides about these horrific events, while ensuring that none of the testimonies incited violence or hatred,” Michael McConnell, a co-chair of the board, said in a statement. “These testimonies are important not just for the speakers, but for users around the world who are seeking timely and diverse information about ground-breaking events.”

In response to the board’s decision, Meta said that because it had already reinstated the two pieces of content prior to the board’s decision, it would take no further action. “Both expression and safety are important to us and the people who use our services,” the company said in a blog post.

Meta’s Oversight Board is an entity made up of experts in areas such as freedom of expression and human rights. It is often described as a kind of Supreme Court for Meta, as it allows users to appeal content decisions on the company’s platforms. The board makes recommendations to the company about how to handle certain content moderation decisions, as well as broader policy suggestions.

The board said earlier this month that it decided to take up a faster review in this case because content decisions related to the war could have “urgent real-world consequences.” In the weeks after the Israel-Hamas conflict broke out, the board said it saw a nearly three-fold increase in daily average user appeals of decisions on content “related to the Middle East and North Africa region.”

Meta told CNN in October that it had established “a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation,” and that it was coordinating with third-party fact checkers in the region.

The Oversight Board said Tuesday that following the conflict’s outbreak, Meta put in place temporary measures to address potentially dangerous content, including lowering the thresholds for automatic removal of content that could violate its hate speech, violence and incitement, and bullying and harassment policies.

“In other words, Meta used its automated tools more aggressively to remove content that might be prohibited,” the board said, adding that the company took those steps to prioritize safety but that the move also “increased the likelihood of Meta mistakenly removing non-violating content related to the conflict.” The board said that as of December 11, Meta had not returned the content moderation thresholds for its automated systems to normal levels.
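
The trade-off the board describes can be made concrete with a minimal, hypothetical sketch (the names, scores, and thresholds below are illustrative assumptions, not Meta’s actual system): the lower the confidence score at which an automated classifier removes a post, the more violating content it catches, but the more non-violating content it mistakenly takes down.

```python
# Hypothetical sketch, not Meta's actual code: the names, scores, and
# thresholds are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    violation_score: float  # classifier confidence (0.0-1.0) that the post violates policy


NORMAL_REMOVAL_THRESHOLD = 0.90  # assumed everyday setting
CRISIS_REMOVAL_THRESHOLD = 0.70  # assumed lowered "temporary measure"


def should_auto_remove(post: Post, threshold: float) -> bool:
    """Auto-remove the post when classifier confidence meets the threshold."""
    return post.violation_score >= threshold


# A borderline documentary video scoring 0.75 stays up under the normal
# threshold but is mistakenly removed under the lowered crisis threshold.
borderline = Post("war_footage_1", violation_score=0.75)
print(should_auto_remove(borderline, NORMAL_REMOVAL_THRESHOLD))  # False
print(should_auto_remove(borderline, CRISIS_REMOVAL_THRESHOLD))  # True
```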

The board’s review looked at two pieces of content: a video posted to Instagram that appeared to show the aftermath of a strike outside the Al-Shifa Hospital in Gaza City and another video posted to Facebook showing two hostages being kidnapped by Hamas militants.

The first video appeared to show “people, including children, injured or dead, lying on the ground and/or crying.” A caption under the video in Arabic and English referenced the Israeli army, stating that the hospital had been “targeted by the ‘usurping occupation,’” the board said.

Meta’s automated systems initially removed the post for violating its rules on graphic and violent content. A user appealed, asking for the video to be restored, but Meta’s systems automatically rejected the appeal after determining with “a high confidence level” that the content violated the rules. After the board decided to take up the case, Meta made the video viewable with a warning that the content is disturbing; the warning also prevents the video from being viewed by minors and from being recommended to adult users.
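
The appeal and warning-screen flow described above can likewise be sketched in a minimal, hypothetical form (the confidence cutoff, names, and rules are assumptions for illustration, not Meta’s actual system): an appeal is rejected automatically while the classifier remains highly confident, and a post restored behind a warning screen is hidden from minors and excluded from recommendations.

```python
# Hypothetical sketch of the appeal and warning-screen flow described above;
# the confidence cutoff and rules are illustrative assumptions, not Meta's code.
from dataclasses import dataclass

AUTO_REJECT_CONFIDENCE = 0.95  # assumed cutoff for the "high confidence level"


@dataclass
class Decision:
    removed: bool
    warning_screen: bool = False


def handle_appeal(violation_score: float) -> Decision:
    """Reject the appeal automatically while classifier confidence stays high;
    otherwise restore the post behind a 'disturbing content' warning screen."""
    if violation_score >= AUTO_REJECT_CONFIDENCE:
        return Decision(removed=True)  # appeal auto-rejected, post stays down
    return Decision(removed=False, warning_screen=True)


def can_view(decision: Decision, viewer_is_minor: bool) -> bool:
    """Warning-screened posts remain hidden from minors entirely."""
    return not decision.removed and not (decision.warning_screen and viewer_is_minor)


def can_recommend(decision: Decision) -> bool:
    """Warning-screened posts are excluded from recommendation surfaces."""
    return not decision.removed and not decision.warning_screen


restored = handle_appeal(violation_score=0.80)
print(can_view(restored, viewer_is_minor=True))   # False: hidden from minors
print(can_view(restored, viewer_is_minor=False))  # True: shown behind a warning
print(can_recommend(restored))                    # False: not recommended to anyone
```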

The Oversight Board said Tuesday that the video should not have been removed in the first place, and criticized Meta’s move to limit the video’s circulation, saying it “does not accord with the company’s responsibilities to respect freedom of expression.”

The second video reviewed by the board showed a woman on a motorbike and a man being marched away by Hamas militants, with a caption urging people to watch to gain a “deeper understanding” of the October 7 attack on Israel.

Meta initially removed the post for violating its dangerous organizations and individuals policy, which prohibits imagery of terror attacks showing visible victims, even when shared to raise awareness of such an attack. (Meta designates Hamas as a dangerous organization under the policy and has labeled the October 7 attack a terrorist attack.)

The company reinstated the video with a warning screen after the board took up the case, part of a broader move to allow limited exemptions to its dangerous organizations and individuals policy for content meant to condemn, raise awareness of, or report on the kidnappings, or to call for the hostages’ release. As with the other video, the warning screen prevented minors from viewing the video and kept it from being recommended to other Facebook users.

As with the first video, the board said the content should not have been removed, and that preventing the video from being recommended puts Meta out of compliance with its human rights responsibilities.

Meta said Tuesday that it would not change the limits on recommending the two videos reviewed by the board: while the board disagreed with those limits, it did not issue a formal recommendation on how they should be handled.

“The Board finds that excluding content raising awareness of potential human-rights abuses, conflicts, or acts of terrorism from recommendations is not a necessary or proportionate restriction on freedom of expression, in view of the very high public interest in such content,” it said in its decision.

The-CNN-Wire
™ & © 2023 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
