Meta’s Oversight Board says the company unfairly removed Israel-Hamas war videos


The Meta Oversight Board said removing the content could impact “freedom of expression and access to information” in the war. | Illustration: Nick Barclay / The Verge

Meta’s Oversight Board has criticized the company’s automated moderation tools for being too aggressive after two videos depicting hostages, injured civilians, and possible casualties in the Israel-Hamas war were — it says — unfairly removed from Facebook and Instagram. In a report published on Tuesday, the external review panel determined that the posts should have remained live and that removing the content came at a high cost to “freedom of expression and access to information” in the war. (A warning for our readers: the following descriptions of the content may be disturbing.)

One of the removed videos, posted to Facebook, depicts an Israeli woman during the October 7th attack on Israel by Hamas, pleading with kidnappers who were taking her hostage not to kill her. The other video was published on Instagram and shows what appears to be the aftermath of an Israeli strike on or near al-Shifa Hospital in Gaza City. The post contains footage of killed or injured Palestinians, including children.

The board says that, in the case of the latter video, both the removal and the rejection of the user’s appeal to restore the footage were carried out by Meta’s automated moderation tools, without any human review. The board reviewed the decision on an “accelerated timeline of 12 days,” and once it took up the case, the videos were restored with content warning screens.

In its report, the board found that the moderation thresholds, which had been lowered after the October 7th attack to more easily catch violating content, “also increased the likelihood of Meta mistakenly removing non-violating content related to the conflict.” The board says that the lack of human-led moderation during these types of crises can lead to the “incorrect removal of speech that may be of significant public interest” and that Meta should have been swifter to allow content “shared for the purposes of condemning, awareness-raising, news reporting or calling for release” with a warning screen applied.

The board also criticized Meta for demoting the two reviewed posts once warning screens were applied, preventing them from appearing as recommended content to other Facebook and Instagram users, despite the company acknowledging that the posts were intended to raise awareness. Meta has since responded to the board’s decision to overturn the removals, saying that because the panel provided no recommendations, there will be no further updates to the case.

Meta is hardly the only social media giant being scrutinized for its handling of content surrounding the Israel-Hamas war. Verified users on X (formerly Twitter) have been accused of being “misinformation super-spreaders” by misinformation watchdog organization NewsGuard. TikTok and YouTube are also being scrutinized under the EU’s Digital Services Act following a reported surge of illegal content and disinformation on the platforms, and the EU has opened a formal investigation into X. The Oversight Board case, by contrast, highlights the risks of overmoderation — and the tricky line platforms have to walk.

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.

