Instagram and Facebook users attempting to share scenes of devastation from a crowded hospital in Gaza City claim their posts are being suppressed, despite previous company policies protecting the publication of violent, newsworthy scenes of civilian death.
Late Tuesday, amid a 10-day bombing campaign by Israel, the Gaza Strip’s al-Ahli Hospital was rocked by an explosion that killed and wounded hundreds of civilians. Footage of the hospital’s flaming exterior, as well as of dead and wounded civilians, including children, quickly emerged on social media in the aftermath of the attack.
While the Palestinian Ministry of Health in the Hamas-run Gaza Strip blamed the explosion on an Israeli airstrike, the Israeli military later said the blast was caused by an errant rocket fired by militants from the Gaza-based group Islamic Jihad.
While widespread electrical outages and Israel’s destruction of Gaza’s telecommunications infrastructure have made getting documentation out of the besieged territory difficult, some purported imagery of the hospital attack making its way to the internet appears to be activating the censorship tripwires of Meta, the social media giant that owns Instagram and Facebook.
Since Hamas’s surprise attack against Israel on October 7 and amid the resulting Israeli bombardment of Gaza, groups monitoring regional social media activity say censorship of Palestinian users is at a level not seen since May 2021, when violence flared between Israel and Gaza following Israeli police incursions into Muslim holy sites in Jerusalem.
Two years ago, Meta blamed the abrupt deletion of Instagram posts about Israeli military violence on a technical glitch. On October 15, Meta spokesperson Andy Stone again attributed claims of wartime censorship to a “bug” affecting Instagram. (Meta did not immediately respond to a request for comment.)
Since the latest war began, Instagram and Facebook users inside and outside of the Gaza Strip have complained of deleted posts, locked accounts, blocked searches, and other impediments to sharing timely information about the Israeli bombardment and general conditions on the ground. 7amleh, a Palestinian digital rights group that collaborates directly with Meta on speech issues, has documented hundreds of user complaints of censored posts about the war, according to spokesperson Eric Sype, far outpacing the deletion levels seen two years ago.
“It’s censorship mayhem like 2021,” Marwa Fatafta, a policy analyst with the digital rights group Access Now, told The Intercept. “But it’s more sinister given the internet shutdown in Gaza.”
In other cases, users have successfully uploaded graphic imagery from al-Ahli to Instagram, suggesting that the takedowns are not the result of any formal policy on Meta’s end, but rather a product of the company’s at times erratic combination of outsourced human moderation and automated image-flagging software.
Alleged Photo of Gaza Hospital Bombing
One image rapidly circulating on social media platforms following the blast depicts what appears to be the flaming exterior of the hospital, where a clothed man is lying beside a pool of blood, his torso bloodied.
According to screenshots shared with The Intercept by Fatafta, Meta platform users who shared this image had their posts removed or were prompted to delete them because the picture violated policies forbidding “nudity or sexual activity.” Mona Shtaya, a nonresident fellow at the Tahrir Institute for Middle East Policy, confirmed that she had also received reports of two instances in which this same image was deleted. (The Intercept could not independently verify that the image was of al-Ahli Hospital.)
One screenshot shows a user notified that Instagram had removed their upload of the photo, noting that the platform forbids “showing someone’s genitals or buttocks” or “implying sexual activity.” The underlying photo does not appear to show anything resembling either category of image.
In another screenshot, a Facebook user who shared the same image was told their post had been uploaded, “but it looks similar to other posts that were removed because they don’t follow our standards on nudity or sexual activity.” The user was prompted to delete the post. The language in the notification suggests the image may have triggered one of the company’s automated, software-based content moderation systems, as opposed to a human review.
Meta has previously distributed internal policy language instructing its moderators not to remove gruesome documentation of Russian airstrikes against Ukrainian civilians, though no such carveout is known to have been provided for Palestinians, either now or in the past. Last year, a third-party audit commissioned by Meta found that systemic, unwarranted censorship of Palestinian users amounted to a violation of their human rights.