The Oversight Board, often described as the “supreme court” of Meta, the parent company of Facebook and Instagram, said that both platforms are censoring posts by Palestinian and pro-Palestinian users.
According to Nighat Dad, who has served on the Oversight Board since its inception, Meta reduces the visibility of posts to limit the reach of “pro-Palestinian content.”
Dad emphasized Meta’s lack of a clear policy and transparency on the matter, which leaves users uncertain about where they stand. Meta has previously faced criticism from human rights organizations and users, and dissent has also emerged within the company itself.
Last week, Meta employees demanded an end to censorship of pro-Palestinian views.
Meta employees sent a letter to Zuckerberg
Allegations of censorship against Facebook and Instagram were previously documented in Human Rights Watch reports from 2021 and 2023, which attributed the removal and suppression of pro-Palestinian content to inconsistencies in Meta’s content moderation policies.
The reports also revealed instances where posts by Palestinians and their supporters, including those documenting human rights violations, were removed or hidden on Instagram and Facebook.
Meta employees expressed disappointment and disbelief in a letter sent last week to Meta’s chairman and chief executive, Mark Zuckerberg, citing a lack of attention to the Palestinian community and its allies and calling for an end to censorship of pro-Palestinian views.
Nighat Dad echoed concerns about Meta’s “dangerous individuals and organizations” list, which she considers problematic because it fails to adequately distinguish neutral messages from those endorsing terrorism.
Instagram labels those who use the Palestinian flag as “terrorists”
Concerns have also been raised about Meta’s response to posts related to the conflict between Israel and Palestine, with allegations of increased censorship following attacks by Hamas in October 2023.
Instagram reportedly added the word “terrorist” to the profiles of some Palestinian users who wrote “alhamdulillah” and used the Palestinian flag emoji in their profile descriptions after the attacks.
Meta apologized, attributing the error to a faulty automatic translation. The controversy surrounding Meta’s handling of content moderation continues, with ongoing criticism from both internal and external stakeholders.
Source: Newsroom