How Meta impacted the Israeli-Palestinian conflict in May 2021

BY PAULA MURESAN

27/10/2022

A recent study has shown how Meta (Facebook, Instagram and WhatsApp) impacted human rights during the outbreak of violence that followed the eviction of Palestinian families from the Sheikh Jarrah neighborhood in May 2021, during the ongoing Israeli occupation of the West Bank. The Israeli-Palestinian conflict was brought back into the media spotlight as tensions rose, leading to Palestinian riots and culminating in Hamas rocket attacks and Israeli airstrikes targeting the Gaza Strip.


Inside the report

The report was produced by Business for Social Responsibility (BSR) and was commissioned by Meta itself. It found that both Facebook and Instagram played a role in the conflict by reducing “the rights of Palestinian users to freedom of expression, freedom of assembly, political participation and therefore on the ability of Palestinians to share information and insights about their experiences as they occurred” (BSR: Human Rights Due Diligence of Meta’s Impacts in Israel and Palestine).

More specifically, both over-enforcement (defined as “erroneously removed content and account penalties”) and under-enforcement (“failure to remove violating content and failure to apply penalties to offending accounts”) emerged: the former mostly affected Arabic content, while the latter was mostly linked to Hebrew content. A possible explanation is the lack of a Hebrew “classifier” for “hostile speech”, combined with several existing classifiers targeting terrorist organizations, which made Arabic content more likely to be censored.


Investigation about a biased system

BSR then scrutinized the possibility of a biased content moderation system, considering both intentional and unintentional bias. The latter refers to policies and processes that may appear neutral on the surface but that in practice affect some groups of people and not others. While no evidence of an ethnic, racial or religious motive (i.e., intentional bias) was found, various instances of unintentional bias emerged.

Based on the reviewed material, Arabic classifiers may indeed produce higher error rates for Palestinian Arabic. This, combined with the shortage of Hebrew classifiers and data sets, and with the fact that potentially violating Arabic content may not have been further examined by content reviewers who speak or understand the specific dialect involved, resulted in a less accurate content moderation system for Arabic users than for Hebrew users. Unlike English, Arabic is a language with many dialects, and the shortage of content reviewers with knowledge of each specific dialect is one of the causes of the system’s inaccuracy.


How were Palestinians essentially impacted

Palestinian users were not only prevented from sharing content and insight on what was happening but were also unable to re-share posts. Content searchability and visibility were affected, and two Instagram glitches occurred on May 5 and 6 — glitches that, however, affected the reach of stories globally. As a consequence of the rising tension between the two parties, Jerusalem was marked as a Temporary High-Risk Location, a procedure that allows the system to apply stricter policies in geographical areas where there is a higher risk of violence. The policy was later expanded to all of Israel, the West Bank, and Gaza.

Some concrete episodes of over-enforcement and censorship:

1. The hashtag #AlAqsa was added to a hashtag blocklist, resulting in #AlAqsa being hidden from search results; this was only later acknowledged as an error.

2. Palestinian journalists reported that their WhatsApp accounts had been blocked.


How should Meta address its human rights impact on the matter

The report’s conclusions contain 21 recommendations on how Meta should “address its adverse human rights impacts”. Among the most relevant are building Arabic classifiers that respond to the Arabic dialects of each specific area and establishing a crisis response team. Of the 21 recommendations, Meta committed to implementing 10, including providing “more detail on the specific policy area that was violated” and “hiring more content reviewers with diverse dialect and language capabilities”.


Final considerations

Although Meta was not accused of intentional bias or racial motive, we can state that Meta played a role in reinforcing power asymmetries by extending the real-world occupation of Palestinian territories into a sort of “digital occupation”. In a world where everything passes through the lens of the online sphere, it is easy to see not only how the online world has the means to shape the offline one, but above all how it has the power to arbitrate conflicts by gatekeeping published content.

Finally, we should acknowledge that social media platforms possess political decision-making power. In this case specifically, that power lies in Meta’s capacity to define what, for instance, “hate speech” or “antisemitism” means. Being able to set those definitions on its own enhances the opportunity to silence “inconvenient” voices. And this, according to the report, is exactly what happened to Palestinians in May 2021.

That said, what emerged from this report offers a new perspective on Meta’s policy dynamics: hopefully, this episode will help prevent future human rights violations on social media.
