{"id":807377,"date":"2022-09-21T22:45:05","date_gmt":"2022-09-21T22:45:05","guid":{"rendered":"https:\/\/theintercept.com\/?p=408444"},"modified":"2022-09-21T22:45:05","modified_gmt":"2022-09-21T22:45:05","slug":"facebook-report-concludes-company-censorship-violated-palestinian-human-rights","status":"publish","type":"post","link":"https:\/\/radiofree.asia\/2022\/09\/21\/facebook-report-concludes-company-censorship-violated-palestinian-human-rights\/","title":{"rendered":"Facebook Report Concludes Company Censorship Violated Palestinian Human Rights"},"content":{"rendered":"

Facebook and Instagram’s speech policies harmed fundamental human rights of Palestinian users during a conflagration that saw heavy Israeli attacks on the Gaza Strip last May, according to a study commissioned by the social media sites’ parent company Meta.

“Meta’s actions in May 2021 appear to have had an adverse human rights impact … on the rights of Palestinian users to freedom of expression, freedom of assembly, political participation, and non-discrimination, and therefore on the ability of Palestinians to share information and insights about their experiences as they occurred,” says the long-awaited report, which was obtained by The Intercept in advance of its publication.

Commissioned by Meta last year and conducted by the independent consultancy Business for Social Responsibility, or BSR, the report focuses on the company’s censorship practices and allegations of bias during bouts of violence against Palestinian people by Israeli forces last spring.


Following protests over the forcible eviction of Palestinian families from the Sheikh Jarrah neighborhood in occupied East Jerusalem, Israeli police cracked down on protesters in Israel and the West Bank, and the Israeli military launched air strikes against Gaza that injured thousands of Palestinians and killed 256, including 66 children, according to the United Nations. Many Palestinians attempting to document and protest the violence using Facebook and Instagram found that their posts spontaneously disappeared without recourse, a phenomenon the BSR inquiry attempts to explain.

Last month, over a dozen civil society and human rights groups wrote an open letter protesting Meta’s delay in releasing the report, which the company had originally pledged to release in the “first quarter” of the year.

While BSR credits Meta for taking steps to improve its policies, it also faults “a lack of oversight at Meta that allowed content policy errors with significant consequences to occur.”


Though BSR is clear in stating that Meta harms Palestinian rights with the censorship apparatus it alone has constructed, the report absolves Meta of “intentional bias.” Rather, BSR points to what it calls “unintentional bias,” instances “where Meta policy and practice, combined with broader external dynamics, does lead to different human rights impacts on Palestinian and Arabic speaking users” — a nod to the fact that these systemic flaws are by no means limited to the events of May 2021.

Meta responded to the BSR report in a document to be circulated along with the findings. (Meta did not respond to The Intercept’s request for comment about the report by publication time.) In a footnote in the response, which was also obtained by The Intercept, the company wrote, “Meta’s publication of this response should not be construed as an admission, agreement with, or acceptance of any of the findings, conclusions, opinions or viewpoints identified by BSR, nor should the implementation of any suggested reforms be taken as admission of wrongdoing.”

According to the findings of BSR’s report, Meta deleted Arabic content relating to the violence at a far greater rate than Hebrew-language posts, confirming long-running complaints of disparate speech enforcement in the Palestinian-Israeli conflict. The disparity, the report found, appeared in posts reviewed both by human employees and by automated software.

“The data reviewed indicated that Arabic content had greater over-enforcement (e.g., erroneously removing Palestinian voice) on a per user basis,” the report says. “Data reviewed by BSR also showed that proactive detection rates of potentially violating Arabic content were significantly higher than proactive detection rates of potentially violating Hebrew content.”
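The report quotes these measures without spelling out how they are computed. A minimal sketch, using invented counts rather than BSR’s or Meta’s data, shows how a per-language proactive detection rate and a per-user over-enforcement rate of the kind described above could be tallied:

```python
# Hypothetical illustration only: the counts below are invented, not BSR's data.
from dataclasses import dataclass

@dataclass
class LanguageStats:
    removals: int             # posts taken down for alleged policy violations
    proactive_removals: int   # removals flagged by automation before any user report
    wrongful_removals: int    # removals later judged erroneous (over-enforcement)
    active_users: int         # users posting in this language during the period

def proactive_detection_rate(s: LanguageStats) -> float:
    """Share of removals that automated systems flagged on their own."""
    return s.proactive_removals / s.removals

def over_enforcement_per_user(s: LanguageStats) -> float:
    """Erroneous removals per active user, the 'per user basis' the report cites."""
    return s.wrongful_removals / s.active_users

arabic = LanguageStats(removals=10_000, proactive_removals=9_000,
                       wrongful_removals=1_200, active_users=500_000)
hebrew = LanguageStats(removals=1_000, proactive_removals=200,
                       wrongful_removals=40, active_users=300_000)

for name, stats in [("Arabic", arabic), ("Hebrew", hebrew)]:
    print(f"{name}: proactive detection {proactive_detection_rate(stats):.0%}, "
          f"wrongful removals per 1,000 users {1_000 * over_enforcement_per_user(stats):.1f}")
```

With figures like these, the disparity BSR describes would show up as a higher proactive detection rate and more wrongful removals per user on the Arabic side.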

BSR attributed the vastly differing treatment of Palestinian and Israeli posts to the same systemic problems that rights groups, whistleblowers, and researchers have all blamed for the company’s past humanitarian failures: a dismal lack of expertise. Meta, a company with over $24 billion in cash reserves, lacks staff who understand other cultures, languages, and histories, and it uses faulty algorithmic technology to govern speech around the world, the BSR report concluded.

Palestinian users face an algorithmic screening that Israeli users do not — an “Arabic hostile speech classifier” that uses machine learning to flag potential policy violations and has no Hebrew equivalent — and the report notes that the Arabic system doesn’t work well either: “Arabic classifiers are likely less accurate for Palestinian Arabic than other dialects, both because the dialect is less common, and because the training data — which is based on the assessments of human reviewers — likely reproduces the errors of human reviewers due to lack of linguistic and cultural competence.”
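The failure mode BSR describes, in which a classifier trained on reviewer-supplied labels inherits the reviewers’ misreadings of Palestinian Arabic, can be illustrated with a toy simulation. Everything below is assumed for the sake of illustration: the dialect categories, error rates, and post counts are invented and bear no relation to Meta’s systems.

```python
# Hypothetical simulation, not Meta's data: it illustrates how a classifier trained on
# reviewer-supplied labels inherits the reviewers' dialect-specific false positives.
import random

random.seed(0)

# Assumed reviewer false-positive rates: benign posts wrongly labeled "hostile".
REVIEWER_FPR = {"palestinian_arabic": 0.20, "other_arabic_dialects": 0.05}
DIALECT_SHARE = {"palestinian_arabic": 0.15, "other_arabic_dialects": 0.85}

def simulate_training_labels(n_posts: int) -> list[tuple[str, bool, bool]]:
    """Return (dialect, truly_hostile, reviewer_label) for simulated posts."""
    rows = []
    for _ in range(n_posts):
        dialect = random.choices(list(DIALECT_SHARE), weights=list(DIALECT_SHARE.values()))[0]
        truly_hostile = random.random() < 0.10
        # Reviewers catch genuinely hostile posts but also mislabel some benign ones,
        # more often in the dialect they understand less well.
        label = truly_hostile or (random.random() < REVIEWER_FPR[dialect])
        rows.append((dialect, truly_hostile, label))
    return rows

rows = simulate_training_labels(100_000)

# A classifier that faithfully learns these labels can do no better than the labels
# themselves, so its false-positive rate per dialect mirrors the reviewers' error rate.
for dialect in DIALECT_SHARE:
    benign = [r for r in rows if r[0] == dialect and not r[1]]
    fpr = sum(r[2] for r in benign) / len(benign)
    print(f"{dialect}: false-positive rate baked into training labels = {fpr:.1%}")
```

A model that simply reproduces such labels carries the reviewers’ higher error rate on the less common dialect straight into automated enforcement.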

Human employees appear to have exacerbated the lopsided effects of Meta’s speech-policing algorithms. “Potentially violating Arabic content may not have been routed to content reviewers who speak or understand the specific dialect of the content,” the report says. It also notes that Meta didn’t have enough Arabic- and Hebrew-speaking staff on hand to manage the spike in posts.


These faults had cascading speech-stifling effects, the report continues. “Based on BSR’s review of tickets and input from internal stakeholders, a key over-enforcement issue in May 2021 occurred when users accumulated ‘false’ strikes that impacted visibility and engagement after posts were erroneously removed for violating content policies.” In other words, wrongful censorship begat further wrongful censorship, leaving those affected wondering why no one could see their posts. “The human rights impacts … of these errors were more severe given a context where rights such as freedom of expression, freedom of association, and safety were of heightened significance, especially for activists and journalists,” the report says.
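The strike dynamic the report describes can be sketched briefly; the penalty schedule below is purely hypothetical rather than Meta’s actual enforcement logic, and the example only shows how erroneous removals compound into reduced reach for everything a user posts afterward.

```python
# Hypothetical sketch of strike accumulation; the penalty values are invented.
from dataclasses import dataclass

@dataclass
class Account:
    strikes: int = 0        # all strikes, correct or not
    false_strikes: int = 0  # strikes from removals later judged erroneous

    def register_removal(self, erroneous: bool) -> None:
        # Every removal adds a strike; an erroneous removal still counts against the user.
        self.strikes += 1
        if erroneous:
            self.false_strikes += 1

    @property
    def reach_multiplier(self) -> float:
        # Assumed penalty curve: each strike further reduces how widely new posts surface.
        return max(0.1, 1.0 - 0.25 * self.strikes)

user = Account()
for _ in range(3):                        # three posts wrongly removed in quick succession
    user.register_removal(erroneous=True)

print(f"strikes: {user.strikes} (of which {user.false_strikes} erroneous); "
      f"reach of future posts: {user.reach_multiplier:.0%}")
```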

Beyond Meta’s failures in triaging posts about Sheikh Jarrah, BSR also points to the company’s “Dangerous Individuals and Organizations” policy — referred to as “DOI” in the report — a roster of thousands of people and groups that Meta’s billions of users cannot “praise,” “support,” or “represent.” The full list, obtained and published by The Intercept last year, showed that the policy focuses mostly on Muslim and Middle Eastern entities, which critics described as a recipe for glaring ethnic and religious bias.

Meta claims that it’s legally compelled to censor mention of groups designated by the U.S. government, but legal scholars have disputed the company’s interpretation of federal anti-terrorism laws. Following The Intercept’s report on the list, the Brennan Center for Justice called the company’s claims of legal obligation a “fiction.”


BSR agrees the policy is systemically biased: “Legal designations of terrorist organizations around the world have a disproportionate focus on individuals and organizations that have identified as Muslim, and thus Meta’s DOI policy and the list are more likely to impact Palestinian and Arabic-speaking users, both based upon Meta’s interpretation of legal obligations, and in error.”

Palestinians are particularly vulnerable to the effects of the blacklist, according to the report: “Palestinians are more likely to violate Meta’s DOI policy because of the presence of Hamas as a governing entity in Gaza and political candidates affiliated with designated organizations. DOI violations also come with particularly steep penalties, which means Palestinians are more likely to face steeper consequences for both correct and incorrect enforcement of policy.”

The document concludes with a list of 21 nonbinding policy recommendations, including increasing staffing capacity to properly understand and process Arabic posts, building a Hebrew-language equivalent of the hostile speech classifier, increasing company oversight of outsourced moderators, and both reforming and increasing transparency around the “Dangerous Individuals and Organizations” policy.

In its response to the report, Meta vaguely commits to implementing, or considering implementing, aspects of 20 of the 21 recommendations. The exception is a call to “Fund public research into the optimal relationship between legally required counterterrorism obligations and the policies and practices of social media platforms,” which the company says it will not pursue because it does not wish to provide legal guidance for other companies. Rather, Meta suggests that concerned experts reach out directly to the federal government.


This post was originally published on The Intercept.
