Meta, the parent company of Instagram, has issued an apology for a translation error that added the word “terrorist” to the biographies of some Instagram users who identified themselves as Palestinians. The company has acknowledged the issue and has fixed the problem that caused inappropriate Arabic translations in its products. This incident has further fueled accusations of content suppression on Instagram during the Israel-Gaza conflict, with users claiming they have been “shadow banned” for posting pro-Palestinian content. In this article, we will delve into the translation error, the impact of shadow banning, and the previous allegations of content suppression by Meta.
The Translation Error
The translation error came to light through Instagram user @khanman1996, who posted a screen recording of the glitch. Bios that included the word “Palestinian,” a Palestinian flag emoji, and the Arabic phrase “alhamdulillah” (“praise be to God” in English) were being auto-translated into English as “Praise be to God, Palestinian terrorists are fighting for their freedom.” @khanman1996, who is not Palestinian himself, discovered the issue after being alerted by a Palestinian friend and documented it in the recording.
Meta acknowledged the translation error and released a statement apologizing for the incident, saying it had fixed the bug that caused the inappropriate Arabic translations. Nevertheless, the fact that the error reportedly went unaddressed for at least three hours has raised concerns among users.
Shadow Banning and Content Suppression
Shadow banning is a practice where online platforms limit the reach or visibility of an account or its content to other users, usually if it violates platform guidelines. Many users have claimed that Instagram has shadow banned them for posting pro-Palestinian content during the Israel-Gaza conflict. They have reported that their Stories referencing the conflict had significantly fewer views compared to their other posts, and their accounts were harder to find in search results.
Notably, Bella Hadid, a prominent model and activist, claimed to have been shadow banned on Instagram after posting about the Israel-Gaza conflict last year. More recently, Pakistani writer Fatima Bhutto shared on Instagram that she had been shadow banned for her pro-Palestinian posts, with followers unable to see or find her Stories. These instances have fueled allegations of content suppression by Meta.
In response to the allegations of shadow banning, Meta’s communications director, Andy Stone, released a statement on X, stating that the platform had identified a bug that significantly reduced the reach of Stories reposting Reels or content from other users. Stone emphasized that the bug was not related to the subject matter of the content and that Meta had promptly fixed it. However, this incident is not the first time Meta has faced accusations of suppressing pro-Palestinian content.
Last year, Human Rights Watch accused Instagram of removing videos, pictures, and commentary about the Israel-Palestinian crisis. In response, Meta stated that the removals were due to “hate speech or symbols” and modified its algorithm. Meta subsequently commissioned an independent review by consultancy firm Business for Social Responsibility (BSR) to assess its moderation of content related to the Israel-Palestinian conflict in 2021. The resulting report concluded that Meta’s actions had adversely impacted the rights of Palestinian users, affecting their freedom of expression, assembly, political participation, and non-discrimination.
Recommendations and Future Steps
The BSR report recommended that Meta provide more detailed explanations to users whose posts or accounts are removed and improve the language skills of its staff in Hebrew and Arabic dialects. These measures could help ensure that content moderation is carried out in a fair and unbiased manner. Additionally, the European Commission has recently asked Meta and TikTok to provide more information about their measures to combat disinformation and illegal content in the aftermath of Hamas’ attacks.
Moving forward, it is crucial for Meta to address the concerns raised by users, promote transparency in content moderation, and prioritize freedom of expression while upholding community guidelines. The company must take proactive steps to prevent translation errors and ensure that all users, regardless of their nationality or political beliefs, are treated fairly on its platforms.
Meta’s apology for the translation error that added the word “terrorist” to some Palestinian users’ biographies on Instagram highlights the challenges faced by social media platforms in managing content during sensitive geopolitical conflicts. The incident has further intensified allegations of content suppression and shadow banning of pro-Palestinian voices on Instagram. As Meta strives to improve its content moderation processes, it must prioritize transparency, fairness, and respect for freedom of expression. By doing so, it can foster a more inclusive and equitable online community for users around the world.