The Wall Street Journal recently published an article exposing the ties between Facebook India and Indian Prime Minister Narendra Modi’s Bharatiya Janata Party (BJP).
According to the report, Ankhi Das, the public policy director of Facebook’s India, South and Central Asia division, “opposed applying hate-speech rules” to at least four individuals and groups linked with the BJP even though they had been “flagged internally for promoting or participating in violence” by staff.
Das protected BJP leaders like state legislator T. Raja Singh, who had used his Facebook page to say that Muslims who kill cows should be slaughtered, threatened to raze mosques, and posted a photo of himself with a sword alongside inflammatory anti-Muslim text, according to WSJ.
Singh remains active on Instagram and Facebook.
Ankhi Das had instructed staff members that “punishing violations” by BJP politicians “would damage the company’s business prospects in the country, Facebook’s biggest global market by number of users,” WSJ reported.
Facebook’s pandering to right-wing publishers and pages is not limited to South Asia. Executives including Joel Kaplan, Facebook’s vice president of global public policy and Das’s boss, had actively helped conservative pages including Breitbart News and PragerU skirt content moderation rules in the United States as well, BuzzFeed News recently reported.
It is unlikely that Facebook’s top management is unaware of the impact of content on the site. CEO Mark Zuckerberg reportedly cited BJP leader Kapil Mishra’s threat of vigilante action against anti-CAA protesters as an example of content that violated the platform’s rules. Yet the WSJ report names Mishra as one of the politicians Ankhi Das shielded from Facebook’s hate-speech community standards.
Although Facebook eventually took down Mishra’s video, the damage was already done. The video had gone viral, and within hours communal riots erupted in New Delhi, killing 53 people, mostly Muslims. According to documents filed by the police and reports in Indian newspapers, these riots were largely organized through Facebook-owned WhatsApp, the WSJ reports.
India is WhatsApp’s largest market, with 400 million users. The app has become an even greater conduit for misinformation than its parent platform. In 2018, more than two dozen people were lynched over false rumors spread through WhatsApp. A 2017 Lokniti-CSDS Mood of the Nation (MOTN) survey found that one-sixth of WhatsApp users were members of a group started by a political leader or party.
Earlier this month, three people died in Bangalore after the nephew of Congress MLA Akhanda Srinivasamurthy published a Facebook post offensive to Muslims, sparking clashes between a mob and the police.
This is not the first time Facebook and its lax content practices have played a role in inciting violence in South Asia. In the last two years, the social media giant has had to apologize for its role in violence against minorities in at least two other countries – Myanmar and Sri Lanka.
In 2018, UN human rights investigators stated that Facebook had played a key role in spreading hate speech that fueled violence in Myanmar. The report found that tens of thousands of Rohingya Muslims were raped and killed, villages were burned to the ground, and over 750,000 Rohingya refugees fled to neighboring Bangladesh after a crackdown by Myanmar’s security forces.
The United Nations called the ensuing violence “a textbook example of ethnic cleansing.” More than 1 million Rohingyas are still living in camps in Bangladesh.
A few months after the UN report, Facebook admitted that it had been used to “foment division and incite offline violence” in Myanmar. In its assessment, the social media Goliath added that it had not done enough to prevent the violence and that it was bolstering its efforts to stop such instances.
However, last week, Facebook objected to a request from Gambia to release posts and communications by members of Myanmar’s military and police. Gambia is leading a case at the International Court of Justice accusing Myanmar of genocide against the Rohingya.
Facebook’s Das also met the recently elected Sri Lankan Prime Minister Mahinda Rajapaksa in early 2019 to discuss the “increasing circulation of fake news.”
Ahead of the November 2019 presidential elections in Sri Lanka, The Guardian reported that the official Facebook page of Gotabaya Rajapaksa, now president of Sri Lanka and brother of Mahinda, promoted a post featuring misinformation that “Muslim extremists” had razed a Sri Lankan heritage site. AFP Sri Lanka confirmed with the temple’s chief monk that there had been no such attack.
The post remained online despite criticism.
Earlier this year, Facebook issued an apology, this time for its role in the 2018 anti-Muslim violence in Kandy, Sri Lanka. A report by Article One found that the “proliferation of hate speech (e.g., ‘Kill all Muslims, don’t even save an infant; they are dogs’) and misinformation (e.g., that a Muslim restaurateur was adding sterilization pills to his customers’ food)” on Facebook may have contributed to the unrest.
“We deplore this misuse of our platform,” Facebook said in May 2020 in response to the report. “We recognize, and apologize for, the very real human rights impacts that resulted.”
Facebook long ago stopped being a bystander in South Asian politics. After years of a damaging legacy in the region, all eyes are once again on the American social media behemoth to see whether, and how, it will be held accountable for its role in spreading fake news and misinformation. Facebook needs to take concrete action to counter propaganda and incitement to violence on its platforms, in South Asia and globally.