LONDON – In Sri Lanka and Myanmar, Facebook kept up posts that it had been warned contributed to violence. In India, activists have urged the company to combat posts by politicians targeting Muslims. In Ethiopia, groups pleaded with the social network to block hate speech after hundreds were killed in ethnic violence inflamed by social media.
“The offline problems that rocked the country are fully visible online,” wrote activists, civil society groups and journalists in Ethiopia in an open letter last year.
For years, Facebook and Twitter have largely rebuffed calls to remove hate speech and other inflammatory comments made by public figures and government officials, which civil society groups and activists said risked inciting violence. The companies stuck to policies, shaped by American ideals of free speech, that give such figures more leeway to use their platforms to communicate.
But last week, Facebook and Twitter cut off President Trump from their platforms for inciting a crowd that attacked the U.S. Capitol. The decisions have galvanized human rights groups and activists, who are now calling on the companies to apply their policies evenly, especially in smaller countries where the platforms dominate communication.
“When I saw what the platforms were doing to Trump, I thought, ‘You should have done this before, and you should have done this consistently in other countries around the world,’” said Javier Pallero, policy director at Access Now, a human rights advocacy group involved in the letter from Ethiopia. “Around the world, we are at the mercy of whether they choose to act.”
“Sometimes they act very late,” he added, “and sometimes they don’t act at all.”
David Kaye, a law professor and former United Nations special rapporteur on freedom of expression, said political figures in India, the Philippines, Brazil and elsewhere deserved scrutiny of their online behavior. But he said the actions against Mr. Trump raise difficult questions about how the power of American internet companies is wielded, and whether their actions set a precedent for more aggressive policing of speech around the world.
“The question for the future is whether this is a new kind of standard they want to adopt for leaders around the world, and whether they have the resources to do so,” Mr. Kaye said. “There will be a real surge in demand to do this elsewhere in the world.”
Facebook, which also owns Instagram and WhatsApp, is the world’s largest social network, with more than 2.7 billion monthly users. More than 90 percent of them live outside the United States. The company declined to comment, but said the actions against Mr. Trump were based on his violation of existing rules and did not constitute a new global policy.
“Our guidelines apply to everyone,” said Sheryl Sandberg, Facebook’s chief operating officer, in a recent interview with Reuters. “The policy is that you cannot incite violence, you cannot be part of the incitement to violence.”
Twitter, which has around 190 million daily users worldwide, said its rules for world leaders were not new, and that when it reviews posts that could lead to violence, the context of events is crucial.
“Offline harm as a result of online speech is demonstrably real, and it is what drives our policy and enforcement above all,” Jack Dorsey, Twitter’s chief executive, said in a post on Wednesday. But he said the decision “sets a precedent I feel is dangerous: the power an individual or corporation has over a part of the global public conversation.”
There are signs that Facebook and Twitter have begun acting more assertively. After the attack on the Capitol, Twitter updated its policy to permanently suspend the accounts of repeat violators of its rules on political content. Facebook took action against a number of accounts outside the United States, including deleting the account of a state-owned media company in Iran and shutting down government accounts in Uganda, where violence erupted ahead of elections. Facebook said the shutdowns were unrelated to the Trump decision.
Many activists have criticized Facebook for applying its rules unevenly despite its global influence. They said the company often lacks the cultural understanding in many countries to determine when posts could lead to violence. Too often, they said, Facebook and other social media companies fail to act even when they receive warnings.
In Slovakia in 2019, Facebook did not take down posts by a member of parliament who had been convicted by a court and stripped of his seat for incitement and racist remarks. In Cambodia, Human Rights Watch said the company was slow to respond after government officials took part in a social media campaign to smear a prominent Buddhist monk who campaigned for human rights. In the Philippines, President Rodrigo Duterte used Facebook to attack journalists and other critics.
After a wave of ethnic violence, Ethiopian activists said Facebook was being used to incite attacks and promote discrimination.
“The truth is, despite good intentions, these companies do not guarantee uniform application or enforcement of their rules,” said Agustina Del Campo, director of the Center for Freedom of Expression Studies at the University of Palermo in Buenos Aires. “And often they lack context and understanding when they try.”
In many countries, there is a perception that Facebook bases its actions on its business interests rather than on human rights. In India, Facebook’s largest market by users, the company has been accused of declining to police anti-Muslim content from political figures for fear of angering the government of Prime Minister Narendra Modi and his ruling party.
“The developments in our countries are not being seriously addressed,” said Mishi Choudhary, a technology lawyer and founder of the Software Freedom Law Center, a digital rights group in India. “Any deletion of content raises the question of freedom of expression, but inciting violence or using a platform for dangerous speech is not a question of free speech, but a question of democracy, law and order.”
But even as many activists urged Facebook and Twitter to do more to protect human rights, they expressed unease about the power the companies hold to police speech and shape public opinion.
Some also warned that the actions against Mr. Trump would provoke a backlash, with political leaders in some countries taking steps to prevent social media companies from censoring speech.
Government officials in France and Germany raised alarms over the ban on Mr. Trump’s accounts, questioning whether private corporations should be able to unilaterally silence a democratically elected leader. A draft law under consideration in the 27-nation European Union would set new rules for content moderation by the largest social networks.
Barbora Bukovská, senior director for law and policy at Article 19, a freedom of expression group, said the risks are particularly high in countries whose leaders have historically used social media to stoke divisions. She said the events in Washington had spurred a bill in Poland, backed by the ruling right-wing nationalist party, that would penalize social media companies for removing content that is not explicitly illegal, a change that could leave attacks on L.G.B.T.Q. people online.
“These decisions about Trump were the right decisions, but there are broader questions that go beyond Trump,” Ms. Bukovská said.