Facebook owner Meta’s use of algorithms to promote user engagement and increase ad revenue contributed to anti-Rohingya sentiment in Myanmar ahead of a brutal military campaign against the ethnic group in 2017, rights group Amnesty International said Thursday.
In a new report, “The Social Atrocity: Meta and the right to remedy for the Rohingya,” Amnesty lays out how Meta failed to prevent Facebook from amplifying the kind of hateful rhetoric that led to communal violence against the ethnic group and a state-sanctioned “clearance operation” in 2017. That campaign forced more than 700,000 Rohingya across the border into Bangladesh, where many continue to languish in refugee camps.
“In 2017, the Rohingya were killed, tortured, raped and displaced in the thousands as part of the Myanmar security forces’ campaign of ethnic cleansing,” Amnesty Secretary General Agnès Callamard said in a statement accompanying the release of the report.
“In the months and years leading up to the atrocities, Facebook’s algorithms were intensifying a storm of hatred against the Rohingya which contributed to real-world violence.”
Callamard said that while the military was committing crimes against humanity against the Rohingya, “Meta was profiting from the echo chamber of hatred created by its hate-spiralling algorithms.
“Meta must be held to account. The company now has a responsibility to provide reparations to all those who suffered the violent consequences of their reckless actions,” she said.
Meta did not immediately respond to requests by RFA Burmese for comment on Amnesty’s findings. Amnesty said that in June, Meta declined to comment when asked to respond to the allegations contained in its report.
Social media role ‘significant’
In its report, Amnesty specifically pointed to actors linked to the military and radical Buddhist nationalist groups who “systematically flooded” the Facebook platform with disinformation warning of an impending Muslim takeover of the country and with content portraying the Rohingya as sub-human invaders.
“The mass dissemination of messages that advocated hatred, inciting violence and discrimination against the Rohingya, as well as other dehumanizing and discriminatory anti-Rohingya content, poured fuel on the fire of long-standing discrimination and substantially increased the risk of an outbreak of mass violence,” Amnesty said in its report.
Following the 2017 violence, the U.N.’s Independent International Fact-Finding Mission on Myanmar called for senior military officials to be investigated and prosecuted for war crimes, crimes against humanity and genocide.
The body found that “[t]he role of social media [was] significant” in the atrocities. Amnesty said its report found that Meta’s contribution “was not merely that of a passive and neutral platform that responded inadequately.” Instead, it said, Meta’s algorithms “proactively amplified and promoted content on the Facebook platform which incited violence, hatred and discrimination” against the Rohingya.
Because Meta’s business model is based on targeted advertising, the more engaged users are, the more ad revenue Meta earns, the report said.
“As a result, these systems prioritize the most inflammatory, divisive and harmful content as this content is more likely to maximize engagement,” it said.
Examples of anti-Rohingya content cited by Amnesty included a Facebook post that branded a human rights defender who had allegedly cooperated with the U.N. fact-finding mission a “national traitor,” consistently attaching the adjective “Muslim” to the label. The post was shared more than 1,000 times and prompted calls for the defender’s death. The U.N. mission called Meta’s response to its attempts to report the post “slow and ineffective.”
Unheeded warnings
Between 2012 and 2017, amid swelling rancor and the growing likelihood of communal violence, local civil society activists repeatedly called on Meta to act, but Amnesty said the company failed to heed their warnings.
Instead, the report said, internal Meta documents leaked by a whistleblower show that the core content-shaping algorithms that power the Facebook platform “all actively amplify and distribute content which incites violence and discrimination, and deliver this content directly to the people most likely to act upon such incitement.”
By failing to engage in appropriate human rights due diligence with respect to its operations in Myanmar ahead of the 2017 atrocities, “Meta substantially contributed to adverse human rights impacts suffered by the Rohingya and has a responsibility to provide survivors with an effective remedy,” Amnesty said.
Amnesty’s report called on Meta to work with survivors and the civil society organizations supporting them to provide an effective remedy to affected Rohingya communities. It also urged the company to undertake a comprehensive review and overhaul of its human rights due diligence to address what it called “the systemic and widespread human rights impacts” of its business model.