AP

Rohingya seek reparations from Facebook for role in massacre

Sep 28, 2022, 5:01 AM | Updated: 5:33 PM

FILE - A car passes Facebook's new Meta logo on a sign at the company headquarters on Oct. 28, 2021, in Menlo Park, Calif. For years, Facebook, now called Meta, has pushed a narrative that it was a neutral platform in Myanmar that was misused by bad actors and failed to moderate violent and hateful material adequately. But a new report by Amnesty International says Facebook was not merely a passive site with insufficient content moderation. Rather, Meta’s algorithms “proactively amplified" material that incited violent hatred against the Rohingya beginning as early as 2012. (AP Photo/Tony Avelar, File)

With roosters crowing in the background as he speaks from the crowded refugee camp in Bangladesh that’s been his home since 2017, Maung Sawyeddollah, 21, describes what happened when violent hate speech and disinformation targeting the Rohingya minority in Myanmar began to spread on Facebook.

“We were good with most of the people there. But some very narrow-minded and very nationalist types escalated hate against Rohingya on Facebook,” he said. “And the people who were good, in close communication with Rohingya, changed their mind against Rohingya and it turned to hate.”

For years, Facebook, now called Meta Platforms Inc., pushed the narrative that it was a neutral platform in Myanmar that was misused by malicious people, and that despite its efforts to remove violent and hateful material, it unfortunately fell short. That narrative echoes its response to the role it has played in other conflicts around the world, whether the 2020 election in the U.S. or hate speech in India.

But a new and comprehensive report by Amnesty International states that Facebook’s preferred narrative is false. The platform, Amnesty says, wasn’t merely a passive site with insufficient content moderation. Instead, Meta’s algorithms “proactively amplified and promoted content” on Facebook, which incited violent hatred against the Rohingya beginning as early as 2012.

Despite years of warnings, Amnesty found, the company not only failed to remove violent hate speech and disinformation against the Rohingya, it actively spread and amplified that material in the years leading up to the 2017 massacre. The timing coincided with Facebook's rising popularity in Myanmar, where for many people it served as their only connection to the online world. For much of the country's population, Facebook effectively was the internet.

More than 700,000 Rohingya fled into neighboring Bangladesh that year. Myanmar security forces were accused of mass rapes, killings and torching thousands of homes owned by Rohingya.

“Meta — through its dangerous algorithms and its relentless pursuit of profit — substantially contributed to the serious human rights violations perpetrated against the Rohingya,” the report says.

A spokesperson for Meta declined to answer questions about the Amnesty report. In a statement, the company said it “stands in solidarity with the international community and supports efforts to hold the Tatmadaw accountable for its crimes against the Rohingya people.”

“Our safety and integrity work in Myanmar remains guided by feedback from local civil society organizations and international institutions, including the U.N. Fact-Finding Mission on Myanmar; the Human Rights Impact Assessment we commissioned in 2018; as well as our ongoing human rights risk management,” Rafael Frankel, director of public policy for emerging markets, Meta Asia-Pacific, said in a statement.

Like Sawyeddollah, who is quoted in the Amnesty report and spoke with the AP on Tuesday, most of the people who fled Myanmar — about 80% of the Rohingya living in Myanmar’s western state of Rakhine at the time — are still staying in refugee camps. And they are asking Meta to pay reparations for its role in the violent repression of Rohingya Muslims in Myanmar, which the U.S. declared a genocide earlier this year.

Amnesty’s report, out Wednesday, is based on interviews with Rohingya refugees, former Meta staff, academics, activists and others. It also relied on documents disclosed to Congress last year by whistleblower Frances Haugen, a former Facebook data scientist. It notes that digital rights activists say Meta has improved its civil society engagement and some aspects of its content moderation practices in Myanmar in recent years. In February 2021, after a violent coup overthrew the government, it banned the country’s military from its platform.

But critics, including some of Facebook’s own employees, have long maintained that such an approach will never truly work. It leaves Meta playing whack-a-mole, trying to remove harmful material while its algorithms, designed to push “engaging” content that’s more likely to get people riled up, essentially work against it.

“These algorithms are really dangerous to our human rights. And what happened to the Rohingya and Facebook’s role in that specific conflict risks happening again, in many different contexts across the world,” said Pat de Brún, researcher and adviser on artificial intelligence and human rights at Amnesty.

“The company has shown itself completely unwilling or incapable of resolving the root causes of its human rights impact.”

After the U.N.’s Independent International Fact-Finding Mission on Myanmar highlighted the “significant” role Facebook played in the atrocities perpetrated against the Rohingya, Meta admitted in 2018 that “we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence.”

In the following years, the company “touted certain improvements in its community engagement and content moderation practices in Myanmar,” Amnesty said, adding that its report “finds that these measures have proven wholly inadequate.”

In 2020, for instance, three years after the violence in Myanmar killed thousands of Rohingya Muslims and displaced 700,000 more, Facebook investigated how a video by a leading anti-Rohingya hate figure, U Wirathu, was circulating on its site.

The probe revealed that over 70% of the video’s views came from “chaining” — that is, it was suggested to people who played a different video, showing what’s “up next.” Facebook users were not seeking out or searching for the video, but had it fed to them by the platform’s algorithms.

Wirathu had been banned from Facebook since 2018.

“Even a well-resourced approach to content moderation, in isolation, would likely not have sufficed to prevent and mitigate these algorithmic harms. This is because content moderation fails to address the root cause of Meta’s algorithmic amplification of harmful content,” Amnesty’s report says.

The Rohingya refugees are seeking unspecified reparations from the Menlo Park, California-based social media giant for its role in perpetuating genocide. Meta, which is the subject of twin lawsuits in the U.S. and the U.K. seeking $150 billion for Rohingya refugees, has so far refused.

“We believe that the genocide against Rohingya was possible only because of Facebook,” Sawyeddollah said. “They communicated with each other to spread hate, they organized campaigns through Facebook. But Facebook was silent.”

Copyright © The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
