Facebook admits it didn't do enough to prevent 'offline violence' in Myanmar
The night before the U.S. midterm elections, Facebook has dropped an independent report into the platform's impact in Myanmar.
The report into Facebook's impact on human rights within the country was commissioned by the social media giant, but completed by non-profit organization BSR (Business for Social Responsibility).
And it affirms what many have suspected: Facebook didn't do enough to prevent violence and division in Myanmar.
"The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more," Facebook's product policy manager Alex Warofka wrote in a statement.
For the southeast Asian country's 20 million citizens who are online, Facebook is the internet. The report notes that digital literacy is low, with many people finding it "difficult to verify or differentiate content (for example, real news from misinformation)."
While Facebook has "substantially increased opportunities for freedom of expression" for the country's citizens, it has also been a "useful platform" for people looking to incite violence and cause offline harm.
"A minority of users is seeking to use Facebook as a platform to undermine democracy and incite offline violence, including serious crimes under international law; for example, the Report of the Independent International Fact-Finding Mission on Myanmar describes how Facebook has been used by bad actors to spread anti-Muslim, anti-Rohingya, and anti-activist sentiment," the report states.
In August, Facebook removed pages and groups belonging to military officials who were using the platform to incite violence and ethnic cleansing of Rohingya Muslims.
The report notes that this action could make it harder for the company to establish staff in the country, where it currently has none.
"Facebook’s action against senior military officials in August 2018 also increased the risks associated with locating Facebook staff in Myanmar, at least in the near term, and it is unclear whether Facebook could have acted against the military if Facebook staff had been present in Myanmar," the report reads.
The report's recommendations include that Facebook create a stand-alone human rights policy, improve its enforcement of community standards, especially in relation to credible violence, and preserve and share data that can be used to evaluate human rights violations.
Facebook has already been taking steps on these recommendations, but with Myanmar's elections in 2020, which are set to be a flashpoint for hate speech and harassment, time is running short.
via Mashable https://ift.tt/2DCFv97
November 5, 2018 at 08:02PM