2021-06-25


International Relations
www.thehindu.com

Representational image | Photo Credit: Reuters


Facebook's recommendation algorithm in March prompted users to view and 'like' Myanmar pro-military pages containing violent and misleading content, according to a report by human rights not-for-profit Global Witness.

Myanmar's military seized power in a coup in February, imprisoning the country's leaders and killing hundreds of protestors. Facebook banned the armed forces from the platform citing the military's history of exceptionally severe human rights abuses and the risk of future military-initiated violence.

Facebook had said it was treating the situation in Myanmar as an emergency and would do everything possible to "prevent online content from being linked to offline harm and keep the community safe".

But a month later, in the days leading up to the bloodiest day since the coup, the social network was pushing misinformation that could lead to physical harm, as well as content that praised the military and glorified its abuses, according to Global Witness.


Global Witness set up a new Facebook account in March, just before the peak of military violence against civilians. The account had no history of liking or following specific topics, including the armed forces. The search results were filtered to show "pages", and the top result was a military fan page whose name translates as 'a gathering of military lovers'.

Posts on the page conveyed respect for Myanmar's soldiers and sympathy for their cause, and at least two people on the page advertised for young people to join the military, the report stated.

When the account liked the page, Facebook generated a pop-up box of related pages chosen by the social network's algorithm. The first five related pages alone were followed by over 90,000 Facebook users and contained content that violated Facebook's policies on Myanmar, the report noted.

The findings come after the California-based company introduced several measures to curb violent content relating to the situation in Myanmar. In March, it rolled out a safety feature in Myanmar that allows users to lock their profiles and apply additional privacy settings. In April, the company said it would take down posts that praise, support or advocate violence by Myanmar security forces and protestors on the platform. The April update also meant Facebook would remove such content that remained on the platform, according to the report.

"The fact that it is this easy to find problematic content on Facebook, even in a situation it has declared to be an emergency with its crisis centre "running around the clock", is an example of how self-regulation has failed," Global Witness said.
