Facebook users in Myanmar report that their posts are being taken down by the popular social media company for containing the word 'kalar'.
Although the etymology of the word is still debated, it is traditionally used to refer to people of East Indian origin or as an adjective meaning "Indian" in general. Facebook is censoring the word 'kalar', or ကုလား in Burmese script, as part of its initiative to tackle widespread hate speech among its Burmese-language users.
In recent years, the rise of radical nationalist movements has given the word an extremely derogatory connotation. In particular, it is most often used by ultra-nationalists as hate speech against Muslims, who constitute a minority population in Myanmar.
But in this effort to combat hate speech in Myanmar, the company has also censored a good deal of legitimate content on its platform.
'Kalar' may be commonly associated with racism today, but the word on its own does not necessarily constitute hate speech. Context matters — many people have reported that posts in which they discussed the use of the term, or expressed concern about its usage, were censored as well.
Moreover, several Burmese words with completely different meanings contain the same string of characters as 'kalar'. For instance, the word for chair is written 'kalar htaing', and other examples include 'kalar pae' (split pea), 'kalar oat' (camel) and 'kalarkaar' (curtain).
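This kind of false positive is what you would expect from a filter that simply looks for a banned string anywhere inside a post. Facebook has not published how its system works, so the following is only a minimal sketch of naive substring-based blocking, using romanized stand-ins for the Burmese words above, to illustrate why unrelated words get caught:

```python
# Hypothetical sketch of naive substring-based keyword filtering.
# This is NOT Facebook's actual system; it only illustrates the failure mode.

BLOCKLIST = ["kalar"]  # assumed single-entry blocklist for illustration

def is_flagged(post: str) -> bool:
    """Flag a post if any blocklisted string appears anywhere inside it."""
    text = post.lower()
    return any(term in text for term in BLOCKLIST)

posts = [
    "Where can I buy a kalar htaing?",  # 'chair' -- benign, but flagged
    "Cooking kalar pae tonight",        # 'split pea' -- benign, but flagged
    "Hanging a new kalarkaar",          # 'curtain' -- benign, but flagged
]

for post in posts:
    print(is_flagged(post), "->", post)

# All three print True: substring matching cannot distinguish the slur
# from unrelated words that merely contain the same characters.
```

A filter like this has no notion of word boundaries or context, which is exactly the gap users in Myanmar say they are running into.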
Facebook user Aung Kaung Myat explained the different meanings of words that sound like 'kalar':
A post made by one of my friends was deleted today because there was the word, kalar, in it. However, he was wittily asking his friends if they know where he can cure his lower back pain by using a pun. The phrase, ku la — 'cure' (ku) + question word (la) in Burmese language — bears striking resemblance verbatim with the racist term but they are wholly different in context.
Reactions to Facebook's initiative have been mixed. Zin Win Htet thinks it will reduce hate speech posts by radical nationalists, but also stressed that Facebook needs to be more analytical before it removes any post suspected of promoting racism:
While the move may send a strong signal to those spouting hate speech, the efficacy of this strategy might not last long. The experiences of social media platforms in China, such as Sina Weibo and WeChat, have proven that keyword censorship often becomes a game of cat-and-mouse, wherein social media users will simply begin using a new word or alternate spelling of the censored word in order to keep expressing their views.
Writer Aung Kaung Myat described how Facebook started to remove his posts when he simply notified his friends that the word 'kalar' is already banned on the social network:
…when I discovered this new policy of Facebook, I made a post telling my Facebook friends the word is banned. Ironically, my post was removed by Facebook and I was banned from liking, posting, and sharing content on Facebook for 24 hours because the post "doesn't follow the Facebook Community Standards".
Facebook has removed numerous posts by people who had no negative intentions or who were simply voicing their opposition to the hate speech used by radical nationalists.
Patrick Murphy wrote that a post of his unrelated to hate speech was taken down:
Non-hate speech post removed by Facebook. In the post, the author was sharing his opinion on why extreme nationalism and religious fundamentalism in the country are bad.
Chan Myae Khine believes Facebook should have done more before launching this initiative, such as consulting the Burmese internet community:
Facebook might have good intention to minimise racism in Burma through their platform but that's not how it works. Censoring such words will just bring more hatred among different communities. Plus, they seemed to initiate that without proper local context nor tech support hence banning words like "chair" and "pea curry". Even when they found out that they made mistakes, they don't attempt to rectify wrongly deleted posts. If only they could respect a bit more to 15 million user base that's generating a great revenue for them, it'd be great.
Facebook's automated censorship has led users to mock its approach. Instead of taking it seriously, users are now making fun of Facebook by deliberately writing the word 'kalar' in non-hate speech contexts to see if their posts will be taken down.
Here are some screenshots of Facebook posts that have been removed despite containing nothing resembling hate speech:
Over the past five years, Myanmar has seen the rise of extreme nationalist ideology and religious fundamentalists who have been using social media to amplify their voices and influence. Hate speech is blamed, among other factors, for stirring communal violence in Myanmar, especially in Rakhine State, where clashes between Muslims and Buddhists have displaced thousands of residents.
Facebook was criticized for its failure to tackle the rampant hate speech occurring on its Burmese pages. But instead of simply deciding to censor the word 'kalar', it should have reviewed and learned from ongoing initiatives to combat online hate speech in Myanmar that focus on context, rather than code.