Minha Khan

Cannon Editor


Trigger Warning: Strong references to and mention of rape, murder, arson 

Facebook. What comes to your mind when you hear that 8-letter word? 

The social media platform, worth some 700 billion dollars, is ubiquitous worldwide, and now, so is the hatred on it. With over 2.7 billion active users (roughly one-third of the global population), Facebook's global influence is undeniable. The platform hosts memes, college groups, and our entire social networks, but it also harbors misinformation, propaganda, and hostility. The coexistence of good and bad is like that of any other society; the difference, though, is that the leader of this virtual nation, Mark Zuckerberg, acts independently of any governing legislature and holds the wealth of the fifth-richest person in the world. 

Does this matter? Is Facebook actually causing any real human damage through "fake news"? Zuckerberg is only a businessman; does he really need to be treated like an authority figure? A simple look at the case of the Rohingya, a people known as the "world's most persecuted minority," makes those answers clear: yes, yes, and yes. 

The Rohingya are a primarily Muslim ethnic group from Buddhist-majority Myanmar. Most of them live in the western state of Rakhine, with ancestry going back generations in the Southeast Asian country. They first arrived in Myanmar in the 1430s and lived under what was then the Kingdom of Arakan, which was taken over by the Burmese Empire in 1784. Today, the 600,000 Rohingya who remain in Myanmar live in a state of endangerment, left to the persecution and abuse of the country's military forces, while 900,000 live as stateless refugees in Bangladesh. The homes and villages of the Rohingya have been incinerated, and they have been raped, tortured, and murdered by security forces. 

"'The pages of my notebook are stained with my own tears,' says Peter Bouckaert, Human Rights Watch's hard-boiled emergencies director [regarding the crisis]. A veteran of the Balkans and Iraq, he had just finished interviewing a Tula Toli survivor who says her six children were murdered in front of her before she was gang-raped and left for dead in a burning home. 'We're not talking about an ordinary war,' he says. 'These are unarmed villagers who are being attacked by an army that is murdering them.' For a moment, he chokes up. 'We are faced with an entire people being forced out of Burma.'"

— Rolling Stone

The Rohingya have been facing systematic oppression, military violence, and restrictions on their rights in Myanmar since the 1970s, long before Facebook's conception. The narrative runs back to 1948, when Myanmar gained independence from the British and the Muslim workers who had migrated to Myanmar from India and Bangladesh as internal travellers under British rule began to be viewed as illegal entrants. Despite a centuries-old Muslim civilization in Myanmar, this framing of the Rohingya as infiltrators would underpin all future anti-Rohingya sentiment among the Buddhist public and government. The historical transformation of the Rohingya from protected citizens to victims of genocide spanned decades. But in a nation with decades of festering hatred and enmity toward a minority group, a digital gathering place with unprecedented audience reach and zero regulation can catalyze the catastrophe of a people faster than ever. And that is exactly what happened on Facebook to the Rohingya. 

To understand Facebook's role in the Rohingya crisis, it's important to know about the recent waves of violence against the Rohingya, in particular a defining episode beginning in August 2017. On August 25, 2017, members of the Arakan Rohingya Salvation Army (ARSA) launched an attack on police posts in protest of military treatment. The casualties were 12 officers and 80 ARSA militants. The military's response? "Clearance operations," aided by "Buddhist mobs," carried out through September with no distinction drawn between ARSA militants and noncombatants. The operations killed 6,700 Rohingya, 11% of them children under five, incinerated villages, and saw women and girls gang-raped. There is evidence the abuse continued after September, despite government claims to the contrary; other evidence suggests the operation was planned in advance. The crackdown forced 700,000 Rohingya to flee to refugee camps, risking their lives in the process. 

The conflict in Myanmar has been ongoing, but the turn of events in 2017 was unlike any before. How does a country reach such a pressurized point, and why is the majority of Myanmar's Buddhist population indifferent to the cries of the Rohingya? 

Experts point to Facebook as the field where the seeds of genocide were sown, years of posts accumulating before the public began to seethe. Instead of connecting the community, the platform was, and still is, an outlet for hate speech against the Rohingya. An inquiry by Reuters uncovered over 1,000 pieces of targeted, "extremely violent and graphic" Burmese-language content against the Rohingya as of August 2018. Some of it had been posted as far back as 2013. 

Particularly active on Facebook was Ma Ba Tha, a nationalist Buddhist organization whose Facebook group counted 55,000 members. On August 24 and 25, 2017, as the ARSA attack and the ensuing wide-scale military campaign forced hundreds of thousands to flee, the group was flooded with double its usual volume of posts. Misinformation and fake news began to spread, such as claims that mosques were concealing arms and scheming to use them to bomb Buddhist pagodas, including the Shwedagon Pagoda, a highly sacred Buddhist temple. Slurs such as "kalars" and "Bengali terrorists" were used, and photos of "Muslim-free" signposts received over 11,000 shares. The Reuters team found posts comparing the Rohingya to dogs and pigs. One of the journalists on the team, Steve Stecklow, described the impact astutely: "This is a way of dehumanising a group. Then when things like genocide happen, potentially there may not be a public uproar or outcry as people don't even view these people as people." 

And that’s just half of the story. 

Military personnel, posing as civilians, created fan accounts and pages for Burmese celebrities and famed stars on Facebook. They worked to build large followings and also took over a well-known blog called Opposite Eyes. Through these channels, the military funneled fake stories and made stirring posts and comments meant to rouse people against the Rohingya. Posts tried to paint the Rohingya as terrorists through accusations of Rohingya-led slaughter; one post offered supposed photo evidence of Rohingya persecution of native Rakhines in the 1940s, when the photos were in fact from the Bangladesh-Pakistan conflict of the 1970s. Around the anniversary of 9/11 in 2017, the military used its sham news pages to warn Buddhists through Facebook Messenger of upcoming "jihad attacks," while telling Muslims that Buddhists were planning to protest against them. This covert campaign ran for five years with a dedicated team of 700 people. Tools of psychological warfare, and tactics personnel had learned while studying in Russia, were deftly applied. Experts say the military's intent was to create an environment of fear and terror so as to warrant the need for a saviour: the military. Others say the goal was "ethnic cleansing."

We know that hate speech and misinformation on the platform contributed to what happened to the Rohingya. Yanghee Lee, the UN Special Rapporteur on human rights in Myanmar, said, "We know that the ultra-nationalist Buddhists have their own Facebooks and are really inciting a lot of violence and a lot of hatred against the Rohingya or other ethnic minorities." Human rights organizations claim the military's operation on Facebook led to murder and rape and forced the Rohingya to flee. The correlation between the hate on Facebook and the violence becomes starker still when the story is connected to how Facebook took hold in Myanmar. 

Back in 2013, Myanmar's telecommunications market was opened up and the price of a SIM card dropped from $200 to $2. People could now buy phones and get internet access with unprecedented ease. The impact shows in Myanmar's internet penetration rates: from 0.3% in 2010 to 30% in 2017. Most users flocked to Facebook because it was one of the rare platforms compatible with the Burmese language, which not even Google supported. Today, many civilians in Myanmar treat Facebook as their only internet and digital news source; in 2018, 18 million of Myanmar's 54 million people were on Facebook. This is troubling because the rapid spread of internet use left little time to build cultural awareness of how to use it, leaving many unfamiliar with concepts like fake news. 

Facebook's failure to remove anti-Rohingya hate speech came down to a number of problems. Software issues meant that a large number of users could not read Facebook's guidelines on reporting content. A lack of context made slurs harder to identify and ban. But the main reason was a gaping shortage of Burmese-speaking content moderators: the company had a total of one in 2014, and four in 2015. 

Facebook has responded with a number of reparative actions since 2017. It created an internal team focused on flagging and removing hate speech faster and more effectively. In November 2018, the company publicly acknowledged its role: "We weren't doing enough to help prevent our platform from being used to foment division and incite offline violence." In December 2018, it took down 425 Pages, 17 Groups, and 135 accounts linked to the military, and promised to add more Burmese content moderators, having increased the total to 60 in 2018. 

Facebook had been alerted to what was happening on the platform well before it took action: by an Australian documentarian in 2013, by a doctoral student in 2014, and again in 2015, when an entrepreneur visited headquarters and delivered a talk on how Facebook was stirring trouble in Myanmar. A multi-billion-dollar company was given many opportunities to act early and preventatively before the crisis of 2017. 

Now, the West African nation of Gambia is asking Facebook to provide access to its stored data, including military profiles and messages, which would be instrumental in formally charging Myanmar with genocide at the International Court of Justice. Facebook declined the request, which was filed in U.S. federal court, claiming it breaches the Stored Communications Act (SCA), a law that regulates the disclosure of stored data by communications companies. The grounds on which Facebook is arguing are insubstantial; the SCA is a means of ensuring civilians' data privacy, not Facebook's. Moreover, the SCA need not apply to publicly posted content. The case is ongoing. 

Facebook has a chance to act according to its own guidelines and publicly touted values. How it responds to Gambia will be a determining moment in how Facebook and other social media companies are seen by the public and by history. If Facebook continues to be uncooperative, perhaps it is a sign that a new precedent should be set: one where social media companies are treated as public entities with global impact, subject to ongoing examination and legal consequences, such as fines for inaction that leads to genocide. Measures could be brought in such as mandatory digital literacy instruction for all new Facebook users; ongoing regional-level communication and collaboration with civilians, minorities, and rights groups to surface problems on the platform and tackle them as they arise; and a requirement that Facebook invest in content moderators as soon as it predicts growth in app usage in a region. It may be time to force Facebook to spend fewer resources on algorithmic ad targeting and more on algorithmic hate speech detection. Of course, Facebook can pursue all these initiatives on its own, just as it can choose to assist Gambia in its case against Myanmar. As Facebook appears in the courtroom more and more, it may find itself without a choice in the future if it chooses wrong now. 

Breaking News Update: On February 1, 2021, the military staged a coup and seized power over the nation. De facto leader Aung San Suu Kyi and other government ministers were detained. The UN fears that the condition of the Rohingya will worsen following this turn of events. 


How you can help the Rohingya:

Make a donation or ask family & friends to donate:

UNHCR – https://give.unhcr.ca/page/52681/donate/1?ea.tracking.id=OL21_Rohingya_HQ&utm_source=unhcr.org&utm_medium=referral&utm_campaign=CA_PS_EN_bangladeshrohingya or Google “UNHCR Rohingya donate”

Choose a specific organization/effort to donate to – https://www.nytimes.com/2017/09/29/world/asia/rohingya-aid-myanmar-bangladesh.html or Google “New York Times Rohingya aid”

Spread the word by sharing this quick, easy-to-read article with your social networks: https://www.bbc.com/news/world-asia-41566561#:~:text=In%20August%202017%2C%20a%20deadly,textbook%20example%20of%20ethnic%20cleansing%22 or Google “BBC Rohingya what you need to know”
