How Facebook can influence a nation


An internal investigation by Facebook last year found that Russian operatives had spent US$100,000 on ads carrying “divisive messages” during the election campaign season in the United States (US). These “divisive messages” are thought to be fake news designed to influence the American electorate. According to an analysis by BuzzFeed, in the run-up to the election the top 20 fake news stories generated over one million more shares and comments on Facebook than the top 20 hard news stories.

How much influence this fake news had on the American electorate is hard to measure, but many observers believe that, among other factors, it played a significant role in swaying voters.

Then in March this year, it was revealed that Donald Trump’s campaign had hired Cambridge Analytica as political consultants. Investigations by The Guardian and The New York Times showed that Cambridge Analytica may have had access to data from over 70 million Facebook users’ accounts in the US. This data allowed the firm to combine social psychology with data analytics, enabling it to micro-target users with tailored content.

Hiring political consultants for election campaigns is nothing new. Even fake news and political advertising have been around for decades; some have argued that fake news goes back as far as the Vietnam War. However, the combination of user data and social media has changed the entire political landscape. Previously, a political advertisement followed a one-size-fits-all model, where everyone saw the same thing. Nowadays, everything is bespoke: content is crafted specifically for you, based on the psychological profile firms have built of you. This is where ethical lines can be blurred.

The insidious use of social media for political ends has also seeped into Southeast Asia. A study by the University of Oxford last year found that US$200,000 was spent hiring paid trolls to spread propaganda for Philippine President Rodrigo Duterte. The study highlighted that Duterte had a team of over 400 cyber troopers posting pro-Duterte comments. There have also been reports of critics of Duterte or his government being targeted and harassed by these cyber troopers.

In Myanmar, Facebook has been put to more violent ends. In 2017, a Facebook post calling for violence against the Rohingya was shared over 13,000 times and drew over 2,000 comments. This is one of many examples of hate speech being shared on the platform. Last week, a group of six Myanmar civil society organisations posted an open letter to Mark Zuckerberg, criticising Facebook’s inadequate response to reports of hate speech on the platform.


While Facebook has a policy of banning hate speech, many messages inciting racial violence remain online, undetected by Facebook’s moderators, revealing a fault in its moderation system. Mark Zuckerberg personally responded to the open letter, apologising and saying that Facebook is working on AI technology to identify abuses. When questioned about Facebook’s role in Myanmar during his US Congress hearing, Zuckerberg pledged a 24-hour target for blocking hate speech.

Governments in the region need to figure out how to tackle fake news without clamping down on freedom of speech. Some, however, have begun implementing harsh laws against free speech under the guise of fighting fake news. In the Philippines, for example, the operating licence of the news website Rappler was revoked after Duterte called it a “fake news outlet”. In Malaysia, the government passed an anti-fake news law that activists have criticised as being too vague in its definition of fake news.

Clearly, other actors are involved in spreading fake news and manipulating data, and Facebook does not bear sole responsibility for it. However, Facebook needs to recognise that it provides a platform for millions, and that information on it can spread like wildfire.