
Can Facebook’s new Oversight Board beat fake news in Southeast Asia?

Facebook faced a series of scandals in 2018, including the prevalence of Russian disinformation campaigns on its site, United Nations investigators’ condemnation of the platform’s role in the slaughter of Rohingya Muslims in Myanmar, and bombshell reports about Cambridge Analytica’s misappropriation of tens of millions of Facebook users’ personal data. Soon after, Mark Zuckerberg vowed to create an independent oversight board that, in theory, would hold the social media giant to account whenever there are signs of coordinated disinformation campaigns or moves to use Facebook to legitimize violence. So far, the board has cost the company USD 130 million.

The quasi-independent, 20-member Facebook Oversight Board (FOB) must balance free expression against decisions to remove problematic content on Facebook and Instagram, spanning posts and comments that range from nudity to hate speech. (Notably, direct messages on Instagram, Facebook Messenger, and WhatsApp will not fall under review.) Users who disagree with the platform’s content moderation decisions can submit an appeal to FOB, which has been dubbed Facebook’s “Supreme Court.”

Representing Southeast Asia on the board is Endy Bayuni, a senior journalist at Indonesia’s English-language publication The Jakarta Post. FOB will begin reviewing complaints in mid-October and aims to resolve at least 100 cases globally in its first year, focusing on issues with broad societal impact, Bayuni told KrASIA. “We’ll take cases with huge impact and real-world consequences. The board must decide on a case within a maximum of 90 days.”

FOB’s formation is Facebook’s latest effort to win public trust, but there are doubts as to whether a board with such limited membership can truly make a dent in the ways extreme elements among Facebook’s users exploit the platform. Many have also questioned FOB’s stated independence and authority. How will the board’s operation affect Southeast Asia, where posts on Facebook have been linked to truly detrimental outcomes, from election interference to acts of genocide?


Business as usual?

Bayuni maintains that FOB will make a difference. “We’re aware of this skepticism, considering that there are millions of pieces of problematic content that Facebook faces every day. But let us work for now as we’ll improve from time to time in order to produce truly credible content moderation,” he said. “The board is not under Facebook, we have the authority to overrule Facebook’s decision or even Mark Zuckerberg’s.”

Yet when it comes to Southeast Asia, the numbers speak for themselves: Facebook’s total regional user count was around 377 million as of January 2019, according to Statista. That’s nearly 19% of the platform’s 2 billion monthly active users, a far larger share than the single seat, or 5%, that Southeast Asia holds on FOB’s 20-member roster.

The company has a troubling record in Southeast Asia, where misuse of its platforms has led to dire consequences. In 2018, Facebook’s director of Asia Pacific policy, Mia Garlick, admitted to Reuters that the company was “too slow to respond to the concerns raised by civil society, academics, and other groups in Myanmar,” where the Rohingya population is still persecuted. Elsewhere in the region, Facebook and WhatsApp were used to spread misinformation during the presidential elections in the Philippines in 2016 and in Indonesia in 2019.

Bayuni agreed that FOB should have more representatives from Southeast Asia. He hopes to see another figure from the region when the board adds 20 more members next year. “I believe Facebook is aware of problems in Southeast Asia. The firm has commissioned human rights impact assessments in Myanmar, Indonesia, and Cambodia, to evaluate its role in those countries. From those reports, we can see that Facebook is willing to improve standards and policies to create better outcomes in the society.”

FOB references the Rabat Plan of Action, outlined by the Office of the United Nations High Commissioner for Human Rights, to define hate speech, weighing the distinction between freedom of expression and incitement to violence, Bayuni said. While Facebook’s guidelines may be adjusted to fit local regulations, the board’s mandate is to adhere to international human rights law.

“Every country has its own laws. As a business platform, Facebook may need to comply with local laws so it can operate in that country. However, sometimes local regulations are not in accordance with international law, such as restriction to freedom of speech and expression. When this happens, the board will always stick to international laws,” Bayuni said, citing the company’s policy in Thailand, where the government sued Facebook in September for ignoring requests to restrict content that is critical of the monarchy.


Keyboard warriors and cyber troops in Indonesia

With 184.76 million users in Indonesia in 2019, according to Statista, Facebook is fertile ground for individuals or groups that want to spread fabricated content and hate speech. The government is also able to use the platform for surveillance, censorship, and threats directed at dissidents and critics.

A 2019 research report by the University of Oxford indicates that the platform is used by Indonesian cyber troops to spread pro-government propaganda and run smear campaigns against the opposition, driving societal division and polarization.

Two high-profile cases of cyber attacks and manipulation unfolded during the 2017 Jakarta gubernatorial election. The first involved Saracen, a group that funneled fake news and hate content to more than 800,000 followers on Facebook. Due to Saracen’s wide reach, a single post could pull in as much as IDR 100 million (USD 7,500). Saracen actively fabricated and spread hoaxes about former Jakarta governor Basuki Tjahaja Purnama, who was a candidate in the election.

The goal? To exploit pre-existing ethnic, racial, and religious differences to fan division and political unrest until voting day, all while turning a profit.

Another group, the Muslim Cyber Army (MCA), operated like Saracen and had over 468,000 followers across its three Facebook groups. MCA, however, placed President Joko Widodo in its crosshairs beginning in 2018, a year before Indonesia’s 2019 presidential election.

Facebook shut down both groups’ pages and linked accounts, then issued a statement saying that Saracen and MCA engaged in coordinated inauthentic behavior in Indonesia, misleading others about who they were and what they were doing. Indonesian police arrested the ringleaders of both groups for “spreading hoaxes.”

In 2019, Facebook also shut down pro-government accounts spreading fake news about the controversial topic of human rights abuses in West Papua.

But in all of these cases, Facebook’s action was too little, too late. The damage had already been done. These groups were removed from the platform only after hitting peak reach, having done far more than seed conspiracies in the public consciousness.

For now, critics don’t share Mark Zuckerberg’s view that the Oversight Board will have a huge impact on the disinformation that permeates Facebook. Photo by Annie Spratt on Unsplash.

Face-offs with authoritarian states

Zuckerberg calls Facebook a “powerful new tool” that lets its users “stay connected to the people they love, make their voices heard, and build communities and businesses.” However, the social media platform has time and again been an equally powerful tool for authoritarians and their proxies, as well as fake news mills that misinform for profit. These are the entities that FOB’s 20 board members are up against.

The Thai government has repeatedly accused Facebook of failing to comply with requests to restrict content that it deems defamatory to the monarchy, such as that posted on the million-strong Facebook group “Royalist Marketplace,” which was taken down in late August only to re-emerge under a similar name. As Bayuni mentioned, Facebook is being sued by the Thai government.

Thai authorities continue to tighten their control over cyberspace, with plans to block over 2,000 websites, including Facebook pages and Twitter accounts, before mid-September, even as anti-government protests seeking to curb the monarchy’s power carry on.

Over in Vietnam, Facebook bowed to the government’s pressure and agreed to censor posts after state-owned telecom firms knocked Facebook’s servers offline for seven weeks starting in mid-February, choking off traffic to the platform (and hence slashing ad revenue), according to Reuters. Human Rights Watch criticized Facebook’s move as setting “a terrible precedent by caving to the government of Vietnam’s extortion.”

“Now other countries know how to get what they want from the company, to make them complicit in violating the right to free speech,” John Sifton, Asia advocacy director at Human Rights Watch, said in a statement.

All of this suggests that Facebook’s shifting of goalposts from country to country lies beyond the purview of FOB, which can only review appeals over contested content. Even with FOB’s establishment, will Facebook’s own policies undermine the board’s effectiveness?

Nearby, there are steps in the right direction. In Myanmar, Facebook made the unprecedented move of shutting down the Facebook accounts of Myanmar’s army chief and several senior military officials in August 2018 for posting anti-Rohingya content, but only after UN investigators said the military carried out mass killings and gang rapes of Muslim Rohingya with “genocidal intent.”

As Myanmar is set to hold its general election on November 8, pressure is mounting again on Facebook, the country’s most widely used social media platform. Local observers have highlighted the lack of effective measures to combat fake news. It’s up to users to utilize the “report” function on Facebook to flag problematic posts.

That’s all to say that FOB, under its current mandate, cannot address fundamental issues related to the prevalence of incendiary content on Facebook.

“Crucially, this system does not contemplate the review of content that has not been removed at all and provides no remedies to a user, or a non-user for that matter, who feels that content is injurious to them and wants to ‘appeal’ to the Board against the non-removal decisions at lower level,” wrote Carlos Lopez and Sam Zarifi, two legal experts at the International Commission of Jurists, in an op-ed for the international law blog Opinio Juris.

As FOB begins its work, a group of civil rights leaders, journalists, and scholars has formed its own group to keep fake news on Facebook in check. Called the “Real Facebook Oversight Board,” the team counts among its ranks Maria Ressa, the co-founder and CEO of Philippine media company Rappler. Ressa has been sentenced to six years in prison for “cyber libel” in what Reporters Without Borders calls “Kafkaesque” legal proceedings.

The Real Facebook Oversight Board aims specifically to discuss the platform’s role in shaping the upcoming US elections, though it is easy to imagine the group’s conversations extending to other geographies, including Southeast Asian countries, where an overhaul in how content is disseminated is desperately needed.

Khamila Mulia
Khamila Mulia is a seasoned tech journalist at KrASIA based in Indonesia, covering the vibrant innovation ecosystem in Southeast Asia.