Countering misinformation online: Strengthening media literacy skills to fight “fake news”

In recent years, the scope and consequences of online disinformation and misinformation campaigns have drawn increasing attention, as concern builds about their ability to exacerbate social polarisation, undermine public trust in the media and influence political decisions. Certainly, the advent of social media platforms and other online forums has permitted the dissemination of information at an unprecedented scale and speed, while questions of governance, legal frameworks and moral obligation in these spaces have continued to lag behind technological development. This fraught situation raises the question: what should be done?

During the “Countering misinformation online: policies and solutions” panel held on Day 3 of the Internet Governance Forum, the four participating panellists echoed concerns and sentiments heard throughout the conference. How can the powerful, negative impacts of disinformation be mitigated without enabling governments and the private sector to infringe upon rights to freedom of expression and anonymity?

Moderator Asad Baig of Media Matters for Democracy, an APC member organisation in Pakistan, made no bones about his frustration with the oversimplification of this issue by government and industry players, linking this tendency with bias toward censorship and criminalisation of online activity. “My own personal agenda,” he stated, is “to see how this subject, misinformation, is not really something we could put a binary solution to. A black-and-white model to take care of it. This is the solution being offered in South Asian countries, including Pakistan. Get Facebook to remove the content, we’re done, that’s the end of it.” This approach, he countered, is “not the solution” and needs to be debunked. 

At the same time, Baig did not downplay the severity of the misinformation problem, noting that “the World Economic Forum has categorised it as one of the biggest threats to human society.” He cited a 2016 case in Pakistan, in which the country’s defence minister responded to a fake news report of a nuclear threat from Israel with a retaliatory tweet stating “Israel forgets that Pakistan is a nuclear state too.” He also mentioned the alarming sophistication of emerging artificial intelligence technologies, which can algorithmically generate realistic photos and videos imitating real people, as well as a growing trend of WhatsApp misinformation campaigns, which are very difficult to track. Finally, he questioned the efficacy of new regulatory initiatives, citing legislation in Germany and similar models emerging in South Asian countries. If, as in Pakistan, the government decides to become involved in fact-checking, does “anything that is politically critical of them become misinformation and disinformation?”

Ishara Danasekara, co-editor of Vikalpa Voices in Sri Lanka, highlighted the crucial relevance of political and social context in understanding how disinformation proliferates. Sri Lanka, she explained, is “recovering from nearly three decades of civil war” and it is a “divided country… along lines of language, land, education, unemployment.” Public trust in government and political processes is remarkably low in the country, and many major news outlets are owned by political figures. In an environment this complex, she asked, with the added challenge of poor literacy rates and a “low habit of fact-checking,” “how do we go back and counter this fake news and misinformation spreading?”

Representing DW Akademie, the media development wing of the German public broadcaster, Roslyn Moore emphasised media and information literacy as essential to countering the spread and impact of disinformation. She suggested that this education should be counted as one of “our basic literacies… if you want to be an engaged, active citizen.” The Akademie has embraced the UNESCO definition of media literacy, a thorough account of digital empowerment which includes not only the ability to access information but also the capacity to analyse it, understand its underlying agenda and, importantly, participate actively in its creation. “Ultimately, what we want is for [people] to be able to fight for their rights,” she stressed. In pursuit of this goal, the Akademie has successfully lobbied the education ministries of Cambodia and Palestine to incorporate media literacy into their curricula, developed networks of media experts and created journalist training programmes in media literacy. Mirroring the panel’s concerns, she called for an enabling rather than reactive approach to misinformation. “The idea”, she said, “is to strengthen the ability and level of citizens before looking at regulation… from my experience, it’s normally led to the crackdown on freedom of expression and a crackdown on journalists.”

Expanding on this theme, Padraig Hughes of the Media Legal Defence Initiative explained his organisation’s focus on “challenging legislation that introduces provisions that seek to regulate fake news and a range of other issues relating to the cyber world.” Many recent legislative measures “relating to preventing electronic crimes or cybercrime acts… contain some very problematic provisions from our perspective,” he stated, “which will have a serious impact on freedom of expression.” The issue, he elaborated, is that much of this regulation is “vague and overbroad in the sense that it doesn’t comply with well-established freedom of expression standards.” He also highlighted the similarity of these legislative provisions to measures introduced during the “war on terror”, in terms of the “loose language” that permits abuse of power.

What ultimately emerged from the conversation was the need for broader responses to legislative overreach and online restriction. As Hughes remarked, “What we’ve seen in the studies of this is that really strong coalitions and civil society and different players in the system are the way to go. It’s much better to try to build a movement of concerned actors” than to take on the task of oversight alone.

But beyond this question of resisting problematic regulation, the pervasive and, at times, profoundly harmful effects of disinformation remain. Responding to this challenge will require a more fundamental shift in our relationship to the internet and to each other. As Danasekara emphasised, “Attitudinal change is a must, because otherwise, however we initiate counter programmes to eradicate disinformation or misinformation, people still want to believe it as true information.”

And yet, despite the political tensions which stoke these “fake news” campaigns and the increasing sophistication of online manipulation tactics, we still have the ability, as Moore stated, to “give people good enough critical analysis skills so that they know what to look out for.” While it may not be a “silver bullet”, it is an important start. 
