APC policy explainer: Disinformation
As the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression affirms, there is no universally agreed definition of disinformation. However, drawing on her 2021 report on the issue, disinformation can be understood as false information that is deliberately created to cause harm; this definition is also aligned with the one adopted by UNESCO. Disinformation is often organised, well resourced and reinforced by amplification techniques, including automated technology.
Misinformation, on the other hand, lacks the element of “intention” and may be disseminated unknowingly. There are also cases in which true information is manipulated or used out of context with the intention to mislead or to inflame certain feelings, such as hatred. This last example is sometimes referred to as “malinformation”.
All these different phenomena are difficult to handle precisely because they are difficult to define and related behaviour is hard to categorise. Some scholars have sought to frame disinformation as “viral deception”, pointing out that it consists of three vectors: manipulative actors, deceptive behaviour and harmful content.
APC views disinformation as a multifaceted, global and complex issue that should be understood as a symptom of much broader information disorders. Disinformation is not a new phenomenon, but it has acquired new dimensions – in terms of reach, speed and volume – with the expansion in the use of digital technologies and, in particular, social media. Technologies also allow a diversification of actors who produce and disseminate disinformation.
Research shows how disinformation campaigns particularly target minorities and vulnerable groups, as well as human rights activists and environmental activists, among others. APC is particularly concerned with gendered disinformation, which targets not only women but also feminist struggles and gendered discourse, and is used to silence women, push them to self-censorship and restrict their civic space. APC observes that the situation is even more striking through an intersectional lens: female political leaders and activists from racial, ethnic, religious or other minority groups are targeted far more often than their white colleagues. For APC, gendered disinformation should be considered a distinct phenomenon, separate from gender-based online violence, which requires specific monitoring and solutions.
The change we want to see
APC considers disinformation to be a complex and multifaceted problem that cannot be properly addressed by a fragmented approach. It is also a multistakeholder challenge that requires dialogue between different sectors – dialogue that needs to be built on transparency and participation in decision making. A holistic approach to understanding disinformation requires an analysis of our broader information ecosystems. Only such a holistic approach will allow us to identify solutions and preventive actions that build on the strengthening of the other spaces and actors that promote the flow of information, visions and ideas within our societies. And any solutions, in particular policy and regulatory measures, should be built on truly participatory processes and avoid broad criminalisation provisions. More specifically, to address disinformation, APC advocates for:
Healthy information systems that include robust access to public information; plural, accessible and diverse media contexts; independent and qualified journalism; and the possibility of expressing ideas safely.
Digital and media literacy programmes to counter information disorders. Such programmes could be carried out independently, but also embedded by states into regular education curricula.
Greater caution in the use of criminal sanctions to tackle disinformation, an approach that is often disproportionate, as the UN Special Rapporteur on freedom of expression has noted in the past.
A clear definition of disinformation and its differentiation from other information disorders, especially when regulations are adopted to address it. In addition, states should apply the three-part test of legality, necessity and proportionality to any measures taken, considering that attempts to curtail information disorders may significantly impact freedom of expression and opinion.
Increased access to data and information held by tech companies to allow us to better understand the phenomenon of disinformation.
The promotion by governments of digital inclusion, including universal and affordable access to the internet for all.
A human rights-based approach to guide companies’ content moderation processes (not just in how they respond to requests for takedowns, but throughout the entirety of their operations). This approach should be guided by, among others, the principles of accountability, equality and non-discrimination, participation and inclusion, transparency, empowerment, sustainability, and non-arbitrariness.
Special attention by platforms and governments to long-term issue-based disinformation campaigns, especially those targeted against specific groups and themes, including human rights, women’s rights and environmental issues.
An understanding of gendered disinformation as a specific phenomenon, separate from gender-based online violence, which requires specific monitoring and solutions.
How APC works on this issue
APC’s holistic approach to disinformation includes advocating for strategies that address the underlying factors of this phenomenon, for multidimensional and multistakeholder responses, and – in line with the UN Special Rapporteur on freedom of expression – for international human rights to serve as the guiding framework for addressing it.
APC contributes to discussion in global policy spaces on this issue, such as at the Human Rights Council and its Special Procedures, and the Freedom Online Coalition.
We work collaboratively with APC members to raise awareness around national-level initiatives aimed at tackling disinformation that themselves threaten human rights, such as the Brazilian disinformation bill.
We also engage critically with social media platforms, calling on them to review their business models and to align their terms of service and community guidelines with international human rights standards.
Countries worldwide are using legitimate concerns about online disinformation to deepen their control over the internet and its users. These policy and legislative initiatives share some similarities: they give executive bodies discretionary powers to decide whether a piece of content is false or misleading, along with the power to issue fines, order corrections or even hand down prison sentences for creating, publishing or disseminating such content. In these cases, creators, disseminators and publishers of disinformation are the main targets of regulation. As APC's work on this issue has shown, these criminalisation efforts often do not distinguish between lawful and unlawful expression, limiting the exercise of freedom of expression and allowing governments to exercise greater control and discretion.
APC's work in the last few years has also highlighted worrisome initiatives that have emerged in countries such as Malaysia, Singapore and Egypt. In Africa, disinformation laws or provisions exist in Kenya, Uganda and Tanzania, among other countries. As the African Declaration on Internet Rights and Freedoms Coalition has expressed, in many cases these initiatives have failed to demonstrate the necessity of limiting the right to freedom of expression through the criminalisation of so-called false news.
In the midst of the COVID-19 health crisis, emergency legislation in Palestine provided more space for restrictions on freedom of expression and privacy, since it included broad terms with no legal safeguards or measurable standards – for example, prohibiting posts, statements or news on social media related to the state of emergency that are not based on an official source.
Some spaces and institutions to engage with
Direct interaction with platforms, including through bodies such as the Facebook Oversight Board
The global Internet Governance Forum (IGF), for example, through the Best Practice Forum on Gender and Digital Rights, which will focus on gendered disinformation during 2021
Digital rights events and conferences such as RightsCon
Disinformation-related legislative discussions at the national level.