Content regulation in the digital age: Submission to the United Nations Special Rapporteur on the right to freedom of opinion and expression

Author: APC
Publisher: APC

We welcome this topic because it is current and integral to our work. On the one hand, there is a great deal of “noise” in the mainstream media about so-called “fake news”, and what appears to be a fairly rushed response from platforms in the form of increased in-house regulation of content. On the other hand, human rights defenders and activists we work with express concern that platforms are removing some of their content in a manner that suggests political bias and reinforces societal discrimination.

The topic is also particularly important to APC as we continue to seek solutions to combating online gender-based violence that do not themselves limit freedom of expression online. Too often, the response to offensive and dangerous though lawful expression is the one that seems simplest: censorship, in the form of takedowns, blocking or filtering of content. Censorship is increasingly implemented by private actors, with little transparency or accountability, and disproportionately impacts groups and individuals who face discrimination in society – in other words, the very groups who look to social media platforms to amplify their voices, form associations and organise for change. For civil society, and for multistakeholder forums that deal with content regulation in the digital age more broadly, this is a useful moment to assess the strengths and shortcomings of state regulation and self-regulatory regimes when it comes to protecting the wide range of rights that internet users around the world rely on, online and offline.

General recommendations from APC

State responsibility: The primary responsibility for respecting, protecting and fulfilling human rights lies with the state. This includes the duty to protect against human rights abuses by third parties, including business.

Private sector accountability: All companies, regardless of their size, have a responsibility to respect human rights, by not infringing on the human rights of users and by addressing any adverse human rights impacts with which they are involved. This submission focuses on the large internet platforms, partly because their services are used globally by a very large number of users. We also question whether these large platforms need to be treated as a distinct category because of the way in which they dominate the market: beyond their sheer size, we question whether there is something distinct in the way they operate, given their network effects and the lack of alternatives, that affects users’ exercise of freedom of expression.

Human rights impact assessments: Given that companies are constantly introducing new products, updating their policies and expanding into new jurisdictions, human rights impact assessments should be carried out on an ongoing basis, not as a one-time event. These assessments should cover all human rights that companies’ policies may affect: not only freedom of expression and privacy, but also economic, social and cultural rights, the right to be free from violence, and the right to participate in public life, among others. In addition, companies should consider how their policies can strengthen, rather than undermine, due process.

Due process: Every user should have the right to due process, including, in every case, expeditious notification of a content takedown, the reason for the takedown, and the option to appeal the company’s takedown decision.

Implementation of content restriction: While platforms are increasingly publishing their content restriction policies online, much more transparency is needed about how those policies are implemented. In particular, greater attention is needed to ensure that policies uphold the international human rights principles of non-discrimination and equality, and take into account contextual factors such as language, culture and power dynamics.

Alternatives to taking down content: Content removal is just one way of addressing content that may be harmful to other users. Platforms are building tools that let users filter ads and other content. While this approach has the potential to deepen “information bubbles”, it can also empower users to make informed choices about the content they see. This may be preferable to companies making these decisions for users, since companies’ criteria will ultimately be influenced by which content is deemed profitable. We encourage companies to explore tools that give users more control over their own online experience.

Transparency: Increased transparency is needed in a number of areas in order to better safeguard freedom of expression against arbitrary content removals and to better understand how content viewed online is moderated.

Multistakeholder process: We recommend the establishment of a multistakeholder process, building on existing multistakeholder initiatives, with input from different parts of the world, to develop global guidelines or norms to address the challenge of harmful content within a rights-respecting framework. This multistakeholder process could explore whether establishing a more traditional self-regulatory framework would have positive or negative consequences.


Image source: “Censored” rubber stamp by Piotr VaGla Waglowski, http://www.vagla.pl.
