The third day of the Internet Rules: Unboxing Digital Laws in South Asia workshop opened with a free-form session in which participants exchanged insights and experiences on digital laws and their implementation in the region, sharing stories and takeaways drawn from their own work.
Marking the halfway point of the curriculum, the third day offered a deep dive into more complex issues: freedom of expression, content moderation, and intermediary liability.
The first session had Sadaf Khan of Media Matters for Democracy as the resource person. The discussion examined freedom of expression as a fundamental right that must be guaranteed. Article 19 of the Universal Declaration of Human Rights states that everyone has the right to hold opinions without interference and to seek, receive and impart information and ideas through any media. The article is also forward-looking: because it refers to any media, it remains applicable to the digital sphere.
However, laws governing freedom of expression are often challenged because the definitions of speech and expression can be subjective. To address this, laws should be clear and applied equally to all citizens. Further, any restriction on freedom of speech should be clearly defined, with an explanation of why it is necessary and how it is proportionate to the legitimate aim it protects.
The session emphasised that any restriction on freedom of expression should be lawful, legitimate, necessary and proportionate. One must ask oneself: if the laws are not applied equally, how do I know they will not be used against me?
Participants also discussed the grounds on which states often criminalise expression: religion, security, safety and defence, hate speech and incitement, decency and morality, and misinformation. Participants also highlighted the need to assess such laws against a multistakeholder framework that guarantees everyone's right to express themselves freely, without fear of persecution or criminalisation by the state.
The second session of the day welcomed Usama Khilji, director of Bolo Bhi, who shared insights on content moderation and intermediary liability. In this session, the participants recognised that content moderation is governed by a platform's community guidelines. However, states can override these guidelines and impose restrictions based on their own laws. Content moderation, in effect, becomes a political decision, as both governments and tech companies can dictate what does and does not appear online.
The session underscored that activists often bear the burden of flagging derogatory content, only for the same content to reappear on the platform. The algorithms platforms rely on pose further challenges, since they may not moderate content in local languages accurately.
When talking about content moderation and intermediary liability, one must ask the following questions: Is this company transparent? How do they moderate content? Are these standards implemented fairly?
Usama noted that content moderation may be improved by allocating resources for the localised application of standards, informed by a better understanding of each country's context. Tech companies should also be held responsible for how they enforce their own standards and should be called out when they exercise bias against a group of people. Lastly, one must consider whether the internet should be the same everywhere in the world (strictly adhering to international standards) or should adapt to community-based nuances and contexts.
Both sessions were followed by a dynamic round of discussion, with participants sharing personal experiences that grounded the topics in their own contexts and realities.
Click here for more information about the Internet Rules: Unboxing Digital Laws in South Asia workshop.