Content moderation
In early 2021, the Australian government enacted the News Media and Digital Platforms Mandatory Bargaining Code, which requires Facebook and Google to pay Australian media for their news content.
The third day of the Internet Rules: Unboxing Digital Laws in South Asia workshop focused on freedom of expression laws in the region, as well as trends and challenges in content moderation and intermediary liability.
RightsCon 2020 taking place entirely online not only demonstrated how important the digital space has become for many spheres of life, but also illustrated how essential it is to protect digital rights as a fundamental part of human rights.
APC welcomes this consultation, as it is timely and integral to our work. The pandemic poses challenges for content moderation, and while we recognise that these are extraordinary times, human rights laws and principles should be the default standards guiding companies’ policies and procedures.
In response to the Global Internet Forum to Counter Terrorism's (GIFCT) call for expressions of interest to join its Independent Advisory Committee (IAC), APC and other NGOs expressed their concerns about the IAC specifically, and about GIFCT's growing role in regulating online content more broadly.
First-time IGF participant Miru Lee of the Korean Progressive Network Jinbonet shares her reflections on the discussions around two topics of particular interest to her: the human rights impacts of AI, and the complexities of content regulation in the online space.