By APCNews 24 September 2019
On 23 September, the Association for Progressive Communications (APC) participated in the Christchurch Call Leaders’ Dialogue at the UN General Assembly in New York as a member of the Christchurch Call Advisory Network.
The meeting featured announcements and updates on the Christchurch Call, which was launched in May 2019 in response to the 15 March terrorist attack in Christchurch, New Zealand. During the tragic attack, the killing of 51 people was broadcast live on the internet, and then copied and redistributed widely, resulting in a rapid and wide-scale spread of fear and suffering.
The Call is a commitment by governments and tech companies to eliminate terrorist and violent extremist content online. It includes a set of actions designed to build greater capacity for tackling the challenge of terrorist and violent extremist content online, and outlines measures to stop the internet from being used as a tool for terror. The Call now has the support of 48 countries and three international organisations.
Among the announcements at the meeting was the creation of an advisory network, which represents a range of perspectives, including human rights, freedom of expression, digital rights, counter-radicalisation, victim support and public policy. Another announcement was the launch of a new crisis response protocol for governments and tech companies, aimed at improving coordination for better management of online impacts in the wake of terrorist and violent extremist attacks, as well as reforms of the Global Internet Forum to Counter Terrorism (GIFCT).
Although APC does not endorse all aspects of the Call (many of our concerns are outlined here), nor the fact that it counts among its supporters governments that actively or tacitly support the spread of violent extremist content online, we see this as an important process seeking to address a critical issue: the spread of violent extremist and terrorist content online. To that end, we have joined the Christchurch Call Advisory Network to help shape the direction of its implementation in a way that is grounded in human rights and takes a holistic approach: addressing the root causes of the problem rather than just proposing technological fixes, taking into account the full range of rights affected and the power differentials in societies, and avoiding measures that undermine or harm an open and secure internet.
The meeting featured over a dozen heads of state and leadership of technology companies. APC was honoured to speak as a member of the Christchurch Call Advisory Network, along with Dia Kayyali from WITNESS and Anjum Rahman from the Islamic Women's Council of New Zealand.
The full statement, presented by APC's Global Policy Advocacy Lead Deborah Brown at the UN General Assembly, follows below.
APC statement to the Christchurch Call Leaders’ Dialogue at the UN General Assembly in New York,
23 September 2019
For too long, efforts to address violent extremist and terrorist content have focused almost exclusively on Islamic terrorism, overlooking violent misogyny, white supremacy, far-right and anti-LGBTIQ extremism. The Christchurch Call provides an opportunity to shift course and address these gaps. We offer some suggestions to that end.
First, engage those affected and embrace holistic solutions. Violent extremism can be hyper-local, using language, phrases and images that are not picked up by content moderators or algorithms, but that can spark violence in a particular context. There is no technical silver-bullet "fix" to this problem. Ad-based business models fuelled by algorithms that favour extremism are a big part of the problem.
Second, rein in the outsourcing of speech regulation to private actors. Reinforce the rule of law, due process and judicial oversight. More transparency and accountability are needed. Companies should make databases of affected content available for independent review and audit, and provide remedy mechanisms to people using their services – in all countries and languages.
Third, states must refrain from enabling and perpetuating violent extremism either by their own actions or inaction. States have the primary responsibility to nurture an environment where diversity, peace and democracy thrive.
Fourth, efforts to curb the spread of violent extremist content online must consider the different technical layers of the internet. Regulatory interventions should be proportional, narrowly tailored to where they can be most effective, and mitigate unintended consequences for people's freedom of expression and ability to share information safely and securely.
Finally, power matters. It is important to be clear-eyed that the same powerful technology companies that are central to this discussion have business models that profit from extremist content. Affected groups, civil society, human rights defenders and the technical community must be at the table. We bring technical expertise, research and lived experiences. We are eager to work together towards solutions, but we need resources and assurance that our voices will be heard. We thank the Prime Minister of New Zealand for her leadership in this regard.
This statement can also be found here.