On 15 March 2019 an anti-Muslim terrorist attacked two mosques in Christchurch, New Zealand, killing 51 people and injuring 49. Christchurch is my hometown and I was quickly on the phone to family members, checking their locations, texting those in lockdown and watching with horror as the reality of what was happening unfolded before me. I thought of my APC colleagues in South Africa, India, Pakistan and elsewhere who have experience of these attacks in their communities and was determined to act in solidarity with the Muslim community and the wider New Zealand public in response to the attacks.
Our Prime Minister, Jacinda Ardern, moved quickly in an outstanding display of leadership. She travelled immediately to the Muslim communities affected, framed the attack as one on all New Zealanders, not only Muslim communities, vowed compassion, pledged to ban semi-automatic weapons of the kind used in the attack, and steered her people through a difficult emotional time of grief, anger and shock.
This was the worst terrorist attack in New Zealand in the modern era, and a key element of the attacker’s operation was to live stream the attacks on the internet. While Facebook took down the live stream in less than half an hour, the video had by then already been viewed thousands of times and shared on a wide variety of sites. Within hours of the attack, the country’s Chief Censor had ruled the video objectionable, making it illegal to possess or share. Prosecutions for sharing the video were filed within days. Even so, in the weeks that followed, the video remained available online.
In the days following the attack, Ardern questioned the use of social media for live streaming. Several weeks later, Ardern joined with French President Emmanuel Macron to issue a global call for an end to the use of social media for acts of terrorism.
“The Christchurch Call” is a collaboration with the French government: a global call to a meeting in Paris next week, on 15 May 2019, which seeks to “bring together countries and tech companies in an attempt to bring to an end the ability to use social media to organise and promote terrorism and violent extremism.”
Two months on from the attacks, and in the week before this global event, it is plain that the internet has been used for both good and ill in New Zealand. The internet enabled people to share and support the massive public outpourings of grief. Tens of thousands of people attended rallies throughout the country to decry the attack, holding public Muslim prayer vigils to show support for Muslim communities, to grieve, to come together in acts of democratic solidarity, and to call for deeper examination of and honesty about racism and religious intolerance. Those are positive, if painful, conversations, and the internet seemed, at first, to be enabling “a rare national unity, born out of revulsion with what had taken place.” But the internet was also used to spread misinformation about the attack and to confuse or misinform about the proposed gun law reform, while a host of expressly racist and Islamophobic groups were set up on Facebook and other platforms.
Having worked on internet rights for the last 10 years, I understand the complexities of promoting and protecting human rights and the technical difficulties of online content regulation. I support, and have advocated strongly for, both an open and uncapturable internet and human rights online. Regulation of online content is fraught with problems, including how to ensure lawful content (such as evidence of war crimes) is not swept up by definitions of "terrorist" content. However, international human rights standards provide a framework for balancing these different rights. At the same time, hard questions must be asked about whether the multistakeholder cooperation processes that create agreed norms at the technical DNS layers of the internet are really working in the social media environment. Technical community members point out that this cooperative norm-making process does not happen on social networks and that, in its absence, regulation has now arrived as the only realistic option.
The local New Zealand internet community has been working hard to respond to these issues, determine how best to engage with government, and create spaces for community discussions. This week InternetNZ launched a forum for civil society and the technical community to participate in the lead-up to the meeting in Paris and to support these discussions. The forum is open to all interested civil society and technical community members, in New Zealand and globally, for discussion and coordination. A meeting, Voices for Action, will take place in Paris on 14 May, where Ardern will meet with civil society leaders who are preparing a joint statement with the technical community.
Globally, commentators are worried about the lack of adequate civil society engagement in the process, but still support involvement. Some have also criticised the absence of platform providers from Asia, and it remains to be seen whether any will participate.
Yet despite the short notice and rapid preparations, this meeting is an opportunity for internet rights activists to help develop options for better protecting human rights online and preventing the spread of extremist terrorist content. In doing so, we have a unique opportunity to ensure governments and other stakeholders show the leadership that has been asked of them, so that the same human rights we have offline are respected and protected online.
Image: Dunedin mosque, by Joy Liddicoat