
Arbitrary blocking or filtering

There are no generic bans on content

The New Zealand government does not operate any known blocks on general Internet content or platforms (such as YouTube or Facebook).

Sites are not prohibited solely because of political or government criticism

Websites are not prohibited solely because they contain criticism of the government or of political matters. This standard is therefore met.

State blocks or filters websites based on lawful criteria

State blocking of child sexual abuse images does occur as there is one State operated content blocking system. The Department of Internal Affairs, in partnership with Internet Service Providers, operates a voluntary Digital Child Exploitation Filtering System (DCEFS). A Code of Practice was developed for the DCEFS and most recently updated in March 2013.14

The Code provides that “only website pages containing images of child sexual abuse will be filtered and that the privacy of ISP customers is maintained. The filter will not cover email, file sharing or borderline material.”15 When the filter operates on designated images, the content is blocked and warning notices are displayed. The filter operates through ISPs. Three major ISPs, accounting for approximately 80% of Internet user accounts, operate the filter; as a result, their customers are not offered a choice between a filtered and an unfiltered Internet connection.
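The behaviour described above can be illustrated with a short sketch. This is purely illustrative: the DCEFS's actual design and filter list are not public, and the blocklist entries and warning wording below are hypothetical. It models only the observable behaviour, where requests for listed pages are answered with a warning notice instead of the content.

```python
# Purely illustrative sketch of filter behaviour as described in the text.
# The real DCEFS implementation and its filter list are not public;
# the blocklist entries and warning text here are hypothetical.

BLOCKLIST = {"blocked.example/page"}  # hypothetical filter-list entries (host/path)

WARNING_NOTICE = (
    "This page has been blocked because it appears on a government filter list."
)

def resolve_request(url: str) -> str:
    """Return a warning notice for listed pages; pass other requests through."""
    # Normalise by stripping the scheme so entries match on host/path.
    normalised = url.removeprefix("https://").removeprefix("http://")
    if normalised in BLOCKLIST:
        return WARNING_NOTICE
    return f"(fetch {url} normally)"
```

Note that, consistent with the Code, a filter of this shape operates on specific pages rather than whole domains, and intercepts only web traffic, not email or file sharing.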

The Department operates the DCEFS in conjunction with enforcement activity undertaken by its Censorship Compliance Unit. This includes “online investigations into the trading of objectionable images on peer to peer networks, the prosecution of offenders and coordination with other enforcement agencies to have objectionable websites taken down”. Enforcement efforts include international cooperation to identify and rescue victims, ensure offending websites are quickly shut down, and prosecute site owners.16

State action to prevent child abuse is a permitted limitation on freedom of expression, provided it meets the requirements of necessity, proportionality and lawful authority (including links to offline enforcement strategies). The DIA filter, which is confined to images, appears to be narrowly framed and linked to offline enforcement regimes. The DCEFS is overseen by an independent multi-stakeholder reference group of respected persons and has a number of checks and balances in place. However, the content blocking is not subject to the same checks and balances as apply to censorship of other material, such as films and broadcasting.

There is no clear legislative authority for the DCEFS, and this lack of lawful authority means the DCEFS does not comply with the La Rue framework. This is a serious concern and raises other human rights implications: it suggests that regulatory or judicial mandate for content blocking is not required, and that ISPs may be amenable to “voluntary” participation in other cases even when they have no legal obligation to do so.17

Aside from child sexual abuse images, specific website content can also be removed by Court order and domain names may also be lawfully blocked by Court order.18 The number and range of informal requests for takedown of online content without court order is unknown.

The technical community generally agrees that content filtering and blocking are ineffective: they do not remove the content from the Internet and do not control online expression, but rather:19

… raise human rights and freedom of expression concerns, and often curtail international principles of rule of law and due process. The negative impact of DNS filtering far outweighs the short-term legal and business benefits.

Internet Blocking and Filtering by Public or Private Agencies

There is widespread filtering in public and private networks, predominantly through contract-based filtering agreements (consent to terms and conditions of access and use) as a pre-condition to use of the network. Internet filters and content blocking occur in diverse public places where Internet users access wifi services (whether for free or for payment), including in public libraries, airports, universities and schools, hospitals, Internet cafes, on public transport, in public spaces such as parks, and in shopping centres. Many private networks, such as businesses and other workplaces, also operate content filtering and blocking policies.

There is little research on how public and private content filtering and blocking policies are developed, and few appear to be available for public scrutiny. Most public wifi networks do, however, have terms and conditions of use which authorise filtering and content blocking of “offensive” content. There have been a small number of news media reports of complaints about content blocking or filtering on public wifi. In general, tolerance of Internet filtering appears to be high, and awareness of or concern about how public Internet access filtering and content blocking policies are developed appears to be low.20 There appears to be little or no available New Zealand research about New Zealand Internet users’ knowledge of tools for circumvention of filtering and content blocking. WIPNZ data on the use of filters could not be analysed during the research period.

The issue of Internet content blocking was brought to public attention on 19 January 2012. In a dramatic law enforcement raid, New Zealand police officers assisted the United States of America’s Department of Justice to seize and shut down the file-hosting site Megaupload. Criminal proceedings were commenced against its owners, and assets belonging to the company and its New Zealand resident owner, Kim Dotcom, were frozen. Extradition requests were made for four people, including Kim Dotcom, in relation to racketeering and money laundering related to copyright infringements under United States law. Megaupload site users, including New Zealanders with content hosted on the site, have been unable to access their online storage facility since the raid and have not been advised whether their accounts have been accessed by authorities. Kim Dotcom subsequently opened another site, a cloud-based storage facility with user-controlled encryption keys.

The Megaupload case has heightened awareness about online security and privacy. The case also had widespread political ramifications because of concerns about the use of armed law enforcement officers to serve warrants, search premises and seize property.

A 2012 survey by InternetNZ on Internet filtering found public awareness of the DCEFS is extremely low.21 Fewer than 9% of respondents knew whether their ISP applied the filter to their Internet connection, and only 46% had heard of the filter at all. Opinion was divided on whether an Internet filter that blocks access to child sexual abuse images actually lowers abuse: 40% of the people surveyed thought it likely an Internet filter would lead to a reduction in abuse, 32% thought it unlikely and 27% were neutral.

Frank La Rue highlights the ability of people to tailor their own Internet filtering as a way to empower Internet freedom. However, the InternetNZ survey found that “most people (69%) have not on their own initiative installed filtering technology on devices at home,” suggesting that “people could benefit from more at-home filtering options or information educating them on at-home filtering options.”22 Despite this, when asked who should make the decision to filter, almost three quarters of respondents (73%) stated that they wanted to either directly choose or have their ISP choose whether a filter is applied to their account, while almost one quarter (23%) wanted the Government to decide.23

State provides lists of blocked and filtered websites

While the government has provided information about the processes by which the DCEFS filter has been developed (including the membership and modalities of the independent reference group), the lists themselves are not publicly available. The list of filtered websites is maintained and checked monthly; each entry requires verification by three investigators and is checked for false positives. Aggregated data about the number of blocked sites is available. InternetNZ has criticised this lack of transparency:24

“Routing of legitimate traffic to a Government agency creates potential for misuse of the process, irrespective of best intentions. For example, the Government could reroute traffic to identify visitors to activist websites it was suspicious of. This can only be mitigated by publicising the list of diverted IP addresses and the domains to which they correspond.”

Blocked or filtered websites have explanation on why they are blocked or filtered

Sites filtered by the Department of Internal Affairs receive a warning notice explaining in general, but not specific, terms why they are filtered. There appears to be no best practice or industry or public policy “standard” for the wording of explanations on why sites are blocked or filtered.

Content blocking occurs only when ordered by competent judicial authority or independent body

Content blocking occurs by Court order, but it is not known whether it occurs only by Court order. There is no information about informal requests for content blocking by law enforcement or other authorities. Anecdotal evidence suggests informal requests have been made in relation to political content, including a request by law enforcement during the 2009 general election for takedown of a website of a political party promoting decriminalisation of marijuana.25

The .nz domain name service is overseen by the Domain Name Commission Limited (DNCL).26 DNCL’s public policy on domain name takedown is contained in the Registrar Authorisation Agreement, which provides that Registrars may only take down or block domain names by Court order. Content blocking which occurs without proper judicial authorisation is grounds for disciplinary action against a Registrar. The DNCL rationale for this position is:

“The addition of a take down clause is considered appropriate given the increase in reported incidents overseas where domain names have been taken out of the DNS, or cancelled, on the request of law enforcement or other agencies, rather than on the basis of a properly obtained court order making clear directives in respect of the domain name/s involved. Our intention in adding this clause is to clarify that Registrars are not expected to respond to take down demands from law enforcement in the absence of a clear court order seeking that action. If Registrars receive such a demand, they ‘should’ forward it to the Domain Name Commission for action with DNCL working with the appropriate agencies to resolve the matter. DNCL will accept the legal liability and support Registrars put in a difficult position.”

DNCL liaises with the party requesting takedown, advising that until a court order is obtained the Registrar is not permitted to comply with the written request to take down content. If the Registrar takes down content without lawful court order, their authorisation can be revoked.27

Where blocked or filtered content is child pornography, blocking or filtering online content is connected with offline national law enforcement strategies focused on those responsible for production and distribution of content.

Yes, as noted above.


There are protections against content blocking and informal takedown of online content by Registrars. There is content blocking by government of child sexual abuse images. Action by government to prevent access to such material is permissible as a limitation on freedom of expression under international law. However, the DCEFS raises serious concerns because it does not have clear legal authority. Survey results suggest many New Zealanders do not know about content filtering in general or the DCEFS specifically, are not actively using tools to filter their own content, and do not question the filtering policies that regulate their use of public and private networks.

14 “Digital Child Exploitation Filtering System Code of Practice” (Department of Internal Affairs, New Zealand Government), available at:

15 Ibid.

16 Ibid.

17 See also “InternetNZ: Filtering Position Paper” (Wellington, January 2010), p 4.

18 Domain Name Commission “Registrar Authorisation Agreement” (2012) clause 15.

19 See for example “Internet Society Perspectives on Domain Name System (DNS) Filtering” (Geneva, 2011), 1.

20 See, for example, Nethui workshop reports 2011 and 2012:

21 InternetNZ Survey on Internet Filtering (Wellington, April 2012), p 1. Available at

22 InternetNZ, above n 21, p 1.

23 Ibid.

24 InternetNZ, above n 17, p 6.

25 InternetNZ member’s list discussion.

26 For a brief overview of the history of domain names in New Zealand see: Keith Newman, Connecting the Clouds: The Internet in New Zealand (Activity Press, Wellington 2008) 292-296.

27 See also:
