Chapter 8 - Free speech abuses: options for reform

Reforming the law

Problems of coverage of the law

As discussed, there are two main problems with respect to coverage:

  • defining the type of communication covered by the statute; and
  • providing a legal remedy to novel publishing harms arising on the internet.

We discuss our preliminary proposals for dealing with these problems below.

Type of communication covered

As noted, there are wide variations in how statutes define the type of communication they cover. Some extend their prohibitions, expressly or impliedly, to any form of communication.

That is the case with many of the provisions prohibiting publication of material suppressed by a court. The standard phrase “in any report or account of proceedings” could hardly be wider. It would appear to cover a report in any medium, including a blog or other website (although some residual doubts about the extent of this are the subject of an appeal, as noted in the previous chapter). Most of the Crimes Act provisions about threats and incitements are couched in the most general terms: they can be communicated in any way. Those provisions, although written a long time ago, are in terms flexible enough to do service in any communication environment.

Other provisions expressly, and in some detail, extend the prohibition to elements beyond the mainstream media. An example is the Coroners Act 2006, which prohibits the making public of certain information about self-inflicted death. “Make public” is expressly defined as meaning publishing by means of broadcasting, a newspaper, a book or magazine, a sound or visual recording, or “an internet site that is generally accessible to the public or some other similar electronic means”.330 In similar vein, the Films, Videos, and Publications Classification Act 1993, which creates offences relating to objectionable publications, provides that a publication may be supplied, distributed or imported not only in physical form but also by means of electronic transmission “whether by way of facsimile transmission, electronic mail or other similar means of communication other than by broadcasting”.331

Other provisions, however, are narrower, extending their reach only to quite specific types of communication. For example, the Prostitution Reform Act 2003 prohibits advertisements for commercial sexual services on radio or television, in a newspaper or periodical (except in the classified advertisement section), or in a public cinema.332 That prohibition is quite specific, and does not apply to advertisements on the internet or in other forms of new media.

Such instances are few enough, but we think there should be a perusal of the statute book to make sure that all controls on communication are widely enough expressed to fulfil their purpose. In some areas – and the Prostitution Reform Act may be one – there may be a genuine reason for confining the offence to the mainstream media. In others there may not.

In this regard we draw attention to three provisions in particular where we think the existing drafting would benefit from amendment to make it clear beyond doubt that they cover communication in cyberspace. Perhaps they would be so interpreted now, but there is advantage in spelling it out beyond doubt.

The first is the Harassment Act 1997, where both the civil and criminal provisions use a definition of “harassment” which provides that it can be constituted, among other things, by:333

  • making contact with a person, whether by telephone, correspondence or in any other way;
  • giving offensive material to a person or leaving it where it will be brought to the attention of that person; or
  • acting in any other way that causes the person to fear for their safety.

Probably most instances of cyber-bullying in which the person is targeted directly would already be held to be covered by the first of the above paragraphs. There are District Court decisions supporting that interpretation.334 But expanding the provision to cover harassment in cyberspace expressly has two advantages: (a) it removes any shadow of doubt; and (b) the message is clearly apparent to all who use the legislation. We think the ambit of the first paragraph should be clarified by inserting “electronic communication” after “telephone, correspondence”. More important, we think, is to expand the second paragraph to make it clear that “leaving [offensive material] where it will be brought to the attention of that person” includes placing offensive messages on websites or in the social media. Harm can be done, and is done, by offensive messages which are sent not directly to the subject but to others (sometimes very many others) in circumstances where it is highly likely they will come to the notice of the subject. The second of the above paragraphs is not presently well adapted to that situation.

The second is section 112 of the Telecommunications Act 2001. It prohibits the use of a “telephone device” to convey disturbing, annoying or irritating messages. There is currently some doubt as to where the boundaries of “telephone device” lie. As currently defined, it is “any terminal device capable of being used for transmitting or receiving any communications over a network designed for the transmission of voice frequency communication”. Whether this applies to any communication via computer is not absolutely clear, particularly since the advent of wireless.

We think that should be clarified. If communication via computer is to be covered, consideration will need to be given to the interface of this provision with the Harassment Act 1997. But there is merit in so providing: to do so would mean there would be a clear route for prosecuting deeply disturbing conduct of the kind referred to in paragraph 7.58 above.

By way of comparison, the United Kingdom Communications Act 2003 makes it an offence to send by means of a “public electronic communications network” a message that is “grossly offensive or of an indecent, obscene or menacing character”.335

The third is the Human Rights Act 1993. Currently section 61 renders it unlawful:

(a) to publish or distribute written matter which is threatening, abusive, or insulting, or to broadcast by means of radio or television words which are threatening, abusive, or insulting; or

(b) to use in any public place as defined in section 2(1) of the Summary Offences Act 1981, or within the hearing of persons in any such public place, or at any meeting to which the public are invited or have access, words which are threatening, abusive, or insulting; or

(c) to use in any place words which are threatening, abusive, or insulting if the person using the words knew or ought to have known that the words were reasonably likely to be published in a newspaper, magazine, or periodical or broadcast by means of radio or television, -

being matter or words likely to excite hostility against or bring into contempt any group of persons in or who may be coming to New Zealand on the ground of the colour, race, or ethnic or national origins of that group of persons. 

Probably paragraph (a) extends to internet publication: “publish or distribute” is certainly wide enough to do so, and “written matter” is defined to include “signs and visible representations”. But the section is drafted with an eye to an earlier time, and an argument could be made that, when read in the context of paragraphs (b) and (c), the whole provision is confined to the traditional print and broadcast media. It could, we think, be usefully updated.


Sections 62 and 63 deal with sexual and racial harassment respectively. They render it unlawful to use language or visual material which is offensive to a person and is either repeated, or of such a significant nature, that it has a detrimental effect on that person in respect of a number of specified areas, including:336

(g) access to places, vehicles and facilities;

(h) access to goods and services; …

(j) education.

We have no doubt that harassment of the kinds with which the sections deal can deter individuals, particularly young people, from using the social media, and thus limit their interaction with their peers. That is perhaps covered by paragraph (h), but not clearly and unarguably so. We wonder whether the matter is significant enough to justify adding a further paragraph: “(k) participation in fora for the exchange of ideas and information”.

The common law is less problematic in this regard. Its inherent flexibility is well able to deal with all forms of communication. In relation to contempt of court there is no doubt that dissemination of prejudicial material via any vehicle of communication can constitute a contempt. In New Zealand, proceedings have been commenced in relation to publication of allegedly prejudicial material on a website. The Solicitor-General has, on occasion, warned that if material is not removed from a website, contempt proceedings might ensue. In both the United Kingdom and New Zealand, concerns have been expressed about jurors in criminal cases doing their own research on the internet to discover material which might be relevant to the case before them. There is United Kingdom precedent for proceedings against a juror for prejudicial conduct involving use of the social media.337

Gaps in the law

Should there be new provisions to fill gaps in the law which have been revealed by the advent of the new media: where, in other words, there is no provision that obviously covers conduct of a harmful kind?

It is clear from the above discussion that damage can be caused by the impersonation of people, particularly in the social media: for example, by false Facebook pages. Sometimes such conduct may amount to harassment. It will often be defamatory, and may sometimes involve a breach of privacy, but that gives rise only to a civil remedy. If the impersonation is for financial gain it will usually constitute fraud or obtaining by deception. It is also an offence to impersonate members of certain occupations: for instance, a police officer or a pilot.

But there may still be cases in which real hurt is caused by falsely impersonating another person, yet no other provision obviously covers what has happened. We have considered whether there should be an offence of maliciously impersonating another person. Such a provision would not be without precedent in this country: it used to be an offence to “impersonate another person by means of a radio station”.338 That unqualified prohibition could doubtless serve to protect a number of interests, both of the person concerned and of the public in general, but it has since been repealed. We believe a more targeted provision of the kind outlined above is at least worthy of consideration.

Careful consideration would need to be given to the elements of any such offence. Malice would be an essential ingredient. Impersonation for the sake of humour is one thing; impersonation with the intention of causing harm is another altogether. The harms against which the proposed offence might be directed might include intimidation, and fear for safety. We seek views on this, and in particular whether the existing offences can in fact deal with the mischief we have identified.

A second possible gap in the law relates to the publication of intimate photographs. We are aware of several cases where, on the breakup of a relationship, one former partner has posted intimate pictures of the other on the internet. We asked in our review of the law of privacy whether, if intimate pictures are taken with a person’s consent, it should be an offence to publish them on the internet without that person’s consent.339 We concluded there that it should not, but the matter may be worthy of further consideration. There is at least one case where a judge resorted to section 124 of the Crimes Act 1961 to enter a conviction and impose a sentence of imprisonment in a case of this kind.340 That section, whose origins are over 100 years old, is arguably not well adapted to the purpose: it expressly deals with “distributing to the public any indecent model or object”. There perhaps needs to be a more direct route to the end result.

Thirdly, we noted above three possible gaps in the Privacy Act. The “news media” are not bound by the information privacy principles; it is not an infringement of privacy if the information published was collected or held for domestic purposes; and it is not an infringement of privacy to publish material already publicly available. In its review of the Privacy Act the Law Commission recommended amendments to fill all these gaps.341 It recommended that “news media” be defined to encompass only media which subscribe to a code of ethics and are subject to a complaints body: the large range of communicators in cyberspace who do not meet those conditions would then be clearly covered by the Privacy Act. It also recommended that the domestic purpose exception should not protect the offensive use of material, and that, likewise, the “publicly available” exception should not be available to exempt offensive and unreasonable use. We continue to support those recommendations.

Finally, incitement to commit a crime is an offence even if the crime is not committed.342 Yet incitement to commit suicide is not an offence unless the person actually does so, or attempts to do so.343 Given the distress such incitements may cause in themselves, let alone the possibly devastating outcome, we think there is a strong case for making incitement to suicide of itself criminal. Attempted suicide is no longer a criminal offence, but we believe that is no reason for decriminalising incitement.

Enforcement issues

There is a question of who is legally responsible when the law is broken by a publication, whether on the internet or elsewhere. In other words, who is the appropriate defendant? Is it the media company; the editor of the publication (if there is one); the host of the website on which the item appears; the individual who generated the content; or even the internet service provider (ISP) through whose channel the item reaches the viewer? We have said above that the answer may well be different for the purposes of different parts of the law. We do not propose to attempt to formulate any general principles in this Issues Paper.

However the position of ISPs merits special consideration. In relation to defamation, the issue needs clarification. Defamation is a tort of absolute liability: anyone who has contributed to the dissemination of defamatory material is, in theory, liable for it whether or not they knew of its defamatory nature. Before statute remedied the position, even printers were liable for what was published: their liability was based simply on the fact that they had been involved in the dissemination process, even though they had played no part in creating the material. The question is how this rule affects ISPs. They can probably take advantage of section 21 of the Defamation Act 1992, which provides a defence of “innocent dissemination”:

21. Innocent dissemination – In any proceedings for defamation against any person who has published the matter that is the subject of the proceedings solely in the capacity of, or as the employee or agent of, a processor or a distributor, it is a defence if that person alleges and proves-

(a) That that person did not know that the matter contained the material that is alleged to be defamatory; and

(b) That that person did not know that the matter was of a character likely to contain material of a defamatory nature; and

(c) That that person’s lack of knowledge was not due to any negligence on that person’s part.


In reports in 1999 and 2000 the Law Commission recommended that any doubt be put to rest, and that there be a statutory amendment to the effect that:344

the definition of “distributor” in section 2(1) of the Act be amended to include explicit reference to an ISP.

We continue to support this amendment.

The question of the liability of ISPs in other legal contexts is similarly unresolved, but the more reasonable view would seem to be that an ISP is a conduit for the publications of others rather than a publisher itself. Mr Justice Eady has described an ISP as a “passive medium of communication”.345 It would be too punitive to make an ISP strictly liable for material posted by third parties. If liability is to attach at all, it should be only in relation to infringing material of which the ISP has been given clear and specific notice, and in relation to which it declines to take such remedial action as is within its power. The Law Commission so recommended in 1999 and 2000.346 In this Issues Paper we do not further discuss the question of imposing on ISPs general legal liability of a kind involving criminal sanctions or civil liability in damages. But, as we shall expand on shortly, we do wonder whether there might be merit in a provision enabling a court or tribunal to issue “take-down orders” against ISPs and website hosts irrespective of their legal responsibility for the content.


As we saw above, the modes of enforcement available against the mainstream media are also available, and have been used, against communicators using the new media. The law governs all, and the consequences of breaching it should be the same for all. Taking legal action against a few infringers is not without value. It can contain the spread of the objectionable content, and can serve to keep infringing material out of the mainstream media where it would receive its greatest exposure. Particularly damaging communications which constitute a criminal offence sometimes do merit the time and resource it takes to track down perpetrators and prosecute them.

Yet we have noted in the previous chapter the very real difficulties of enforcing the law against the new media. We have considered whether the law relating to enforcement requires amendment or expansion in the new environment. Realism dictates that there are limits to what one can effectively achieve.

Yet, as we have demonstrated, breaches of the law by the newer means of communication can cause significant psychological harm, and even worse, to victims. What victims usually want is simply that the damaging communications about them stop, or be removed from the internet. We wish to explore the possibility of a swift and reasonably effective remedy for such persons. We seek views on whether there should be a statutory power in the courts to make take-down orders, or cease-and-desist orders, and whether such a power should be available against avenues of communication such as ISPs or website hosts, even though they are not themselves parties to the wrongdoing.

We understand that website hosts and ISPs are usually prepared to do this now if they are requested to and if they are satisfied that the law has been broken. As we have seen, many responsible website hosts have systems in place which allow members of the public to complain about postings, and which result in offending material being taken down, so far as it is possible to do so. It is not going a great deal further to empower a court to order such a take-down in appropriate cases if other avenues have failed, and if the hoped-for co-operation has not been forthcoming.

We emphasise that we are not proposing that ISPs should be legally responsible for anything which they transmit in the sense that they could incur sanctions. Nor are we suggesting that website hosts should be liable to greater legal responsibility than they were before. The proposal is simply that they could be subject to a court order to remove infringing material.

Such a power would need to be carefully circumscribed and qualified. It should be exercised only where there has been a breach of the law; where that breach has caused, or is likely to cause, demonstrable distress, humiliation or harm; and after proper consideration of whether the order is a justified limitation on the Bill of Rights Act guarantee of freedom of expression. It should be exercised only when other remedial measures have failed or are impracticable. It is not envisaged that the Crown would often have access to this remedy: to do so it would have to demonstrate that continued publication of the item damages the public interest. An order should require the ISP or website host to take reasonable steps to remove the item. This last qualification is necessary because an ISP’s powers are limited: it cannot itself remove a single posting from a website, although it can block access to the website as a whole. It may, however, be able to exert some influence over website hosts, and be able to persuade them to remove particular offending items.

The order would extend to any servers hosting such content over which the ISP has access or control, whether directly or by contractual arrangement. Moreover, an ISP or website host cannot guarantee that an item will be completely removed for all time: removal of the original item will not necessarily expunge it from other sites to which it may have migrated, and it may still remain in caches or internet archives. However, such take-down (or cease-and-desist) orders can achieve much, and we think they deserve consideration as a general remedy. We anticipate that they would seldom be needed. We seek views on this matter.

330 Coroners Act 2006, s 73.

331 Films, Videos, and Publications Classification Act 1993, s 123(4).

332 Prostitution Reform Act 2003, s 11.

333 Harassment Act 1997, ss 3 and 4.

334 See for example Rodriguez v Osborne (DC Auckland, CIV 2009-004-28, 3 March 2009); and “Copycat ‘friend’ guilty of harassment” The Daily Post (26 November 2011). See also Judge David Harvey “Cyberstalking and Internet Harassment: What the Law Can Do” NetSafe < >.

335 Communications Act 2003 (UK), s 127.

336 Human Rights Act 1993, ss 62(3) and 63(2).

337 Jason Deans “Facebook juror jailed for eight months” The Guardian (United Kingdom, 16 June 2011).

338 Radio Regulations 1970, r 49. The maximum fine was $100.

339 Law Commission Invasion of Privacy: Penalties and Remedies (NZLC R113, 2010) at [8.13].

340 “Naked photo sends jilted lover to jail” (13 November 2010) < >.

341 Law Commission Review of the Privacy Act 1993 (NZLC R123, 2011) at [2.99] and [4.73]–[4.75].

342 Crimes Act 1961, s 311(2).

343 Ibid, s 179(a).

344 Law Commission Electronic Commerce Part 2 (NZLC R58, 1999) at [262]–[270]; Law Commission Electronic Commerce Part 3 (NZLC R68, 2000) at [80].

345 Bunt v Tilley [2006] 3 All ER 336 at [36]–[37].

346 Above n 344: Electronic Commerce Part 2 at [260]; Electronic Commerce Part 3 at [76]–[79].