Chapter 2 - Online media in New Zealand
Moderation & control online
Self-regulation and communal accountability
Although most of these different online publishers and publishing channels are not currently accountable to a regulatory body, it is a mistake to assume there is no form of control or accountability associated with them.46
The degree of control and accountability online varies considerably from site to site and organisation to organisation. To a large extent these differences reflect the nature and function of the websites themselves. Some sites, such as Trade Me’s community message boards, are set up to operate like open public forums; others, like some personal blog sites, operate more like private spaces into which the public are invited. Mainstream media organisations often sit somewhere between these two models.
As discussed in the introductory chapter, internet culture is defined by a powerful commitment to free speech values and an equally powerful aversion to censorship. This, combined with the anonymity frequently associated with digital communication, has helped create an environment characterised by robust debate and a reliance on bottom-up, or user, control.
However, alongside the cyber norms which influence how individuals conduct themselves online, there is a wide range of tools used to moderate and control online behaviour. Organisations like Trade Me, whose business model depends on public trust, have invested millions of dollars in developing their own sophisticated software designed to protect themselves and their customers from a range of illegal and unethical behaviour.
Most large corporate online operators, including social and mainstream media organisations, have detailed terms and conditions which users must accept as a condition of use. Most also require users to register and provide email addresses and other identifying information as part of the “sign-up” process.
Over and above these baseline standards, website operators may adopt varying levels of day-to-day control over their sites. The risk-averse may pre-moderate user comments before publication. Others rely on community or user moderation, whereby participants can vote to have content removed. This system may be backed up by a discretion to ban persistently abusive users and take down offensive content.
In the following discussion we look briefly at the types of moderation employed by the spectrum of publishers surveyed in this chapter.
Moderation of news sites
Both Stuff and nzherald require users to agree to terms and conditions before posting comments on their websites. All comments are moderated before publication. Stuff does not require users to register before commenting, but does require users to provide a name and email address and to tick a box indicating they accept the terms and conditions. Registering provides access to more services and content and requires use of a password.
The nzherald website requires users to register the first time they submit a comment. Users must provide their name, email address and a password. The site’s terms and conditions are set out in clear and accessible language.47
Both Stuff and nzherald allow users to comment under pseudonyms but nzherald suggests it would prefer users to make comments using their full names, consistent with the approach taken to letters published in the newspaper’s opinion pages.
It should be noted that commenting on news stories or other content is entirely at the discretion of the website operators. The decision whether or not to allow comment might relate to the nature of the content (for example, a report of an on-going trial typically would not be open for public discussion on a news website), or to more practical considerations such as the resources available to pre-vet comments submitted for publication.
Both TV3 and TVNZ moderate comments pre-publication, and comments will not appear until they have been approved. These sites also reserve the right to bar users should they believe a user is posting abusive content.
None of the online newspapers provides a clear avenue for lodging complaints about content, although we were told readers simply use the email address and newsroom details on the sites’ “contact” pages to complain about content. The Broadcasting Standards Authority (BSA), however, requires entities under its jurisdiction to provide a clear avenue for the laying of a complaint. Both TVNZ and TV3 have clear links on their homepages for users to make a complaint regarding the content of a television programme. In respect of radio stations, while Radio New Zealand provides a link to a formal complaints page, Newstalk ZB does not provide a clear avenue for complaint other than the ability to contact the editorial team.
Web-only news and blog sites
Although not covered by the jurisdiction of a regulatory body, most websites included in this survey provide clear statements about the nature of their site and what might be described as the publishing philosophy and standards which apply to content carried on the site. Typically these will reinforce the basic legal constraints that apply to all speech in New Zealand, such as the need to avoid defaming others.
However, beyond these basic requirements the standards and practices of web publishers vary widely. The generalist and specialist news sites such as Scoop and interest.co.nz and the business wire services are clearly positioned at the professional end of the publishing spectrum and their standards and practices reflect that. The news site Voxy monitors all submitted material pre-publication and has ultimate editorial control over the blogs. Voxy will delete or edit comments from a blog thread if necessary but will seldom delete or edit a blog post. Some sites, on the other hand, do not moderate or exercise any editorial control over articles submitted by registered users pre-publication. Any editorial control and monitoring is retrospective and is heavily reliant on community monitoring. Our research found contributed content on at least one news site which clearly breached suppression orders.
Within the blogosphere there are widely divergent approaches to moderation and control – some of which are dictated by the sophistication or otherwise of the underlying technology supporting the blog. As discussed earlier, blogs cover a multitude of topics and target markets, and the level of professionalism and editorial control exerted by the authors and blogging communities varies accordingly.
Some blogs provide comprehensive statements setting out rules or expectations for commenting. Kiwiblog, for example, sets out a demerit points-type system whereby users accumulate points and, once they reach a certain number, will be blocked from posting. The editor of this blog retains the right to edit or delete any comments.
The more professional bloggers tend to have clear transparency policies and open disclosure statements about their personal and professional affiliations, interests and history.49
Whilst blogs are primarily used to share information and express opinions, this subjectivity does not imply a disregard for factual accuracy. On the contrary, the very nature of the blogging user-interaction model means writers are constantly open to challenge on matters of both fact and opinion. The blogging community as a whole moderates not only the content and tone of the comment threads but also the content of the blogs themselves. This self-regulation is apparent from a perusal of the message board but also occurs more privately via email between users and the author of the blog.
However, blog sites are not democratic public forums: as noted earlier, they are often highly partisan, and blog posts and commentary can be highly offensive and personally abusive. Ultimately, the blog administrator/author sets both the tone and the threshold for abusive speech. A person who has been denigrated or who has been the subject of a false allegation on a blog site is entirely dependent on the blog’s administrator for any redress or corrective measures.
We discuss the tools that have been developed within self-publishing and social media platforms such as Facebook in chapter 7 of this paper in the context of legal and non-legal remedies for harms arising from speech abuses.
They are, of course, all accountable to the law. In addition, as we will discuss in chapter 5, the Press Council has extended its jurisdiction to the news websites associated with the newspaper industry. However, much of the content on broadcasters’ news sites is unregulated because it falls within the exclusions contained in the Broadcasting Act 1993.