Chapter 7 - Free speech abuses: quantifying the harms and assessing the remedies

Non-legal remedies

Online solutions to online problems

As discussed in chapter 2 of this paper, the read/write architecture of the web facilitates some unique forms of self-regulation. User-generated feedback and comment are hardwired into the design of many websites, including blogs and self-publishing platforms like YouTube. In addition, many major internet entities have adopted sophisticated automated systems for dealing with offensive or harmful publishing.

The amount of data shared on these leading internet properties is mind-boggling: each month Facebook’s 750 million users exchange 30 billion pieces of content. Trade Me, a minnow by Facebook standards but with an even greater penetration in the New Zealand market, has 2.8 million members who publish, on average, 25,000 new posts on Trade Me message boards each day.320 At any given time there may be as many as 550 million words contained on these message boards.

It is of course not humanly possible (nor, arguably, desirable) to preview all user-generated content before it is published. Often the existence and content of offending posts will be unknown to the publishing website. Instead, sites like Trade Me, Facebook, Twitter and YouTube rely on a combination of contractual “terms and conditions” and community moderation to establish and maintain civil behaviour on their sites.321

Typically, users must register and agree to comply with the site’s terms and conditions before being able to make use of the site. By default, other users of the site become the agents for policing compliance with these rules and have access to various tools allowing them to “vote” content off and “report” content which transgresses the rules in some way. Facebook told us its “robust reporting infrastructure leverages the 750 million people who use our site to monitor and report offensive or potentially dangerous content.”322


Trade Me

As an online auction site, Trade Me’s priority is protecting and enhancing the security of the site and designing systems which can detect fraud and other illegitimate activities with the potential to undermine customers’ trust in the site. However, Trade Me is also committed to ensuring its community message boards provide a safe environment for discussion and that those using the message boards comply with both internal and legal publishing standards.

Over and above its systems of community moderation and reporting, Trade Me has devoted considerable resources to customising software programmes that allow it to filter for content that is offensive, including breaches of current court orders relating to suppressed evidence or names. Trade Me’s legal team has fostered strong relationships with key private and public sector organisations, including the Police, banks and the telecommunications sector, allowing it to respond swiftly when required.
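By way of illustration only, the simplest form of such a filter matches incoming posts against a maintained list of suppressed terms and queues any match for human review. The sketch below is hypothetical: the term list, function name and matching rules are our assumptions, not a description of Trade Me’s actual software, which would also need to cope with deliberate misspellings and other evasion.

```python
import re

# Hypothetical list of suppressed terms (for example, a name subject to a
# current court suppression order). In practice such a list would be
# maintained by the site's legal team and updated as orders are made or lapse.
SUPPRESSED_TERMS = ["example suppressed name"]

def flag_post(text: str) -> list:
    """Return any suppressed terms found in a post."""
    matches = []
    for term in SUPPRESSED_TERMS:
        # Match the words of the term at word boundaries, case-insensitively,
        # allowing any run of whitespace between words.
        pattern = r"\b" + r"\s+".join(re.escape(w) for w in term.split()) + r"\b"
        if re.search(pattern, text, flags=re.IGNORECASE):
            matches.append(term)
    return matches

# A matching post would typically be queued for moderator review rather than
# removed automatically, given the legal judgement involved.
if flag_post("a thread naming Example Suppressed Name before the trial"):
    print("post queued for moderator review")
```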

Trade Me’s physical presence in New Zealand and its strong engagement with both its users and the regulators contrasts with the remoteness and inaccessibility of the other online entities which dominate in New Zealand.

Facebook and Google

Like Trade Me, Facebook and the Google-owned site YouTube require users to agree to detailed terms and conditions (referred to by Facebook as its Statement of Rights and Responsibilities) before posting content on their sites. In addition, Facebook and YouTube have devised simple sets of “community standards” not dissimilar to the types of publishing codes developed by broadcasters. These community standards provide a straightforward guide to civil behaviour online and cover many of the same types of harmful publishing discussed earlier in this chapter: threats; hate speech; graphic violence; impersonation; privacy; and bullying and harassment.

The sites provide a variety of tools for reporting content that is considered offensive or that breaches community standards. Facebook’s “Help Centre” also provides detailed advice on how to manage privacy settings and a variety of self-help tools for responding to abusive or intrusive behaviour by other users.

Members wishing to report abuses can file reports using automated templates which provide a menu of options to describe the nature of the problem.

Like Google, Facebook reserves the right to unilaterally remove content that violates its terms and conditions. Facebook told us that its automated systems removed “thousands of pieces daily” that were in violation of its policies.

To help understand how effective these self-regulatory systems are in preventing and remedying harms such as cyber-bullying, harassment and online impersonation we asked Google and Facebook to provide us with specific information about the extent to which New Zealand users were reporting abuses and the frequency with which such reports resulted in content being removed from sites and/or users having their accounts terminated. We were also interested to know more about the nature of the formal requests Google and Facebook were receiving from police, lawyers or representatives of the government for content to be taken down or for the release of the account details of specific users.

Unfortunately, we were told that neither company currently captured the sort of information with respect to problem reporting by individual users that would allow them to provide us with the detailed country-specific analysis we were seeking.323

However, since 2009 Google has published six-monthly “Transparency Reports” documenting the number of government or court initiated requests it has received either to remove content associated with one of its products or services or to reveal information about a user.324 The reports are searchable by country of origin and, for those countries which have generated more than ten requests during the reporting period, these are broken down to show: the originator of the request (court orders or police/executive); the Google product involved (Street View; Google Search; Blogger; Gmail etc.); and the nature of the problem (hate speech; privacy and security; impersonation; defamation etc.).

In addition to this tabulated data, Google provides details of requests dealt with during each reporting period to illustrate country trends and the principles which underpin their decisions whether or not to comply with requests to remove content.325

The examples illustrate how Google, as a global entity, is applying what are effectively editorial judgments, weighing the competing claims of free speech against the specific cultural, legal and political interests of hundreds of different sovereign states whose citizens make use of its global products and social spaces.

Google’s transparency reports for New Zealand between 2009 and 2010 recorded fewer than ten government/court requests for content to be removed in each of the six-monthly reporting periods. Of these, 83% were complied with in the first period and 100% in the second. Because the number of requests fell below ten, Google provided no further detail about the nature of the contested content. Google registered no requests for user information from New Zealand police or the courts over the 18 months. (In comparison, Google received 345 data requests from official Australian sources, 81% of which were either fully or partially complied with.)

Facebook does not report publicly on its interface with law enforcement and other legal or governmental bodies with respect to take down requests or information about users. However they were able to tell us that in the first half of 2011 they had received 21 requests from New Zealand law enforcement agencies. Two of these involved “emergency matters that required urgent handling”.326

Discussion

Without any empirical evidence about the use of reporting tools and the speed and frequency with which content is removed as a result of user reports it is impossible to gauge the effectiveness of community moderation.327

Arguably, the exponential growth of these two publishing platforms, YouTube and Facebook, is of itself clear evidence that, for the vast majority of users, the environments are considered safe and civil.

Facebook told us its “culture of authentic identity”, signified by the use of true names and identities, has made Facebook “less attractive to bad actors who generally do not like to use their real names or email addresses.”

People are less likely to engage in negative, dangerous, or harassing behaviour online when their friends can see their names, their posts and the information they share. Our real name culture creates accountability and deters bad behaviour since people using Facebook understand that their actions create a record of their behaviour.

However, given the unprecedented volume of data being published on sites like YouTube, Twitter and Facebook, it is of course inevitable that a percentage of users will abuse the technology, and it is evident from the feedback we received from NetSafe that some of these abuses will go unchecked. In response to the problem of people impersonating others online or setting up fake profile pages, Facebook told us it had recently introduced a new automated system for auditing accounts reported to be fake or an impersonation.328
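Footnote 328 describes the broad shape of this audit process: the reported account is checked; the holder is challenged to supply evidence of identity (such as registering and confirming a mobile phone number); and the account is disabled if no evidence arrives within a specified time. The sketch below illustrates such a challenge-and-confirm workflow in general terms only; the class, function names and seven-day window are our assumptions for the purposes of illustration, not a description of Facebook’s actual system.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical window allowed for responding to an identity challenge.
CHALLENGE_WINDOW = timedelta(days=7)

class ReportedAccount:
    """Minimal model of an account reported as fake or an impersonation."""

    def __init__(self, account_id: str):
        self.account_id = account_id
        self.challenged_at: Optional[datetime] = None
        self.disabled = False

    def issue_challenge(self) -> None:
        # Step 1: ask the account holder for evidence of identity,
        # e.g. registering and confirming a mobile phone number.
        self.challenged_at = datetime.now()

    def confirm_identity(self) -> None:
        # Step 2a: evidence supplied in time; the challenge is cleared.
        self.challenged_at = None

    def expire_if_unanswered(self, now: datetime) -> None:
        # Step 2b: no evidence within the window; the account is disabled.
        if self.challenged_at and now - self.challenged_at > CHALLENGE_WINDOW:
            self.disabled = True

account = ReportedAccount("reported-profile")
account.issue_challenge()
account.expire_if_unanswered(datetime.now() + timedelta(days=8))
print(account.disabled)  # True: no response within the specified time
```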

Facebook also told us it worked with law enforcement agencies from around the world and disclosed information “pursuant to subpoenas, court orders, or other requests” where the company had a “good faith belief that the response was required by law.”

The police told us that they are actively working with offshore internet entities to “develop and establish procedures to enable information to be sought and obtained in a timely and consistent basis.”329

Similarly, police tell us that when the goal is to have offensive content taken down from a website, rather than to initiate a prosecution, some social media sites will respond after receiving a formal request on police letterhead.

This appears to be consistent with the response we received from Facebook describing how it responds to legal requests to remove content. Facebook first reviews the content against its own Statement of Rights and Responsibilities; if a violation is found, it removes the content and, if appropriate, disables the account of the person responsible. Occasionally, if content is found to be illegal in the jurisdiction from which the complaint originated but not in breach of Facebook’s terms, the organisation may prevent the content being shown to people in that country but not remove it from the site.

With respect to requests for account details or other user information from law enforcement agencies, Facebook said it may disclose such details “pursuant to subpoenas, court orders, or other requests (including criminal and civil matters) if we have a good faith belief that the response is required by law.”

The fact that a request may come from another jurisdiction was not necessarily an impediment to Facebook cooperating provided “we have a good faith belief that the response is required by law under the local laws, apply to people from that jurisdiction, and are consistent with generally accepted international standards.”

320 Email from Christine Lanham to Law Commission regarding Trade Me traffic (31 May 2011).

321 An analysis of the use of community moderation and reporting tools by Trade Me analysts showed that in a four-week period over October and November 2011 the organisation received 2,500 reports from members about posts on the message boards (each report/complaint may have referred to one or more posts). These reports resulted in Trade Me removing 700 individual posts and 840 full threads, or 25,390 posts in total (largely notice and takedown). In addition, the Trade Me message board community voted off 6,643 posts.

322 Letter from Sarah Wynn-Williams, Facebook Manager, Public Policy, to the Law Commission regarding Facebook’s regulatory mechanisms (18 August 2011).

323 Facebook provided us with the following explanation: “Facebook does not flag user reports on a per country basis and many users do not tell us what country they are in. As we do not organize or collate the data on a per country basis, to provide this information we would have to review all requests received to try and determine which were from New Zealand. As this is a hugely expensive and time consuming task, I am afraid that we are not in a position to provide the information.”
Google assured us that its reporting and response system was “robust and fast moving” but, like Facebook, it “did not have statistics or data that would be useful to share” regarding the level of user-generated complaints from New Zealand and the instances where content has been removed.

324 Google Transparency Report 2011 <www.google.com/transparencyreport/>. These reports do not include child abuse material (which is automatically removed from Google sites), nor do they include copyright-related removals associated with YouTube.

325 Google’s website includes the following examples: “July – December 2010 Italy: We received a request from the Central Police in Italy for removal of a YouTube video that criticized Prime Minister Silvio Berlusconi and simulated his assassination with a gun at the end of the video. We removed the video for violating YouTube’s Community Guidelines.

Jan – June 2010 China: During the period that Google’s joint venture operated google.cn, its search results were subject to censorship pursuant to requests from government agencies responsible for internet regulation. Chinese officials consider censorship demands to be state secrets, so we cannot disclose any information about content removal requests for the two reporting periods from July 2009 to June 2010. YouTube was inaccessible in China during this reporting period.

Argentina: The courts in Argentina issued two orders that sought the removal of every search result mentioning a particular individual’s name in association with a certain category of content. The number of search results at issue well exceeds 100,000 results. We did not attempt to approximate the number of individual items of content that might be encompassed by those two court orders. Google appealed those orders. The number of user data requests we received increased by 37% compared to the previous reporting period.

July – December 2009 Argentina: A federal prosecutor claimed that information about him and his wife (a federal judge) had been posted for analysis on two political blogs and asked that we remove them. We removed a portion of one of the blogs for revealing private information about the judge, but otherwise did not comply because it did not violate our internal policies.

Germany: A substantial number of German removal requests resulted from court orders that related to defamation in search results. Approximately 11% of the German removal requests are related to pro-Nazi content or content advocating denial of the Holocaust, both of which are illegal under German law.”

326 Email from Sarah Wynn-Williams to Law Commission regarding the breakdown of complaints (2 September 2011).

327 The exception, as noted earlier, is Trade Me, which was able to provide an analysis of the use of community reporting as a regulatory tool with respect to the oversight of message boards.

328 If this revealed a problem with the account, the person would be sent a message requiring them to provide evidence that they were in fact the account holder, such as registering and confirming a mobile phone number. If they failed to do this within a specified time the account would be disabled. In addition, Facebook alerted us to the fact that its Help Centre allows people attempting to have an imposter account disabled to get access to information related to those accounts without submitting a subpoena or other formal legal processes.

329 Letter from Jackie McCullough, Police Legal Adviser, to the Law Commission (2 September 2011).

Police note that many of the large online entities are incorporated in New Zealand and NC3 has had success in serving a warrant on the registered company address in New Zealand and its parent US entity by email. Police are also working to develop a letter of agreement with Yahoo which would provide alternative protocols allowing it to access subscription/registration details and IP/activity logs in some circumstances.