
Submission to the Committee on Justice and Equality on issues of online harassment, harmful communications and related offences.

CONSDONE Introduction

My name is Éibhear Ó hAnluain and I have been working in software engineering and IT systems design since 1994. I thank you for the opportunity to submit this contribution to your analysis of issues of online harassment, harmful communications and related offences.

In this submission I am seeking to highlight three core concerns:

  • The distinction between user behaviours and online services.
  • The nature of online services from the perspective of small operators.
  • The potential damage legislative measures can have on small operators of online services.

However, prior to addressing these topics, I would like to raise some ambiguities that this wider discussion will encounter.

  • The first is the meaning of the term self-regulation. If a measure of self-regulation to address these concerns is acceptable, then it would be necessary, for public-perception reasons, to be clear on what that means. Self-regulation could mean that each service operator manages matters of harassment and harmful communications according to their own rules and processes; this is currently how the large service providers we're most familiar with operate. However, self-regulation may also refer to regulation by a non-governmental, industry-funded body, following the model of the press council or the advertising standards authority, where rules and processes are agreed among the operators as a set of standards, and where decisions on compliance with these are made by this body. In order to avoid this ambiguity, I will use the term "self-moderation" to refer to the former, and the term "industry-regulation" for the latter.

CONSTODO The distinction between user behaviours and online services

The internet is awash with online harassment and harmful communications, and responsible governments and legislators have been trying for decades to do something about it.

However, it is no less true in this sphere than in any other that "doing something" is not necessarily enough to address the problem: doing only the right thing is what's required.

In the first of his six Laws of Technology1, Dr. Melvin Kranzberg observed that "Technology is neither good nor bad; nor is it neutral." The temptation for observers is to decide that the extent of online harassment, abuse and harmful communications is due to the existence of online services, and that if only we could force the services to implement their technologies in a particular manner, all the problems would be solved.

For instance, the United States of America recently enacted a law known as the "Stop Enabling Sex Traffickers Act", or FOSTA-SESTA2. This was a law to show that the U.S. Congress was doing something to stop sex-trafficking. The law made it an offence for online services to "knowingly [assist], [support], or [facilitate]" sex-trafficking, and it removed from online services speech-related protections that had previously been provided under another U.S. law, Section 230 of the Communications Decency Act.

Accounts show, however, that doing this was not effective, and has been counter-productive. As expected, a number of websites that had been used to legally advertise sex services in the United States either shut down that section of their service (e.g. Craigslist's "Erotic Services") or shut down completely[fn:SOSTAEffect:Lura Chamberlain, FOSTA: A Hostile Law with a Human Cost, 87 Fordham L. Rev. 2171 (2019). Available at: https://ir.lawnet.fordham.edu/flr/vol87/iss5/13]. If the goal of the law was to protect sex workers, and women in particular, it has had the opposite effect:

  • Independent sex workers now have no online means to promote their services, forcing them to turn to pimps for this.
  • There has been a notable increase in the number of sex workers who have gone missing.
  • Some sex-workers have died by suicide.
  • Assault and rape of sex workers has increased, and many fear that murders of sex workers are also increasing3.
  • Sex workers have no means to learn about their potential clients before the client knows about them: where they could previously vet people who made contact with them over these services before identifying themselves, this is no longer possible, which dramatically increases their risk.
  • Ironically, one of the negative effects of FOSTA-SESTA is that it is now much harder for the police to investigate rapes, assaults and murders of sex workers than before, because a critical trail of evidence (the online communications between offenders and sex workers) can no longer be laid4. This is not only because the websites are no longer there, but also because, when they were (e.g. Backpage), they assisted the police investigating these crimes against sex workers; such advertising was legal then, and now that it is not, the police won't get the help from web sites when they need it5.

These outcomes were predicted by advocates for sex workers and for free speech, but legislators failed to heed the warnings. In fact, when considering this law, legislators were presented with statistics that were false and that misrepresented the landscape prior to enacting FOSTA-SESTA6.

I highlight this law in particular because it is both recent (early 2018) and relevant. However, it is not alone, and as we look at pending legislation coming to us both domestically and from the EU, it is hard not to see the same failures repeating:

  • Pat Rabbitte's and Lorraine Higgins' bills, since withdrawn
  • The EU Terrorism Content Directive…
  • The new Copyright Directive…

CONSTODO Self-hosting

For the purposes of this submission, self-hosting is where an individual or small group has opted to provide their own internet services, making use either of computer capacity provided by a hosting provider (for example, Blacknight.com, Amazon AWS) or of computer technology that they maintain themselves.

The services that the self-hoster exposes, then, are either developed specifically by the self-hoster or run using software that the self-hoster has installed.

The self-hoster also takes responsibility for the quality of the service that they provide, including ensuring that it is kept running and updates are applied appropriately, and so on.

This submission is primarily concerned with self-hosting as a hobby and self-hosting engaged in by charity, non-governmental or community organisations. Self-hosting for commercial purposes is also a valid use-case, but regulation of self-hosting has a more direct implication for the former use-cases, as the effect of poor regulation on vulnerable people would be more direct, immediate and serious.

Why self-host?

There is a myriad of reasons for choosing to host one's own service. Some examples might be:

CONSTODO How accessible is self-hosting.

CONSTODO Technical protocols

In a previous, similar submission7, I provide an outline of the challenges before someone who wants to set up their own services. Those challenges are few, and they are small. In summary, the reasons for this are:

  • The internet is a mechanism for computers to find each other and then to share information with each other. That mechanism is defined in a set of publicly-available documents describing the relevant protocols.
  • Due to the maturity and age of these protocols, the software needed to use them is now abundant and trivially easy to obtain, install and run on a computer. Such software is also very easy for moderately-skilled software engineers to develop.
  • Neither the protocols that define the internet, nor the software that implements them, regards any computer as superior or inferior to any other. For this reason, there is no cost or capacity barrier to cross in order to run an internet service: if you have the software and an internet connection, you can expose such a service (a minimal sketch follows this list).
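
As a minimal illustration of how low this barrier is, the sketch below uses nothing more than Python's standard library to expose a small web service from a home computer. It is illustrative only: the port number and the page content are arbitrary choices made for the example.

#+BEGIN_SRC python
# A minimal, illustrative web service using only Python's standard library.
# Anyone with a computer and an internet connection can run something like
# this; the port and the page content are arbitrary choices for the example.
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every request with a small, static HTML page.
        body = b"<html><body><p>A self-hosted service.</p></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Listen on all interfaces, on an arbitrary unprivileged port.
    HTTPServer(("0.0.0.0", 8080), HelloHandler).serve_forever()
#+END_SRC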

Clear examples from the past of how the accessibility of the internet technologies has benefited the world include the following:

  • The Linux operating system kernel began life in 1991 as a college project: Linus Torvalds wanted to write a computer operating system that was accessible to all. Linux-based operating systems now form the basis of a significant proportion of internet-connected computing devices globally8 (including 73% of smartphones and tablet computers, somewhere between 36% and 66% of internet-facing server computers, and 100% of supercomputers).
  • The Apache web server started development when a group of 8 software developers wanted to add functionality to one of the original web server software packages, NCSA httpd. The Apache web server now powers 43.6% of all web sites9[https://w3techs.com/technologies/overview/web_server/all; incidentally, the no. 2 on that page, with nearly 42% of websites, is nginx, which also started out as a project by an individual who wanted to solve a particular problem].
  • The Firefox web browser was initiated by three software developers who wanted to make a light-weight browser based on the Mozilla code-base. At the height of its popularity, Firefox was used for 34% of web-page requests, despite not coming installed by default on any computer or mobile device. However, its real impact is that it was instrumental in breaking the monopoly that Microsoft's Internet Explorer had held since the late '90s, resulting in a far richer and more secure web.

When we look at the main services that society is currently struggling with, we need to consider the following historical facts:

  • Facebook started out as a crude service, developed in Mark Zuckerberg's room in Harvard University, to allow users (men, of course) to rate the women in the university in terms of "hotness".
  • Google started out as a search engine called "Backrub". Development initially took place in a garage.
  • eBay was originally an auction service tagged onto the personal website of its founder, Pierre Omidyar.
  • LinkedIn was initially developed in Reid Hoffman's apartment in 2003.
  • Shutterstock, a leading provider of stock images, was founded by a photographer, Jon Oringer, who developed the service as a means to make available 30,000 of his own photographs.

The ease with which internet technology can be accessed has given rise to the explosion of services that connect people, and people with businesses.

It is critical to note that many of these technologies and services started out with an individual or small group developing an idea and showing it can work prior to receiving the large capital investments that result in their current dominance.

CONSTODO The nature of self-hosting

All of the above technologies and services can be considered truly disruptive. In their respective domains, their arrival resulted in dramatic improvements in internet technologies and services.

However, there are many alternatives to the systems that we are familiar with, all developed by individuals, or small, enthusiastic teams:

  • Twitter isn't the only micro-blogging service: there are also GNU Social, Pleroma and Mastodon.
  • An alternative to Facebook is diaspora*
  • Nextcloud and Owncloud are examples of alternatives to Dropbox.

In the cases of all these alternatives, users can sign up for accounts on "instances" operated by third-party providers, or users can set up their own instances and operate the services themselves.

Many of these services can federate with others. Federation in this context means that there can be multiple instances of a service, communicating with each other over a defined protocol, sharing updates and posts. For users, federation means that they can interact with other users who aren't necessarily on the same node or instance. For administrators of instances, federation means that they can configure their instances according to their own preferences, rather than having to abide by the rules or technical implementation of someone else.
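
To make the idea of federation concrete, the following sketch shows two hypothetical instances exchanging a post as a small JSON document over HTTP. This is not the actual wire format used by any of the services named above, which each have their own defined protocols; the instance addresses, endpoint and field names here are invented purely for illustration.

#+BEGIN_SRC python
# Illustrative sketch of federation: one hypothetical instance delivers a
# locally-created post to a peer instance as JSON over HTTP. This is NOT the
# real protocol of Mastodon, Pleroma, GNU Social or diaspora*; the URLs,
# endpoint and field names are invented for the example.
import json
import urllib.request

def push_post_to_peer(peer_inbox_url, author, text):
    """Deliver a locally-created post to a peer instance's inbox."""
    post = {"type": "Note", "author": author, "content": text}
    request = urllib.request.Request(
        peer_inbox_url,
        data=json.dumps(post).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # the peer acknowledges receipt

# A user on instance A posts; instance A then delivers the post to
# instance B's inbox, so users on B can see and reply to it without
# needing an account on A.
# push_post_to_peer("https://instance-b.example/inbox",
#                   "alice@instance-a.example", "Hello from instance A")
#+END_SRC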

CONSTODO Real examples of self-hosting

I host a number of such services:

  • Éibhear/Gibiris is my blog site.
  • Social Gibiris is a micro-blogging service that is federated with others using the AtomPub technology. Thus, Social Gibiris is federated with many other instances of GNU Social, Mastodon and Pleroma.
  • git.gibiris.org is a source-code sharing site that I use to make publicly available some of the software that I develop for myself.
  • news.gibiris.org is a news-aggregation service that allows me to gather all the news sources of interest to me into one location, which I can then access from wherever I am.
  • cloud.gibiris.org is a file-sharing platform that I use with my family when we are collaborating on projects (e.g. school projects, home improvement projects, etc.)
  • matrix.gibiris.org is an instant-messaging system which I set up for the purposes of communicating with my family and close friends.

Most of these services are hosted on a computer within my home. Three of these services provide information to the general public, and the other three are accessible only to those who set up accounts.

Two of those services, git.gibiris.org and Social Gibiris, can process or post user-uploaded information.

CONSTODO Regulation of self-hosted services

While it is attractive to create regulations to manage the large, profit-making organisations, it is imperative that such regulations do not harm those who wish to create and operate their own services.

A regulation that applies liability to a service provider for someone else's behaviour is a regulation that can be adhered to only by organisations with large amounts of money to hand. For example, if the regulation were to apply liability on me for a posting made by someone else (and somewhere else: these are federated services, after all) that appears on one of the services that I run, I would have to shut them down, as I would not be able to put in place the necessary infrastructure to mitigate my liability[fn:copyrightDirective:This assumes that my services aren't forced to shut down by the new EU Copyright Directive anyway]. Given that my services are intended to provide a positive benefit to me, my family members and my friends, and that I have no desire to facilitate harmful behaviour on these services, a law forcing me to shut these services down benefits no one.

Similarly, a regulation that demands responses from services on the assumption that the service will be manned at all times requires individuals who are self-hosting their services to be available at all times (i.e. to be able to respond regardless of whether they are asleep, overseas on a family holiday, too ill to respond, and so on).

This submission comes from this perspective: that small operators should not be unduly harmed by regulations. The likelihood of this harm coming to pass is greater when such small operators are not even considered during the development of the regulations. If regulations have the effect10 of harming such small operators, the result will not just be the loss of these services, but also the loss of the opportunity to make the Web richer, through the imposition of artificial barriers to entry. Such regulations will inhibit the development of ideas that pop into the heads of individuals, who would otherwise realise them with nothing more than a computer connected to the internet.

CONSTODO Abuse

All systems that seek to protect people from harmful or other objectionable material (e.g. copyright infringement, terrorism propaganda, etc.) have, to date, been amenable to abuse. For example, in a recent court filing, Google claimed that 99.97% of the infringement notices it received from a single party in January 2017 were bogus11:

A significant portion of the recent increases in DMCA submission volumes for Google Search stem from notices that appear to be duplicative, unnecessary, or mistaken. As we explained at the San Francisco Roundtable, a substantial number of takedown requests submitted to Google are for URLs that have never been in our search index, and therefore could never have appeared in our search results. For example, in January 2017, the most prolific submitter submitted notices that Google honored for 16,457,433 URLs. But on further inspection, 16,450,129 (99.97%) of those URLs were not in our search index in the first place. Nor is this problem limited to one submitter: in total, 99.95% of all URLs processed from our Trusted Copyright Removal Program in January 2017 were not in our index.

Aside from the percentage of URLs that were not in Google's index, the fact that a single entity would submit more than 16 million URLs for delisting in a single month is staggering, and demonstrates a compelling point: there is no downside for a bad-faith actor seeking to take advantage of a system for suppressing information[fn:downside:The law being used in this specific case is the US Digital Millennium Copyright Act. It contains a provision that claims of copyright ownership on the part of the claimant are to be made under penalty of perjury. However, that provision is very weak, and seems not to be a deterrent for a determined agent: https://torrentfreak.com/warner-bros-our-false-dmca-takedowns-are-not-a-crime-131115].

More recently, there is the story of abuse of the GDPR's Right to be Forgotten. An individual from Europe made a claim in 2014, under the original Right to be Forgotten, to have stories related to him excluded from Google searches for him. This seemed to have been an acceptable usage under those rules. However, that this claim was made and processed seems also to be a matter of public interest, and some stories were written in the online press regarding it. Subsequently, the same individual used the Right to be Forgotten to have these stories excluded from Google searches.

This cat-and-mouse game continues, to the extent that the individual is (successfully) requiring Google to remove stories about his use of the GDPR's Right to be Forgotten, even stories that cover only his Right to be Forgotten claims and make no reference at all to the original (objected-to) story12. This is clearly an abuse of the law: Google risks serious sanction from data protection authorities if it decides to invoke the "… exercising the right of freedom of expression and information" exception13 and it is determined that the exception didn't apply, while the claimant suffers no sanction if it is determined that the exception does apply.

In systems that facilitate censorship[fn:censorship:While seeking to achieve a valuable and socially important goal, this legislation, and all others of its nature, facilitates censorship: as a society, we should not be so squeamish about admitting this.], it is important to do more than merely assert that service providers should protect the fundamental rights to expression and information. In a regime where sending an e-mail costs nearly nothing, where a service risks serious penalties (up to and including having to shut down) and where a claimant suffers nothing for abusive claims, the regime is guaranteed to be abused.

CONSTODO Harmful content definition

This submission will not offer any suggestions as to what should be considered "harmful content". However, I am of the belief that if "harmful content" is not narrowly defined, the system will allow bad actors to abuse it; in a context where there is no risk in making claims, and great risk in not taking down the reported postings, loose definitions will only make it easier for non-harmful content to be removed.

CONSTODO Answers to consultation questions

CONSTODO Strand 1 National Legislative Proposal

CONSTODO Question 1 Systems

  • The legislation should state in an unequivocal manner that it is not the role of web services to adjudicate on whether specific user-uploaded pieces (text, videos, sound recordings, etc.) can be considered harmful under the legislation. The law should make it clear that where there is a controversy on this matter, the courts will make such rulings.
  • As regards a system, this submission would support a notice-counternotice-and-appeal approach. Such an approach affords the service operator and the accused party an opportunity to address the complaint before the complained-of material is taken offline. The following should be incorporated (an illustrative sketch of such a workflow follows this list):

    1. A notice to a service operator that a user-uploaded piece is harmful should contain the following information:

      • That the notice is being raised under this legislation (citing section, if relevant).
      • That the person raising the notice is the harmed party, or that the person raising the notice is doing so on behalf, and at the request, of the harmed party. Where the harmed party doesn't want to be identified, the notice could be raised on their behalf by someone else. However, totally anonymous notifications under this legislation should not be permitted, as it would not be possible to determine the good-faith nature of the notice.
      • The specific (narrowly tailored) definition of "harmful content" in the legislation that is being reported.
    2. A notice to the user who uploaded the complained-of material regarding the complaint. This will allow the user to remove the material, or to challenge the complaint. An opportunity to challenge a complaint is necessary to forestall invalid complaints that seek to have information removed that would not be considered harmful under the legislation.
    3. Adequate time periods for both the complainant and the posting user to respond.
    4. Where responses aren't forthcoming…

      • … if the posting user doesn't respond to the initial complaint, the posting is to be taken down
      • … if the complaining user doesn't respond to the posting user's response, the posting is left up.
    5. Within a reasonable and defined period of time, the service provider will assess the initial complaint, the counter-notice, and the complainant's response to the counter-notice, and will decide whether to take the material down or to leave it up, citing clear reasons for the decision.
    6. Where either party is not happy with the decision, they can appeal to the regulator, and if the regulator contradicts the service operator's decision, the service operator must abide by the regulator's ruling. In its consideration of the ruling, the regulator must be required to consider the rights of both parties.
  • Responsibilities and obligations of the service provider must relate to the size of the service. For example, it is not reasonable to require a response within a fixed amount of time from services that would not have someone available within that time. Self-hosters or small, single-location operations would not be able to respond within an hour if the complaint is made at 4am!
  • This system should not apply to complaints that a posting violates the service's terms and conditions. If the complaint isn't explicitly made under this legislation, it should not fall within the regulator's remit. Under no circumstances should merely violating a service's terms and conditions (or "community standards") be considered an offence under this legislation.
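
The sketch below is a rough, illustrative model of the notice-counternotice-and-appeal workflow described in the list above. The field names, validity checks and default outcomes are my own assumptions about how such a scheme could be modelled, not anything prescribed by existing or proposed legislation.

#+BEGIN_SRC python
# Rough sketch of the notice / counter-notice / appeal workflow described
# above. The field names, checks and default outcomes are illustrative
# assumptions, not anything prescribed by existing or proposed legislation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Complaint:
    cites_legislation: bool            # the notice is raised under the Act
    complainant_identified: bool       # no fully anonymous notices
    harm_definition: Optional[str]     # which narrow definition is invoked
    counter_notice: Optional[str] = None     # posting user's response, if any
    complainant_reply: Optional[str] = None  # complainant's reply to that

def default_outcome(complaint: Complaint) -> str:
    """Return the default outcome of a complaint under the scheme above."""
    # 1. A notice must be validly raised under the legislation.
    if not (complaint.cites_legislation
            and complaint.complainant_identified
            and complaint.harm_definition):
        return "reject: notice not validly raised under the legislation"
    # 2. If the posting user does not respond, the posting is taken down.
    if complaint.counter_notice is None:
        return "take down: no response from the posting user"
    # 3. If the complainant does not respond to the counter-notice,
    #    the posting is left up.
    if complaint.complainant_reply is None:
        return "leave up: complainant did not pursue the complaint"
    # 4. Otherwise the operator must weigh both sides, decide with clear
    #    reasons, and either party may appeal to the regulator.
    return "operator decision required, with reasons; appealable to regulator"
#+END_SRC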

CONSTODO Question 2 Statutory tests

The service operator should be protected from liability under the rules if the service can show the following:

  • That the initial complaint was responded to appropriately and within a reasonable amount of time.
  • That an appeal was responded to within a reasonable amount of time.
  • That the poster and complainant were each offered an opportunity to respond
  • That the responses, and any appeals, were given due consideration.
  • That the final decision (whether to keep the post up or pull it down) was well-reasoned, and considered the context in which the post was made.
  • That, where appeals have been made to the regulator, the service responds to any order from the regulator in a reasonable manner and within a reasonable amount of time.

CONSTODO Question 3 Which platforms to be considered in scope

This submission is concerned to ensure that it is not assumed that all affected platforms will be large, for-profit organisations with scores, hundreds or thousands of staff acting as moderators of user uploads.

The legislation should also not assume that platforms that want to deal with user uploads should be of a particular nature, or size.

To make either assumption would be to chill lawful interactions between internet-connected parties, and would further entrench the larger players on the internet.

CONSTODO Question 4 Definitions

  • Please see my introductory comments on this matter.
  • Definitions of "harmful content" must aim to be as narrow as possible, in order to avoid the potential of the legislation being used to target political speech.
  • In respect of serious cyberbullying, content should be considered harmful under the legislation not just when it targets a child. It should be considered cyberbullying, and harmful, even when the target is an adult, if the complaint states that they are being harmed or fear harm should the complained-of behaviour continue.

    • In the event that the target of the cyberbullying is a public figure, there should be an additional burden on the complainant to state that the behaviour represents real intent to cause harm, and is more than people with opposing political or social views "shooting their mouths off".

CONSTODO Strand 2 Video Sharing Platform Services

CONSTODO Question 5 What are video-sharing services

This submission is not providing an answer to this question.

CONSTODO Question 6 Relationship between Regulator and VSPS

This submission is not providing an answer to this question.

CONSTODO Question 7 Review by Regulator

The regulator should require online services to publish the following reports regarding complaints made under this legislation (an illustrative sketch of such a report follows the list):

  • Number of complaints, broken down by nature of complaint
  • Number of complaints that were appealed to the service, broken down by nature of complaint and basis of appeal
  • Number of appeals upheld, broken down by reason for appeal
  • Number of appeals rejected, broken down by reason for rejection.
  • Number of complaints/appeals that were appealed further to the regulator.
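
A structure along the following lines could capture these figures in a published report; it is shown purely for illustration, and the field names, categories and figures are invented.

#+BEGIN_SRC python
# Illustrative shape for a published transparency report of the kind listed
# above. The field names, categories and figures are invented for the
# example; they are not a prescribed format.
example_report = {
    "period": "2021-Q1",
    "complaints_by_nature": {"serious cyberbullying": 12, "other": 5},
    "appeals_to_service": {
        "serious cyberbullying": {"upheld": 4, "rejected": 6},
        "other": {"upheld": 1, "rejected": 2},
    },
    "appeals_escalated_to_regulator": 2,
}
#+END_SRC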

CONSTODO Strands 3 & 4 Audiovisual Media Services

CONSTODO Question 8 "Content" rules for television broadcasting and on-demand services

This submission is not providing an answer to this question.

CONSTODO Question 9 Funding

RTÉ and its subsidiary services should continue to be funded by the government, either through the licence fee, general taxation or a mixture of both. RTÉ's editorial independence should be re-iterated in this law (and strengthened, if required, specifically to assure independence from the editorial demands of advertisers). It should be anticipated that RTÉ will eventually broadcast only over the internet, and that it will be both a live-streaming service (e.g. providing programming in a manner similar to its current broadcast schedule) and an on-demand service.

Funding of services other than RTÉ should only be considered for services operated by non-profit organisations such as trusts or charities, and such funding should also come with an assurance of editorial independence for the recipients.

CONSTODO Strands 1 & 2 European & International Context

CONSTODO Question 10 Freedoms

  • Core to the consideration of the legislation is that everyone posting to services is presumed to be innocent of an offence, and their postings should also be presumed not to offend the law.
  • Accusations of harm must be tested to determine if they are being made to suppress legal speech. This is particularly true where the person making the allegation is a public figure, or is representing a public figure.
  • Where a service applies or is required to apply sanctions on users who repeatedly post harmful information, similar sanctions should also be applied to users who repeatedly make false accusations under the law.

CONSTODO Question 11 Limited liability

Any regulatory system that makes service providers liable for what their users say on those services will result in one or a combination of the following effects:

  1. Services will stop permitting users to make postings.
  2. Where the value of a service is wholly, or in part, that it allows its users to post to it, the service may have to shut down.
  3. Services will be sued or prosecuted for the actions of their users regardless of the effort and good faith they put into "moderating" what is posted on their service, a concept that is borderline ludicrous in the off-line world. This would be analogous to a car manufacturer being liable for the consequences of car occupants not wearing their seat-belts.

There must be clarity in the regulations that a service is protected as long as it acts in a good-faith manner to deal with postings made by users that are determined to have been illegal. This reflects Ireland's obligations under various trade agreements to grant safe-harbour protections to internet services.

The regulation must also protect platforms and their users against bad-faith accusations of harm, particularly from public figures. If it is easier to use an accusation of "harmful content" than to claim libel, public figures will use that facility to suppress information they would like not to be known.

CONSTODO Strands 1-4 Regulatory Structures

CONSTODO Question 12 Regulatory structure

This submission is not providing an answer to this question.

CONSTODO Question 13 Funding of regulatory structure

This submission is not providing an answer to this question.

CONSTODO Strands 1 & 2 Sanctions/Powers

CONSTODO Question 14 Functions and powers

This submission is not providing an answer to this question.

CONSTODO Question 15 Sanctions

The following should be taken into account when considering sanctions on platforms

  • The nature of the operation

    • Large, global, profit-based private organisations providing services to the general population. (examples include YouTube, Facebook, Twitter).
    • Smaller, local, profit-based private organisations providing services to the general population, focused on the region (examples might include boards.ie, everymum.ie, etc.)
    • Small, non-profit forums set up by locally-based and -focused organisations such as soccer clubs, or school parents' associations[fn:useFacebook:There is often the temptation to advise these organisations to use larger platforms like Facebook or Google. Some organisations may not want to avail of those services, and the reasons for this are not relevant. What's important is that deciding not to use these platforms is valid, and these decisions should be protected and encouraged, not inhibited.]
    • Individuals, hosting their own platforms.
  • The good-faith efforts of the operation to respond to accusations of harm.
  • The capacity of the service to respond: smaller operations cannot afford 24-hour monitoring to respond to such accusations, and the law should not require it. Such services should also be protected from bad-faith actors seeking to interfere with their operations by overwhelming them with false accusations of harm that need to be dealt with.
  • Who the accuser is: public figures should be prevented from using accusations of "harmful content" to remove information that is merely critical of them or their behaviour.

CONSTODO Question 16 Thresholds

This submission is not providing an answer to this question.


10: unintended, one hopes

13: GDPR, Article 17, paragraph 3(a)