#+latex_class: article
#+latex_class_options:
#+latex_header:
#+latex_header_extra:
#+description:
#+keywords:
#+subtitle:
#+latex_compiler: pdflatex
#+date: \today

#+TITLE: Submission to the Committee on Justice and Equality on /issues of online harassment, harmful communications and related offences/.
#+AUTHOR: Éibhear Ó hAnluain
#+EMAIL: eibhear.geo@gmail.com, 086 8565 666, http://www.gibiris.org/eo-blog/
#+OPTIONS: ^:{} toc:2 H:4 num:t author:t email:nil
#+TODO: CONSTODO CONSNOTES | CONSDONE CONSDONTDO
* Planning :noexport:
** Resources :noexport:
- [[https://www.oireachtas.ie/en/committees/submissions/20190808-committee-on-justice-and-equality-calls-for-submissions-on-online-harassment-and-harmful-communications/][Call for submissions]]
- [[https://data.oireachtas.ie/ie/oireachtas/committee/dail/32/joint_committee_on_justice_and_equality/other/2019/2019-08-08_possible-issues-for-address_en.pdf][List of possible issues]]
** Web page (captured [2019-08-24 Sat])
The Committee on Justice and Equality invites written submissions
from stakeholders and interested parties on the issues of online
harassment, harmful communications and related offences.

[[https://www.oireachtas.ie/en/committees/32/justice-and-equality/][Go to the Committee on Justice and Equality]]

A separate document can be obtained at the following [[https://data.oireachtas.ie/ie/oireachtas/committee/dail/32/joint_committee_on_justice_and_equality/other/2019/2019-08-08_possible-issues-for-address_en.pdf][link]] outlining
in detail the list of possible issues the Committee wishes to
address under this broad heading.

In summary, the Committee wishes to examine the nature and extent
of the problems of online ‘cyber bullying’, harassment, stalking,
‘revenge porn’ and other forms of harmful communications;
international best practice for addressing these problems; whether
self-regulation of harmful communications by social media companies
is the best approach; or whether new laws are necessary to cover
such activities, and what forms such laws should take.

The Committee will commence a series of public hearings on these
issues on 2 October 2019, with a view to publishing a report.
*** Closing date

The closing date for receipt of submissions is Friday, 20
September 2019.
*** How to send your submission

Please email an electronic document (PDF/MS Word or equivalent) to
[[mailto:onlineharassment@oireachtas.ie][onlineharassment@oireachtas.ie]].

Please do not send hard copies of your submission; hard copies
will not be accepted.

Please do not send your submission to individual Committee
members. The Clerk will ensure all members receive copies of all
submissions.
*** What to include in your submission

Your submission should comprise your submission document and a
separate covering letter. This allows the Committee to publish
your submission without your contact details.
**** In the covering letter, please include:

- your name, postal address, email address and contact telephone
  number
- if the submission is on behalf of an organisation, your
  position in the organisation
- a brief outline of why you are making the submission
**** In the submission document please include:

- a brief introduction, for example, explaining your area of
  expertise
- any factual information that you have to offer from which the
  Committee might be able to draw conclusions, or which could be
  put to other parties for their reactions
- links to any publications you refer to; there is no need to
  send such publications as attachments
- any recommendations to the Committee; be as specific as
  possible and summarise your recommendations at the end of the
  document
- if your document is more than 10 pages long, an executive
  summary of the main points made in the submission

Please remember to be concise and to number your pages.
*** Important information

Submissions sent to any other email address may not be accepted.

Anonymous submissions cannot be accepted and will be rejected.

Petitions and form letters may not be accepted or published.

Submissions made to a Committee may be published as received,
either as part of a Committee report or separately, if the
Committee decides to do so.
*** Making a submission is a public process

The Committee is not obliged to accept your document once it has
been submitted, nor is it obliged to publish any or all of the
submission if it has been accepted. However, the operations of a
parliament are a public process, and you should be aware that any
submission made to a Committee, including your identity, may be
published either as part of a Committee report, or separately, if
the Committee decides to do so.
*** Need more guidance?

If you would like more detailed guidance, please read the guidance
note Making Submissions and Presentations to Oireachtas Committees
below or contact the clerk to the Committee.
*** Clerk to the Committee

Damian Byrne

[[mailto:damian.byrne@oireachtas.ie][damian.byrne@oireachtas.ie]]

(01) 618 3899

Committee on Justice and Equality
Committee Secretariat,
Houses of the Oireachtas Service,
Kildare Street,
Dublin 2,
D02 XR20
** Possible issues document (captured [2019-08-24 Sat])
*** Online Harassment, harmful communications and related offences
*Possible issues for address*
**** Definition of communication in legislation
1. There are currently significant gaps in legislation with
   regard to harassment and newer, more modern forms of
   communication. Is there a need to expand the definition of
   ‘communications’ to include online and digital communications
   tools such as WhatsApp, Facebook, Snapchat, etc. when
   addressing crimes of bullying or harassment?
   - Éibhear comment :: (/Address in introduction/) It is
     necessary not to assume that the services that operate
     today will be the primary services in 5 or 10 years'
     time.
2. What lessons can be learned from models used in other
   jurisdictions such as the UK, New Zealand, Australia and other
   European countries where legislation is now in place to
   address these issues? How do we establish an appropriate model
   without compromising free speech?
   - Éibhear comment :: (/Address in answer to specific
     questions/) UK: duty of care is inappropriate. New
     Zealand: allowing a committee to decide what is
     objectionable, thus restricting not only those who want
     to share objectionable material, but also those who want
     to report on it.
3. How do we ensure that any legislation that is enacted is
   flexible enough to keep up with changing and advancing
   technologies, new apps and other online forums, including the
   more familiar social media sites?
   - Éibhear's comments :: (/Core concern/) This is the meat
     of the submission.
**** Harassment, stalking & other forms of online abuse
4. [@4] Online harassment can take the form of non-consensual
   taking and distribution of intimate images or videos,
   otherwise known as ‘revenge porn’, ‘upskirting’,
   ‘downblousing’ and other forms of sharing of imagery online
   without consent. What approaches are taken to addressing these
   issues in other jurisdictions?
   - Éibhear's comment :: No answer for this
5. New offences are proposed to cover these issues in Deputy
   Brendan Howlin’s Private Members Bill on this subject. Is the
   creation of new offences necessary, or is existing legislation
   sufficient? Should other forms of image-sharing issues -- such
   as exposure -- also be addressed?
   - Éibhear's comment :: No answer for this
6. What kind of oversight and regulation of online service
   providers is possible/used in other jurisdictions? Currently,
   online providers are self regulated. Is a proactive,
   self-regulating approach from online companies to activities
   such as revenge porn and other forms of harassment preferable
   to the creation of more laws?
   - Éibhear's comment :: It is important to know the difference
     between "self-regulation" and pro-active moderation. These
     services moderate according to their own rules; there is no
     industry authority like the press council or the advertising
     standards authority, which are self-regulatory regimes.
7. Is any data provided by online service providers in relation
   to the reporting or prevalence of activities such as
   upskirting/revenge porn/cyberbullying and other online
   behaviour that can be used to develop and draft future
   legislation?
   - Éibhear's comment :: No data. However, services should be
     encouraged to issue reports on their moderation efforts.
8. To what extent are An Garda Síochána equipped and resourced to
   deal with the issues arising from harmful online
   communications such as these?
   - Éibhear's comment :: No answer for this
9. Should ‘cyberstalking’ be treated as a separate offence to
   online harassment? What constitutes stalking-type behaviour
   online? Is there a need to legislate specifically for this
   activity?
   - Éibhear's comment :: No answer for this
10. Based on the findings of other jurisdictions such as the
    UK, An Garda Síochána will require consistent training in
    order to maintain an appropriate level of knowledge with
    regard to indictable behaviours. Are resources available for
    this?
    - Éibhear's comment :: No answer for this
11. Fake accounts/troll accounts used to harass or target others
    with abuse -- what measures can be taken in relation to these
    without affecting freedom of expression?
    - Éibhear's comment :: Care needs to be taken to
      manage/prevent false identification of accounts as 'fake'
      or 'troll'.
12. Do other jurisdictions have statutory measures to protect
    victim identities in cases of online harassment being
    released online post-hearings, etc.?
    - Éibhear's comment :: No answer for this
**** Harmful online behaviour and young people
13. [@13] How do we most appropriately regulate social media
    platforms to prevent cyberbullying and inappropriate sharing
    of personal images?
    - Éibhear's comment :: Take details from earlier submission.
14. For young people who participate in such online behaviour as
    consensual image sharing, how can it be ensured that they are
    not inadvertently criminalised when legislation is enacted?
    What safeguards can be put in place?
    - Éibhear's comment :: No answer for this
15. Deputy Brendan Howlin’s Private Members Bill provides that
    those under 17 should not be fined/imprisoned but put into
    relevant education or supports. Would these supports be part
    of the same educational supports offered to all young
    people/schools or would they be a separate entity? Are
    current supports being utilised? Are there sufficient
    resources to provide for such a provision when enacted?
    - Éibhear's comment :: No answer for this
** CONSTODO Éibhear's initial thoughts :noexport:
1. Focus on two core principles:
   - Self-hosting -- individuals and groups hosting their own
     services should not be neglected.
   - Abuse -- services and systems should be protected from abuse.
*** Facts
- Tweets per day: 500,000,000
  + Active accounts: 326,000,000
  + Reported accounts: 11,000,257 (July - December 2018)
    * Abuse, child sexual exploitation, hateful conduct, private
      information, sensitive media, violent threats
    * => ~60,000 accounts reported/day
    * => ~0.02% of active accounts reported per day (checked below)
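
A quick sanity check of the last two figures (a sketch only; it uses
the numbers quoted above and takes July--December 2018 as 184 days):

#+BEGIN_SRC python
# Back-of-the-envelope check of the Twitter figures quoted above.
active_accounts = 326_000_000      # active accounts
reported_accounts = 11_000_257     # accounts reported, July-December 2018
days = 184                         # days in July-December

reported_per_day = reported_accounts / days
share_reported_per_day = reported_per_day / active_accounts

print(f"{reported_per_day:,.0f} accounts reported per day")    # ~59,784
print(f"{share_reported_per_day:.4%} of active accounts/day")  # ~0.0183%
#+END_SRC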
* CONSDONTDO The distinction between user behaviours and online services :noexport:

The internet is awash with online harassment and harmful
communications, and responsible governments and legislators have
been trying for decades to do something about it.

However, it's no less true in this sphere than in any other that
"doing something" is not necessarily enough to address the problem:
doing only the /right thing/ is what's required.

In the first of his 6 Laws of
Technology[fn:6laws:https://en.wikipedia.org/wiki/Melvin_Kranzberg#Kranzberg's_laws_of_technology],
Dr. Melvin Kranzberg determined that "Technology is neither good nor
bad; nor is it neutral." The temptation for observers is to decide
that the extent of online harassment, abuse and harmful
communications is due to the existence of online services, and
that if only we could force the services to implement their
technologies in a particular manner, all the problems would be
solved.

For instance, the United States of America recently enacted a law
known as the "Stop Enabling Sex Traffickers Act", or
/FOSTA-SESTA/[fn:FOSTA-SESTA:https://en.wikipedia.org/wiki/Stop_Enabling_Sex_Traffickers_Act]. This
was a law to show that the U.S. Congress was doing something to stop
sex-trafficking. The law made it an offence for online services to
"knowingly [assist], [support], or [facilitate]" sex-trafficking,
and it removed from online services speech-related protections that
had previously been provided under another U.S. law, Section 230 of
the Communications Decency Act.

Accounts show, however, that doing *this* was not effective, and has
been counter-productive. As expected, a number of websites that had
been used to legally advertise sex services in the United States
either shut down that section of their service (e.g. Craigslist's
"Erotic Services"), or shut down completely[fn:SOSTAEffect:Lura
Chamberlain, FOSTA: A Hostile Law with a Human Cost, 87 Fordham
L. Rev. 2171 (2019). Available at:
https://ir.lawnet.fordham.edu/flr/vol87/iss5/13]. If the goal of the
law was to protect sex workers, and women in particular, it has had
the opposite effect:
- Independent sex workers now have no online means to promote their
  services, forcing them to turn to pimps for this.
- There has been a notable increase in the number of sex workers who
  have gone missing.
- Some sex workers have died by suicide.
- Assault and rape of sex workers have increased, and many fear that
  murders of sex workers are also
  increasing[fn:craigslisthomicide:http://www.econlib.org/archives/2018/01/craigslist_redu.html].
- Sex workers have no means to learn about their potential clients
  prior to the client knowing about them: where they could once vet
  people who made contact with them over these services before
  identifying themselves, this is no longer possible, which
  dramatically increases their risk.
- Ironically, one of the negative effects of /FOSTA-SESTA/ is that
  it is now much harder for the police to investigate rapes,
  assaults and murders of sex workers than before, because a
  critical trail of evidence -- the online communications between
  offenders and sex workers -- is no longer
  laid[fn:FOSTAPolice:https://www.techdirt.com/articles/20180705/01033440176/more-police-admitting-that-fosta-sesta-has-made-it-much-more-difficult-to-catch-pimps-traffickers.shtml]. This
  is not only because the websites are no longer there, but also
  because, when they were (e.g. Backpage), they assisted the police
  investigating these crimes against sex workers; advertising was
  legal then, and now that it's not, the police won't get the help
  from websites when they need
  it[fn:SESTAPolice:https://www.techdirt.com/articles/20180509/13450339810/police-realizing-that-sesta-fosta-made-their-jobs-harder-sex-traffickers-realizing-made-their-job-easier.shtml].

This was predicted, but only by advocates for sex workers and for
free speech, and legislators failed to heed their warnings. In fact,
when considering this law, legislators were presented with statistics
that were false, and misrepresented the landscape prior to enacting
/FOSTA-SESTA/[fn:buzzfeed:https://www.buzzfeednews.com/article/jennyheineman/sex-trafficking-myths-sesta-fosta].

I highlight this law in particular because it is both recent
(early 2018) and relevant. However, it's not alone, and as we look at
pending legislation coming to us both domestically and from the EU,
it's hard not to see the same failures repeating:
- Pat Rabbitte's and Lorraine Higgins' bills, since withdrawn
- The EU Terrorism Content Directive...
- The new Copyright Directive...
* CONSDONE Introduction

My name is Éibhear Ó hAnluain and I have been working in software
engineering and IT systems design since 1994. I thank you for the
opportunity to submit this contribution to your analysis of /issues
of online harassment, harmful communications and related offences/.

In this submission I am seeking to highlight three core concerns:
- The distinction between user behaviours and online services.
- The nature of online services from the perspective of small
  operators.
- The potential damage legislative measures can have on small
  operators of online services.

However, prior to addressing these topics, I would like to raise an
ambiguity that this wider discussion will encounter: the meaning of
the term /self-regulation/. If a measure of self-regulation to
address these concerns is acceptable, then it would be necessary,
for public-perception reasons, to be clear on what that
means. /Self-regulation/ could mean a regime in which each service
operator manages matters of harassment and harmful communications
according to its own rules and processes. This is currently how the
large service providers we're most familiar with
operate. However, /self-regulation/ may also refer to regulation by
a non-governmental, industry-funded body, following the model of
the press council or the advertising standards authority, where
rules and processes are agreed among the operators as a set of
standards, and where decisions on compliance with these are made by
that body.

In order to avoid this ambiguity, I will use the term
"self-moderation" to refer to the former, and the term
"industry-regulation" for the latter.
* CONSDONE Self-hosting
** CONSDONE Self-hosting
For the purposes of this submission, /self-hosting/ is where an
individual or small group has opted to provide their own internet
services, making use either of computer capacity provided by an ISP
(for example, Blacknight.com, Amazon AWS) or of computer technology
that they maintain themselves.

The services that the self-hoster exposes, then, either are
developed specifically by the self-hoster or run software that the
self-hoster has installed.

The self-hoster also takes responsibility for the quality of the
service that they provide, including ensuring that it is kept
running, that updates are applied appropriately, and so on.

This submission is primarily concerned with self-hosting as a
hobby and self-hosting engaged in by charity, non-governmental or
community organisations. Self-hosting for commercial purposes is
also a valid use-case, but regulation bears more directly on the
former use-cases, as the effect of poor regulation on vulnerable
people would be more direct, immediate and serious.

*** CONSTODO Real examples of self-hosting
I host a number of such services:
- [[http://www.gibiris.org/eo-blog][/Éibhear/Gibiris/]] is my blog site.
- [[https://social.gibiris.org/][/Social Gibiris/]] is a micro-blogging service that is federated
  with others using the /AtomPub/ technology. Thus, /Social
  Gibiris/ is federated with many other instances of /GNU Social/,
  /Mastodon/ and /Pleroma/. This network of federated services,
  operated by individuals, groups and businesses, all connected
  together as peers, facilitates connections and communication in a
  way that is very little different to Twitter.
- [[https://git.gibiris.org/][/git.gibiris.org/]] is a source-code sharing site that I use to
  make publicly available some of the software that I develop for
  myself.
- [[https://news.gibiris.org/][/news.gibiris.org/]] is a news-aggregation service that allows me
  to gather all the news sources of interest to me into one
  location, which I can then access from wherever I am.
- [[https://cloud.gibiris.org/nextcloud][/cloud.gibiris.org/]] is a file-sharing platform that I use with
  my family when we are collaborating on projects (e.g. school
  projects, home improvement projects, etc.)
- [[https://matrix.gibiris.org/][/matrix.gibiris.org/]] is an instant-messaging system which I set
  up for the purposes of communicating with my family and close
  friends.

Most of these services are hosted on a computer within my home.
Three of these services provide information to the general public,
and the other three are accessible only to those who set up
accounts.

Two of those services, /git.gibiris.org/ and /Social Gibiris/, can
process or post user-uploaded information.
*** CONSTODO Why self-host?

There is a myriad of reasons for choosing to host one's own
service. Some examples might be:
- Privacy -- until recently, many services were careless with, or
  outright abusive of, users' privacy.
- Tracking -- the extent to which organisations, particularly
  those whose business models are based on advertising, facilitate
  the tracking of internet users as they conduct their business or
  personal activities across the internet.
- Autonomy -- being able to configure one's own service is often a
  powerful experience.
- Community -- while some of the global services with household
  names offer features to small businesses and community groups
  (like football clubs or debating societies), often the lock-in
  and exclusivity involved can make it hard to include everyone
  who needs to be involved. Hosting your own services allows you
  to set the rules and codes of conduct.
- Experimentation -- simply by playing with interesting software
  projects, people can learn about the tools and systems they use,
  and grow their knowledge of the technologies involved.
- Collaboration -- the software that implements self-hosted
  services often comes under the terms of a Free or Open Source
  Software copyright licence, which allows people to copy and
  improve the software, and these improvements often find their
  way back to the original project for others to benefit from.
- Protection -- governments in countries where civil rights are
  not regarded as highly as they are in Ireland very often delight
  in the greater ease of surveilling their populations when the
  record of all that activity is centralised in a single service.

Very often, as with me, the reason to self-host is a combination
of more than one of these.
** CONSDONE How accessible is self-hosting?
In a previous, similar submission[fn:dccae:Available [[http://www.gibiris.org/eo-blog/posts/2019/04/15_harmful-content-consultation.html][here]] and
[[https://www.dccae.gov.ie/en-ie/communications/consultations/Documents/86/submissions/Eibhear_O_HAnluain.pdf][here]].], I provided an outline of the challenges facing someone who
wants to set up their own services. There are few, and they are
small. In summary, the reasons for this are:
- The internet is a mechanism for computers to find each other and
  then to share information with each other. The mechanism is
  defined in a set of publicly-available documents describing the
  relevant protocols.
- Due to the maturity and age of these protocols, the software
  needed to use them is now abundant and trivially easy to get,
  install and run on a computer. Such software is also very easy to
  develop for moderately-skilled software engineers.
- Neither the protocols that define the internet, nor the software
  that implements it, regards any computer as superior or inferior
  to any other. For this reason, there is no cost or capacity
  barrier for someone to cross in order to run an internet service:
  if you have the software and the internet connection, then you
  can expose such a service (see the sketch below).
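
To illustrate just how low the barrier is, the following is a
minimal sketch, assuming nothing more than a stock Python
installation, of a complete working web service; pointing a domestic
broadband connection at it is all that is needed to put it on the
internet:

#+BEGIN_SRC python
# A complete, working internet service using only Python's standard
# library. Run it, and any computer that can reach this machine can
# fetch the page.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Greeter(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer every request with a small HTML page.
        body = b"<html><body><p>Hello from my self-hosted service!</p></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Listen on all interfaces, port 8080.
    HTTPServer(("0.0.0.0", 8080), Greeter).serve_forever()
#+END_SRC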

Clear examples from the past of how the accessibility of internet
technologies has benefited the world include the following:
- The /Linux/ operating system kernel began life in 1991 as a
  college project -- Linus Torvalds wanted to write a computer
  operating system that was accessible to all. Linux-based
  operating systems now form the basis of a significant proportion
  of internet-connected computing devices
  globally[fn:LinuxProportions:https://en.wikipedia.org/wiki/Usage_share_of_operating_systems]
  (including 73% of smartphones and tablet computers, somewhere
  between 36% and 66% of internet-facing server computers, and
  100% of supercomputers).
- The /Apache/ web server started development when a group of 8
  software developers wanted to add functionality to one of the
  original web server software packages, /NCSA httpd/. The Apache
  web server now powers 43.6% of all web
  sites[fn:apacheProportions:[[https://w3techs.com/technologies/overview/web_server/all][https://w3techs.com/technologies/overview/web_server/all]]. Incidentally,
  the no. 2 on that web page, with nearly 42% share of websites, is
  /nginx/. It also started out as a project by an individual who
  wanted to solve a particular problem.].
- The /Firefox/ web browser was initiated by three software
  developers who wanted to make a light-weight browser based on the
  Mozilla code-base. At the height of its popularity, /Firefox/ was
  used in 34% of web-page requests, despite not coming installed by
  default on any computer or mobile device. However, its real
  impact is that it was instrumental in breaking the monopoly that
  Microsoft's Internet Explorer had held since the late '90s,
  resulting in a far richer and more secure web.

When we look at the main services that society is currently
struggling with, we need to consider the following historical
facts:
- Facebook started out as a crude service, developed in Mark
  Zuckerberg's room in Harvard University, to allow users (men, of
  course) to rate the women in the university in terms of
  "hotness".
- Google started out as a search engine called
  "Backrub". Development initially took place in a garage.
- eBay was originally an auction service tagged onto the personal
  website of its founder, Pierre Omidyar.
- LinkedIn was initially developed in Reid Hoffman's apartment
  in 2003.
- Shutterstock, a leading provider of stock images, was founded by
  a photographer, Jon Oringer, who developed the service as a
  means to make available 30,000 of his own photographs.

The ease with which internet technology can be accessed has given
rise to the explosion of services that connect people, and people
with businesses.

It is critical to note that many of these technologies and services
started out with an individual or small group developing an idea
and showing it could work *prior* to receiving the large capital
investments that resulted in their current dominance.

All of the above technologies and services can be considered truly
disruptive. In their respective domains, their arrivals resulted in
dramatic improvements in internet technologies and services.

However, there are many alternatives to the systems that we are
familiar with, all developed by individuals or small, enthusiastic
teams:
- /Twitter/ isn't the only micro-blogging service: there are also
  /GNU Social/, /Pleroma/ and /Mastodon/.
- An alternative to /Facebook/ is /diaspora*/.
- /Nextcloud/ and /Owncloud/ are examples of alternatives to
  /Dropbox/.

In the case of all these alternatives, users can sign up for
accounts on "instances" operated by third-party providers, or users
can set up their own instances and operate the services themselves.

Many of these services can federate with others. Federation in this
context means that there can be multiple instances of a service,
communicating with each other over a defined protocol, sharing
updates and posts. For users, federation means that they can
interact with other users who aren't necessarily on the same node
or instance. For administrators of instances, federation means that
they can configure their instances according to their own
preferences, rather than having to abide by the rules or technical
implementation of someone else.
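
As a toy model of the idea (hypothetical names throughout; real
systems such as Mastodon exchange posts over protocols like
ActivityPub), federation amounts to each instance pushing a user's
post to the peer instances that follow it:

#+BEGIN_SRC python
# Toy model of federation: independently-operated instances exchange
# posts over a shared, agreed protocol. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Instance:
    domain: str
    followers: list = field(default_factory=list)  # peer instances
    timeline: list = field(default_factory=list)   # posts received

    def follow(self, other: "Instance") -> None:
        # Subscribe this instance to posts published on `other`.
        other.followers.append(self)

    def publish(self, author: str, text: str) -> None:
        post = f"{author}@{self.domain}: {text}"
        self.timeline.append(post)
        for peer in self.followers:  # push to every federated peer
            peer.timeline.append(post)

# Two instances, run by different people, under different rules:
social = Instance("social.gibiris.org")
masto = Instance("example-instance.net")   # hypothetical peer
masto.follow(social)
social.publish("eibhear", "Hello, fediverse!")
print(masto.timeline)  # the post arrived on the other instance
#+END_SRC

Each administrator keeps full control of their own instance; the
only thing they share is the protocol.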

** CONSDONE Regulation of self-hosted services

While it is attractive to create regulations to manage the large,
profit-making organisations, it is imperative that such
regulations don't harm the desire of those who want to create
their own services.

A regulation that applies liability to a service-provider for
someone else's behaviour is a regulation that can be adhered to only
by organisations with large amounts of money to hand. For example,
if a regulation were to apply liability to me for a posting made by
someone else that appears on one of the services that I run (and
originally posted *somewhere* else -- these are federated services,
after all), I would have to shut it down; I am not able to put in
place the necessary infrastructure that would mitigate my
liability[fn:copyrightDirective:This assumes that my services
aren't forced to shut down by the new EU Copyright Directive
anyway]. Given that my services are intended to provide a positive
benefit to me, my family members and my friends, and that I have no
desire to facilitate harmful behaviour on these services, a law
forcing me to shut these services down benefits no one.

Similarly, a regulation that demands responses from services on the
assumption that the service will be staffed at all times requires
individuals who are self-hosting their services to be available at
all times (i.e. to be able to respond regardless of whether they
are asleep, or overseas on a family holiday, or too ill to respond,
etc.)

This submission comes from this perspective: that small operators
should not be unduly harmed by regulations. The likelihood of this
harm coming to pass is greater when such small operators are not
even considered during the development of the regulations. If
regulations have the effect[fn:unintended:unintended, one hopes] of
harming such small operators, the result will be not just the loss
of these services, but also, through the imposition of artificial
barriers to entry, the loss of the opportunity to make the Web
richer. Such regulations will inhibit the development of ideas that
pop into the heads of individuals, who would realise them with
nothing more than a computer connected to the internet.
* CONSTODO Other considerations
While the main focus of this submission is to highlight the
potential risk to self-hosters from regulation that neglects to
consider the practice, I would like to take the opportunity to
briefly raise some additional concerns.

** CONSDONE Abuse

All systems that seek to protect others from harmful or otherwise
objectionable material (e.g. copyright infringement, terrorism
propaganda, etc.) have, to date, been very open to abuse. For
example, in a recent court filing, Google claimed that 99.97% of
infringement notices it received from a single party in January 2017
were
bogus[fn:googleTakedown:https://www.techdirt.com/articles/20170223/06160336772/google-report-9995-percent-dmca-takedown-notices-are-bot-generated-bullshit-buckshot.shtml]:

#+BEGIN_QUOTE
A significant portion of the recent increases in DMCA submission
volumes for Google Search stem from notices that appear to be
duplicative, unnecessary, or mistaken. As we explained at the San
Francisco Roundtable, a substantial number of takedown requests
submitted to Google are for URLs that have never been in our search
index, and therefore could never have appeared in our search
results. For example, in January 2017, the most prolific submitter
submitted notices that Google honored for 16,457,433 URLs. But on
further inspection, 16,450,129 (99.97%) of those URLs were not in
our search index in the first place. Nor is this problem limited to
one submitter: in total, 99.95% of all URLs processed from our
Trusted Copyright Removal Program in January 2017 were not in our
index.
#+END_QUOTE

That a single entity would submit more than 16 million URLs for
delisting in a single month is staggering, and demonstrates a
compelling point: there is no downside for a bad-faith actor
seeking to take advantage of a system for suppressing
information[fn:downside:The law being used in this specific case is
the US Digital Millennium Copyright Act. It contains a provision
that claims of copyright ownership on the part of the claimant are
to be made under penalty of perjury. However, that provision is
very weak, and seems not to be a deterrent for a determined agent:
https://torrentfreak.com/warner-bros-our-false-dmca-takedowns-are-not-a-crime-131115].

The GDPR's /Right to be Forgotten/ is also subject to abuse. An
individual from Europe continues to have stories related to him
excluded from Google searches. However appropriate on the face of
it, the stories this individual is now getting suppressed relate to
his continued abuse of the /Right to be
Forgotten/[fn:RTBF:https://www.techdirt.com/articles/20190320/09481541833]. That
the "right" can be abused in this way is counter to the public
interest, as it can now be used like a "Super Injunction".

While the GDPR contains an exemption for search engines
"... exercising the right of freedom of expression and
information", when they are presented with /Right to be Forgotten/
demands they have to choose between serious sanctions if they don't
filter the results when they should have, and no sanctions if they
suppress the results when they didn't need to.

In systems that facilitate censorship[fn:censorship:While seeking
to achieve a valuable and socially important goal, this
legislation, and all others of its nature, facilitates censorship:
as a society, we should not be so squeamish about admitting this.],
it is important to do more than merely assert that service
providers should protect fundamental rights to expression and
information. In a regime where sending an e-mail costs nearly
nothing, where a service risks serious penalties (up to and
including having to shut down) and where a claimant suffers nothing
for abusive claims, the regime is guaranteed to be abused.
** CONSDONE Content Moderation

Much of the focus of legislative efforts to deal with harmful or
objectionable material on services that permit uploads from users is
on what the service providers do about it. Many argue that they are
not doing anything, or at least not enough.

However, this is an unfortunate mischaracterisation of the
situation. For example, Facebook employs -- either directly or
through out-sourcing contracts -- many tens of thousands of
"moderators", whose job is to decide whether to remove offensive
material or not, to suppress someone's freedom of expression or
not, based on a set of if-then-else questions.
- It's illegal in Germany to say anything that can be construed as
  glorifying the Holocaust. In the UK it isn't. Facebook can
  suppress such information from users it believes are in Germany,
  but to do so for those in the UK would be an illegal denial of
  free expression, regardless of how objectionable the material
  is. What is Facebook to do with users in Germany who route their
  internet connections through the UK? Facebook has no knowledge of
  this unusual routing, and to learn about it could be a violation
  of the user's right to privacy. Should Facebook be criminally
  liable for a German user seeing statements that are illegal in
  Germany?
- Consider the genocide of Armenian people in Turkey in 1915. It is
  illegal in Turkey to claim it happened. However, for a period
  between 2012 and 2017 it was illegal in France to claim it didn't
  happen. In most other countries, neither claim is illegal. What
  can a service like Facebook do when faced with 3 options, 2 of
  which are mutually exclusive? Literally, should they be
  criminally liable both if they do /and/ if they
  don't[fn:dink:Prior to his assassination in Istanbul in 2007,
  Hrant Dink, an ethnic Armenian Turkish journalist who campaigned
  against Turkey's denial of the Armenian Genocide, had planned to
  travel to France to deny it in order to highlight the
  contradictions in laws that criminalise statements of fact.]?

Moderators have no more than a minute to determine whether a
statement complies with the law or not, and this includes figuring
out whether the posting meets the definitions of abusive or
harmful, and whether it is indeed intended to meet that
definition. For example, consider an abusive tweet. Should the
harmful, abusive tweet be removed? Who decides? What if the target
of the abusive tweet wants that tweet to be retained, for, say,
evidence? What if the tweet was an attempt at abuse, but the target
chose not to be affected by it? Should it stay up? Who decides?
What if the target doesn't care, but others who see the tweet and
who aren't the target of the abuse may be offended by it? Should it
be taken down as abusive even though the target of the abuse
doesn't care, or objects to its removal? Who would be criminally
liable in these situations? What if the target of the abuse
substantially quotes the abusive tweets? Is the target now to be
considered an offender under a criminal liability regime when that
person may be doing nothing other than /highlighting/ abuse?

"Content moderation" is very hard, and is impossible at the scales
at which services like Twitter or Facebook operate. When context is
critical in deciding whether someone is engaged in harmful or
abusive behaviour, it would be fundamentally unfair to make a
service criminally liable just because it made the wrong decision
when it didn't have time to determine the full context, or because
it misinterpreted or misunderstood the context.
** CONSTODO User Behaviour
Many believe that the way to deal with abusive or harmful material
online is to punish the services that host the material. This is
reasonable if the material was placed onto the service by those who
own or manage the service. It is also reasonable if the material is
put there by users with the clear knowledge of the managers or
owners of the service, or by users following encouragement from the
managers or owners of the service.

However, these specific situations are rare in the world of normal
online services[fn:criminal:Services that are dedicated to hosting
criminal material such as "revenge porn" or child sexual
exploitation material know they are engaged in criminal activities
anyway, and take steps to avoid detection that are outside the
scope of this submission -- those guys will get no support from
me!].

Engaging in harmful and abusive communications is a matter of
behaviour and not a function of the technical medium through which
the communication is made. The idea that internet services are
responsible for abusive communications is as difficult to
understand as the idea that a table-saw manufacturer is responsible
for a carpenter not wearing safety glasses while using the saw to
cut timber.

Recent history has shown that the most effective ways to change
behaviour are not necessarily punitive. It's hard to see how
punishing an intermediary would stop people being nasty to each
other.

Any new regulations around controlling abusive or harmful
behaviours online must start with changing users' behaviours. If
there is no attempt to change behaviour, then abusive people will
simply work around the controls and continue to abuse.
** CONSTODO Investigation support

In response to the live-streaming of the horrific shooting dead of
more than 50 people in New Zealand earlier this year, that country
has proscribed the video recorded by that white supremacist
terrorist as "objectionable", making it a criminal offence to share
it[fn:banNotice:https://www.classificationoffice.govt.nz/news/latest-news/christchurch-attacks-press-releases/#christchurch-attack-video-footage-and-document-has-been-banned-in-nz-what-this-means-for-you].

While one can understand the thinking that sharing the material
could only be done by people who support the atrocity, this is not
necessarily true. Other reasons to share the video or portions of
it might include:
- appealing for help in finding someone caught up in the massacre
- legitimate news reporting of such an event
- helping to investigate the shooting and its circumstances
- training for law enforcement or terrorism- or disaster-response
  personnel.

However, if the law says that no form of sharing is permitted, then
none of these entirely legitimate purposes would be possible, and
the world would be that bit less safe as a result.
** CONSTODO Encrypted services

Some believe that if end-to-end encryption services that prevent
security services from accessing material were banned or
controlled, there would be less abusive behaviour online. This is
not true, nor is it good public policy.

Encryption is just mathematics, and it knows neither whether its
use is for ill or good. However, when you consider the extent to
which encryption is being used -- every website that uses =https=
as part of its address encrypts the traffic between itself and its
users, and that is nearly every website around the world -- the
good uses vastly outnumber the bad uses. If people are forced to
use an encryption system that has been modified to make it easy for
security services to gain access to the messages, it means that all
the good, innocent uses of encryption are at risk. Recent news that
Russian spies managed to infiltrate the
FBI[fn:Oath:https://news.yahoo.com/exclusive-russia-carried-out-a-stunning-breach-of-fbi-communications-system-escalating-the-spy-game-on-us-soil-090024212.html
(Please note that to access this story the user has to agree to
many hundreds of forms of tracking, or spend up to an hour
examining those forms and disabling each one individually. It is
recommended that this story be accessed using "Incognito" or
"Private Browsing" mode in order to be protected against tracking).]
highlights how unreliable assurances from security services are
that they can keep secrets, such as the keys to all encryption, safe
from harm.
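
To underline the point that encryption is just mathematics available
to everyone, the following is a minimal sketch (an illustration
only) using the widely-used Python =cryptography= library. The same
few lines serve a journalist protecting a source and, unavoidably,
anyone else:

#+BEGIN_SRC python
# Encryption is a few lines of widely-available, free software.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # secret key, kept by the user
cipher = Fernet(key)

token = cipher.encrypt(b"meet at 8pm")  # unreadable without the key
print(token)
print(cipher.decrypt(token))            # b'meet at 8pm'
#+END_SRC

There is no way to weaken those lines only for criminals; any
mandated weakening applies to every innocent use as well.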

All it takes is one determined intruder, and all the good uses of
encryption are put at risk, in order to save money and effort on
investigating illegal activities.

I have written a number of articles on this matter providing more
details:
- [[http://www.gibiris.org/eo-blog/posts/2015/03/12_the-value-of-encryption.html][The value of encryption]]
- [[http://www.gibiris.org/eo-blog/posts/2015/03/18_how-can-encryption-be-regulated.html][How can encryption be regulated]]
- [[http://www.gibiris.org/eo-blog/posts/2018/08/21_you-cant-stop-people-from-using-encryption.html][You just can't stop people from using encryption, so stop trying]]
- [[http://www.gibiris.org/eo-blog/posts/2018/08/22_stop-people-from-using-encryption-postscript.html][Post-script on why you should stop trying to stop people from using encryption]]
- [[http://www.gibiris.org/eo-blog/posts/2018/09/04_some-questions-5-eyes-countries-what-can-they-do.html][Some questions for the "5 Eyes" countries on what they think they can do]]
* CONSTODO Answers to consultation questions
The following are some answers to the questions posed in the call
for submissions.
** CONSTODO Definition of communication in legislation
1. There are currently significant gaps in legislation with regard
   to harassment and newer, more modern forms of communication. Is
   there a need to expand the definition of ‘communications’ to
   include online and digital communications tools such as
   WhatsApp, Facebook, Snapchat, etc. when addressing crimes of
   bullying or harassment?
   - Éibhear comment :: (/Address in introduction/) It is necessary
     not to assume that the services that operate today will
     be the primary services in 5 or 10 years' time.
2. What lessons can be learned from models used in other
   jurisdictions such as the UK, New Zealand, Australia and other
   European countries where legislation is now in place to address
   these issues? How do we establish an appropriate model without
   compromising free speech?
   - Éibhear comment :: (/Address in answer to specific questions/)
     UK: duty of care is inappropriate. New Zealand: allowing a
     committee to decide what is objectionable, thus restricting
     not only those who want to share objectionable material,
     but also those who want to report on it.
3. How do we ensure that any legislation that is enacted is
   flexible enough to keep up with changing and advancing
   technologies, new apps and other online forums, including the
   more familiar social media sites?
   - Éibhear's comments :: (/Core concern/) This is the meat
     of the submission.
** CONSTODO Harassment, stalking & other forms of online abuse
4. [@4] Online harassment can take the form of non-consensual
   taking and distribution of intimate images or videos, otherwise
   known as ‘revenge porn’, ‘upskirting’, ‘downblousing’ and other
   forms of sharing of imagery online without consent. What
   approaches are taken to addressing these issues in other
   jurisdictions?
   - Éibhear's comment :: No answer for this
5. New offences are proposed to cover these issues in Deputy
   Brendan Howlin’s Private Members Bill on this subject. Is the
   creation of new offences necessary, or is existing legislation
   sufficient? Should other forms of image-sharing issues -- such
   as exposure -- also be addressed?
   - Éibhear's comment :: No answer for this
6. What kind of oversight and regulation of online service
   providers is possible/used in other jurisdictions? Currently,
   online providers are self regulated. Is a proactive,
   self-regulating approach from online companies to activities
   such as revenge porn and other forms of harassment preferable to
   the creation of more laws?
   - Éibhear's comment :: It is important to know the difference
     between "self-regulation" and pro-active moderation. These
     services moderate according to their own rules; there is no
     industry authority like the press council or the advertising
     standards authority, which are self-regulatory regimes.
7. Is any data provided by online service providers in relation to
   the reporting or prevalence of activities such as
   upskirting/revenge porn/cyberbullying and other online behaviour
   that can be used to develop and draft future legislation?
   - Éibhear's comment :: No data. However, services should be
     encouraged to issue reports on their moderation efforts.
8. To what extent are An Garda Síochána equipped and resourced to
   deal with the issues arising from harmful online communications
   such as these?
   - Éibhear's comment :: No answer for this
9. Should ‘cyberstalking’ be treated as a separate offence to
   online harassment? What constitutes stalking-type behaviour
   online? Is there a need to legislate specifically for this
   activity?
   - Éibhear's comment :: No answer for this
10. Based on the findings of other jurisdictions such as the UK,
    An Garda Síochána will require consistent training in order to
    maintain an appropriate level of knowledge with regard to
    indictable behaviours. Are resources available for this?
    - Éibhear's comment :: No answer for this
11. Fake accounts/troll accounts used to harass or target others
    with abuse -- what measures can be taken in relation to these
    without affecting freedom of expression?
    - Éibhear's comment :: Care needs to be taken to
      manage/prevent false identification of accounts as 'fake'
      or 'troll'.
12. Do other jurisdictions have statutory measures to protect
    victim identities in cases of online harassment being released
    online post-hearings, etc.?
    - Éibhear's comment :: No answer for this
** CONSTODO Harmful online behaviour and young people
13. [@13] How do we most appropriately regulate social media
    platforms to prevent cyberbullying and inappropriate sharing of
    personal images?
    - Éibhear's comment :: Take details from earlier submission.
14. For young people who participate in such online behaviour as
    consensual image sharing, how can it be ensured that they are
    not inadvertently criminalised when legislation is enacted?
    What safeguards can be put in place?
    - Éibhear's comment :: No answer for this
15. Deputy Brendan Howlin’s Private Members Bill provides that
    those under 17 should not be fined/imprisoned but put into
    relevant education or supports. Would these supports be part of
    the same educational supports offered to all young
    people/schools or would they be a separate entity? Are current
    supports being utilised? Are there sufficient resources to
    provide for such a provision when enacted?
    - Éibhear's comment :: No answer for this
* CONSDONTDO Answers to consultation questions :noexport:
** CONSTODO Strand 1 -- National Legislative Proposal
*** CONSTODO Question 1 -- Systems
- The legislation should state in an unequivocal manner that it is
  not the role of web services to adjudicate on whether specific
  user-uploaded pieces (text, videos, sound recordings, etc.) can
  be considered harmful under the legislation. The law should make
  it clear that where there is a controversy on this matter, the
  courts will make such rulings.
- As regards a system, this submission would support a
  notice-counternotice-and-appeal approach (see the sketch after
  this list). Such an approach affords the service operator and the
  accused party an opportunity to address the complaint before the
  complained-of material is taken offline. The following should be
  incorporated:
  1) A notice to a service operator that a user-uploaded piece is
     harmful should contain the following information:
     - That the notice is being raised under this legislation
       (citing section, if relevant).
     - That the person raising the notice is the harmed party, or
       that the person raising the notice is doing so on behalf,
       and at the request, of the harmed party. Where the harmed
       party doesn't want to be identified, the notice could be
       raised on their behalf by someone else. However, totally
       anonymous notifications under this legislation should not
       be permitted, as it would not be possible to determine the
       good-faith nature of the notice.
     - The specific (narrowly tailored) definition of "harmful
       content" in the legislation that is being reported.
  2) A notice to the user who uploaded the complained-of material
     regarding the complaint. This will allow the user to remove
     the material, or to challenge the complaint. An opportunity
     to challenge a complaint is necessary to forestall invalid
     complaints that seek to have information removed that would
     not be considered harmful under the legislation.
  3) Adequate time periods for both the complainant and the
     posting user to respond.
  4) Where responses aren't forthcoming...
     - ... if the posting user doesn't respond to the initial
       complaint, the posting is to be taken down;
     - ... if the complaining user doesn't respond to the posting
       user's response, the posting is left up.
  5) Within a reasonable and defined period of time, the service
     provider will assess the initial complaint, the
     counter-notice, and the complainant's response to the
     counter-notice, and will decide whether to take the material
     down or to leave it up, /citing clear reasons for the
     decision./
  6) Where either party is not happy with the decision, they can
     appeal to the regulator, and if the regulator contradicts the
     service operator's decision, the service operator must abide
     by the regulator's ruling. In its consideration of the
     ruling, the regulator must be required to consider the rights
     of both parties.
- Responsibilities and obligations of the service provider *must*
  relate to the size of the service. For example, it's not
  reasonable to ask the service provider to respond within an
  amount of time for those services that would not have someone
  available within that time. Self-hosters or small,
  single-location operations would not be able to respond within
  an hour if the complaint is made at 4am!
- This system should not apply to complaints that a posting
  violates the service's terms and conditions. If the complaint
  isn't explicitly made under this legislation, it should not fall
  within the regulator's remit. *Under no circumstances should
  merely violating a service's terms and conditions (or "community
  standards") be considered an offence under this legislation.*
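
The following is a minimal sketch of the notice/counter-notice/appeal
flow proposed above, with hypothetical names and decision points,
intended only to show that the flow is well-defined at each step
(time limits would come from the legislation itself):

#+BEGIN_SRC python
# Hypothetical sketch of the notice/counter-notice/appeal flow
# proposed above. All names are illustrative.
from enum import Enum, auto

class Outcome(Enum):
    TAKEN_DOWN = auto()
    LEFT_UP = auto()
    REFERRED_TO_REGULATOR = auto()

def process_notice(notice_cites_act: bool, notice_identified: bool,
                   poster_responded: bool, complainant_responded: bool,
                   operator_upholds_complaint: bool,
                   either_party_appeals: bool) -> Outcome:
    # Anonymous notices, or notices not made under the Act, fall
    # outside the regime entirely.
    if not (notice_cites_act and notice_identified):
        return Outcome.LEFT_UP
    # No response from the poster: material comes down.
    if not poster_responded:
        return Outcome.TAKEN_DOWN
    # Poster responded but complainant didn't pursue it: stays up.
    if not complainant_responded:
        return Outcome.LEFT_UP
    # The operator decides, citing reasons; either party may appeal.
    if either_party_appeals:
        return Outcome.REFERRED_TO_REGULATOR
    return Outcome.TAKEN_DOWN if operator_upholds_complaint else Outcome.LEFT_UP

# Example: a valid notice that the poster never answers.
print(process_notice(True, True, False, False, False, False))
# Outcome.TAKEN_DOWN
#+END_SRC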
|
||
*** CONSTODO Question 2 -- Statutory tests
|
||
The service operator should be protected from liability under the
|
||
rules if the service can show the following:
|
||
- That the initial complaint was responded to appropriately and
  within a reasonable amount of time.
- That an appeal was responded to within a reasonable amount of
  time.
- That the poster and complainant were each offered an opportunity
  to respond.
- That the responses, and any appeals, were given due
  consideration.
- That the final decision (whether to keep the post up or pull it
  down) was well-reasoned, and considered the context in which the
  post was made.
- That, where appeals have been made to the regulator, the service
  responds to any order from the regulator in a reasonable manner
  and within a reasonable amount of time.
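
Taken together, these tests amount to a checklist in which every
item must pass. The sketch below models them that way, purely as
an illustration; the field names are this submission's
assumptions, not proposed statutory language:

#+begin_src python
from dataclasses import dataclass

@dataclass
class ComplianceRecord:
    # One field per statutory test listed above.
    initial_complaint_handled_in_time: bool  # appropriate, timely response
    appeal_handled_in_time: bool             # timely response to an appeal
    both_parties_could_respond: bool         # poster and complainant heard
    responses_duly_considered: bool          # responses and appeals weighed
    decision_reasoned_in_context: bool       # well-reasoned final decision
    regulator_orders_followed: bool          # reasonable, timely compliance

def protected_from_liability(r: ComplianceRecord) -> bool:
    """The service is protected only if every test is satisfied."""
    return all((
        r.initial_complaint_handled_in_time,
        r.appeal_handled_in_time,
        r.both_parties_could_respond,
        r.responses_duly_considered,
        r.decision_reasoned_in_context,
        r.regulator_orders_followed,
    ))
#+end_src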
*** CONSTODO Question 3 -- Which platforms to be considered in scope
This submission seeks to ensure that no assumption is made that
all affected platforms will be large, for-profit organisations
with scores, hundreds or thousands of staff acting as moderators
of user-uploads.

The legislation should also not assume that platforms that want to
deal with user uploads *should* be of a particular nature, or
size.

To make either assumption would be to chill lawful interactions
between internet-connected parties, and would further entrench the
larger players on the internet.
*** CONSTODO Question 4 -- Definitions
- Please see my introductory comments on this matter.
- Definitions of "harmful content" must aim to be as narrow as
  possible, in order to avoid the potential of the legislation
  being used to target political speech.
- Serious cyberbullying should be considered harmful content under
  the legislation not just when it targets a child. It should be
  considered cyberbullying, and harmful, even when the target is
  an adult, if the complaint states that s/he is being harmed, or
  fears harm should the complained-of behaviour continue.
  + In the event that the target of the cyberbullying is a public
    figure, there should be an additional burden on the
    complainant to state that the behaviour represents real intent
    to cause harm, and is more than people with opposing political
    or social views "shooting their mouths off".
** CONSTODO Strand 2 -- Video Sharing Platform Services
*** CONSTODO Question 5 -- What are video-sharing services
This submission is not providing an answer to this question.
*** CONSTODO Question 6 -- Relationship between Regulator and VSPS
This submission is not providing an answer to this question.
*** CONSTODO Question 7 -- Review by Regulator
The regulator should require the following reports to be published
by online services regarding complaints made under this
legislation (a minimal sketch of such a report follows the list):
- Number of complaints, broken down by nature of complaint.
- Number of complaints that were appealed to the service, broken
  down by nature of complaint and basis of appeal.
- Number of appeals upheld, broken down by reason for appeal.
- Number of appeals rejected, broken down by reason for rejection.
- Number of complaints/appeals that were appealed further to the
  regulator.
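
By way of illustration only, these breakdowns amount to a small
set of counters that even a modest service could publish. The
category labels and record fields below are this submission's
assumptions, not proposed reporting categories:

#+begin_src python
from collections import Counter

# Hypothetical complaint records; "nature", "appeal_basis" and the
# boolean flags stand in for whatever categories the regulator defines.
complaints = [
    {"nature": "serious cyberbullying", "appealed": True,
     "appeal_basis": "lawful speech", "appeal_upheld": True,
     "escalated_to_regulator": True},
    {"nature": "harassment", "appealed": False,
     "appeal_basis": None, "appeal_upheld": None,
     "escalated_to_regulator": False},
]

report = {
    "complaints by nature":
        Counter(c["nature"] for c in complaints),
    "appeals by nature and basis":
        Counter((c["nature"], c["appeal_basis"])
                for c in complaints if c["appealed"]),
    "appeals upheld":
        sum(1 for c in complaints if c["appeal_upheld"]),
    "appeals rejected":
        sum(1 for c in complaints
            if c["appealed"] and not c["appeal_upheld"]),
    "escalated to regulator":
        sum(1 for c in complaints if c["escalated_to_regulator"]),
}
print(report)
#+end_src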
** CONSTODO Strands 3 & 4 -- Audiovisual Media Services
*** CONSTODO Question 8 -- "Content" rules for television broadcasting and on-demand services
This submission is not providing an answer to this question.
*** CONSTODO Question 9 -- Funding
RTÉ and its subsidiary services should continue to be funded by
the government, either through the licence fee, general taxation
or a mixture of both. RTÉ's editorial independence should be
reiterated in this law (and strengthened, if required,
specifically to assure independence from the editorial demands of
advertisers). It should be anticipated that RTÉ will eventually
broadcast only over the internet, and that it will be both a
live-streaming service (e.g. providing programming in a manner
similar to its current broadcast schedule) *and* an on-demand
service.

Funding of services other than RTÉ should only be considered for
services operated by non-profit organisations such as trusts or
charities, and such funding should also come with an assurance of
editorial independence for the recipients.
** CONSTODO Strands 1 & 2 -- European & International Context
*** CONSTODO Question 10 -- Freedoms
- Core to the consideration of the legislation is that everyone
  posting to services is presumed to be innocent of an offence,
  and their postings should also be presumed *not* to offend the
  law.
- Accusations of harm *must* be tested to determine if they are
  being made to suppress legal speech. This is particularly true
  where the person making the allegation is a public figure, or is
  representing a public figure.
- Where a service applies -- or is required to apply -- sanctions
  on users who repeatedly post harmful information, similar
  sanctions should also be applied to users who repeatedly make
  *false* accusations under the law.
*** CONSTODO Question 11 -- Limited liability
Any regulatory system that makes service providers liable for what
their *users* say on those services will result in one or a
combination of the following effects:
1) Services will stop permitting users to make postings.
2) Where the value of a service is wholly, or in part, that it
   allows its users to post to it, the service may have to shut
   down.
3) Services will be sued or prosecuted for the actions of their
   users *regardless* of the effort and good faith they put into
   "moderating" what is posted on their services -- a concept that
   is borderline ludicrous in the off-line world. This would be
   analogous to a car manufacturer being liable for the
   consequences of car occupants not wearing their seat-belts.

There must be clarity in the regulations that a service is
protected as long as it acts in a good-faith manner to deal with
postings made by users that are determined to have been
illegal. This reflects Ireland's obligations under various trade
agreements to grant safe-harbour protections to internet services.

The regulation must also protect platforms and their users against
bad-faith accusations of harm, particularly from public
figures. If it is easier to use an accusation of "harmful content"
than to claim libel, public figures will use that facility to
suppress information they would like not to be known.
** CONSTODO Strands 1-4 -- Regulatory Structures
*** CONSTODO Question 12 -- Regulatory structure
This submission is not providing an answer to this question.
*** CONSTODO Question 13 -- Funding of regulatory structure
This submission is not providing an answer to this question.
** CONSTODO Strands 1 & 2 -- Sanctions/Powers
*** CONSTODO Question 14 -- Functions and powers
This submission is not providing an answer to this question.
*** CONSTODO Question 15 -- Sanctions
The following should be taken into account when considering
sanctions on platforms:
- The nature of the operation.
  + Large, global, profit-based private organisations providing
    services to the general population (examples include YouTube,
    Facebook and Twitter).
  + Smaller, local, profit-based private organisations providing
    services to the general population, focused on the region
    (examples might include boards.ie, everymum.ie, etc.).
  + Small, non-profit forums set up by locally-based and -focused
    organisations such as soccer clubs, or school parents'
    associations[fn:useFacebook:There is often the temptation to
    advise these organisations to use larger platforms like
    Facebook or Google. Some organisations may not want to avail
    of those services, and the reasons for this are not
    relevant. What's important is that deciding not to use these
    platforms is valid, and these decisions should be protected
    and encouraged, not inhibited.]
  + Individuals, hosting their own platforms.
- The good-faith efforts of the operation to respond to
  accusations of harm.
- The capacity of the service to respond -- smaller operations
  can't afford 24-hour monitoring to respond to such accusations,
  and the law should not require it. Such services should also be
  protected from bad-faith actors seeking to interfere with their
  operations by overwhelming them with false accusations of harm
  that need to be dealt with.
- Who the accuser is -- public figures should be prevented from
  using accusations of "harmful content" to remove information
  that is merely critical of them or their behaviour.
*** CONSTODO Question 16 -- Thresholds
This submission is not providing an answer to this question.