Completion. Consolidation and spell-check left to do

parent b4541778fe
commit 8cbfd939f5
1 changed file with 280 additions and 121 deletions
@@ -344,23 +344,7 @@
However, prior to addressing these topics, I would like to raise
some ambiguities that this wider discussion will encounter.
- The first is the meaning of the term /self-regulation/. If a
  measure of self-regulation to address these concerns is
  acceptable, then it would be necessary, for public-perception
  reasons, to be clear on what that means. /Self-regulation/ could
  mean a regime in which each service operator manages matters of
  harassment and harmful communications according to its own rules
  and processes; this is currently how the large service providers
  we're most familiar with operate. However, /self-regulation/ may
  also refer to regulation by a non-governmental, industry-funded
  body, following the model of the press council or the advertising
  standards authority, where rules and processes are agreed among
  the operators as a set of standards, and where decisions on
  compliance with these are made by that body.

  In order to avoid this ambiguity, I will use the term
  "self-moderation" to refer to the former, and the term
  "industry-regulation" for the latter.
- The first is the meaning of the term /self-regulation/.

* CONSDONE Self-hosting
** CONSDONE Self-hosting

@@ -595,7 +579,7 @@
pop into the heads of individuals, who will realise them with
nothing more than a computer connected to the internet.

* CONSTODO Other considerations
* CONSDONE Other considerations
While the main focus of this submission is to highlight the
potential risk to self-hosters from regulation that neglects to
consider the practice, I would like to take the opportunity to
@@ -728,7 +712,7 @@
decision as it didn't have time to determine the full context, or
because it misinterpreted or misunderstood the context.

** CONSTODO User Behaviour
** CONSDONE User Behaviour
Many believe that the way to deal with abusive or harmful material
online is to punish the services that host the material. This is
reasonable if the material was placed onto the service by those who
@@ -763,7 +747,7 @@
there is no attempt to change behaviour, then abusive people will
simply work around the controls and continue to abuse.

** CONSTODO Investigation support
** CONSDONE Investigation support

In response to the live-streaming of the horrific shooting dead of
more than 50 people in New Zealand earlier this year, that country
@@ -777,14 +761,30 @@
it might include
- to appeal for help in finding someone caught up in the massacre
- legitimate news reporting of such an event
- to help investigate the shooting and its circumstances
- to help investigate the shooting and its
  circumstances[fn:ForArch:Forensic Architecture,
  https://forensic-architecture.org/, is a research group that
  investigates alleged abuses of human rights using image and video
  records of events. To criminalise the sharing of such imagery and
  videos with no regard to the purpose of the sharing plays
  directly into the hands of those who disregard victims' civil
  rights.]
- training for law enforcement or terrorism- or disaster-response
  personnel.

However, if the law says that no form of sharing is permitted, then
none of the entirely legitimate purposes would be possible, and the
world would be that bit less safe as a result.
** CONSTODO Encrypted services

There is a similar consideration for abusive material posted
online. If a communication is deemed to be an offence, care needs
to be taken to ensure that the "removal" of such a communication
(or a set of such communications) is not equivalent to the
destruction of evidence. This is particularly true in the context
that it is now very easy for anyone to forge screen-shots of online
postings.

** CONSDONE Encrypted services

Some believe that if end-to-end encryption services that prevent
security services from accessing material were banned or
@@ -828,105 +828,264 @@
* CONSTODO Answers to consultation questions
The following are some answers to the questions posed in the call for
submissions.
** CONSTODO Definition of communication in legislation
1. There are currently significant gaps in legislation with regard
   to harassment and newer, more modern forms of communication. Is
   there a need to expand the definition of ‘communications’ to
   include online and digital communications tools such as
   WhatsApp, Facebook, Snapchat, etc. when addressing crimes of
   bullying or harassment?
   - Éibhear comment :: (/Address in introduction/) It is necessary
     not to assume that the services that operate today will be the
     primary services in 5 or 10 years' time.
2. What lessons can be learned from models used in other
   jurisdictions such as the UK, New Zealand, Australia and other
   European countries where legislation is now in place to address
   these issues? How do we establish an appropriate model without
   compromising free speech?
   - Éibhear comment :: (/Address in answer to specific questions/)
     UK: duty of care is inappropriate. New Zealand: allowing a
     committee to decide what is objectionable, thus restricting
     not only those who want to share objectionable material,
     but also those who want to report on it.
3. How do we ensure that any legislation that is enacted is
   flexible enough to keep up with changing and advancing
   technologies, new apps and other online forums, including the
   more familiar social media sites?
   - Éibhear's comments :: (/Core concern/) This is the meat of the
     submission.
** CONSTODO Harassment, stalking & other forms of online abuse
4. [@4] Online harassment can take the form of non-consensual taking
   and distribution of intimate images or videos, otherwise known
   as ‘revenge porn’, ‘upskirting’, ‘downblousing’ and other forms
   of sharing of imagery online without consent. What approaches
   are taken to addressing these issues in other jurisdictions?
   - Éibhear's comment :: No answer for this
5. New offences are proposed to cover these issues in Deputy
   Brendan Howlin’s Private Members Bill on this subject. Is the
   creation of new offences necessary, or is existing legislation
   sufficient? Should other forms of image-sharing issues - such as
   exposure - also be addressed?
   - Éibhear's comment :: No answer for this
6. What kind of oversight and regulation of online service
   providers is possible/used in other jurisdictions? Currently,
   online providers are self regulated. Is a proactive,
   self-regulating approach from online companies to activities
   such as revenge porn and other forms of harassment preferable to
   the creation of more laws?
   - Éibhear's comment :: Important to know the difference
     between "self regulated" and pro-active moderation. These
     services moderate according to their own rules; there is no
     industry authority like the press council or the advertising
     standards authority, which are self-regulatory regimes.
7. Is any data provided by online service providers in relation to
   the reporting or prevalence of activities such as
   upskirting/revenge porn/cyberbullying and other online behaviour
   that can be used to develop and draft future legislation?
   - Éibhear's comment :: No data. However, services should be
     encouraged to issue reports on their moderation efforts.
8. To what extent are An Garda Síochána equipped and resourced to
   deal with the issues arising from harmful online communications
   such as these?
   - Éibhear's comment :: No answer for this
9. Should ‘cyberstalking’ be treated as a separate offence to
   online harassment? What constitutes stalking-type behaviour
   online? Is there a need to legislate specifically for this
   activity?
   - Éibhear's comment :: No answer for this
10. Based on the findings of other jurisdictions such as in the UK,
    An Garda Síochána will require consistent training in order to
    maintain an appropriate level of knowledge with regard to
    indictable behaviours. Are resources available for this?
    - Éibhear's comment :: No answer for this
11. Fake accounts/troll accounts used to harass or target others
    with abuse – what measures can be taken in relation to these
    without affecting freedom of expression?
    - Éibhear's comment :: Care needs to be taken to
      manage/prevent false identification of accounts as 'fake'
      or 'troll'.
12. Do other jurisdictions have statutory measures to protect
    victim identities in cases of online harassment being released
    online post-hearings, etc?
    - Éibhear's comment :: No answer for this
** CONSTODO Harmful online behaviour and young people
13. [@13] How do we most appropriately regulate social media
    platforms to prevent cyberbullying and inappropriate sharing of
    personal images?
    - Éibhear's comment :: Take details from earlier submission.
14. For young people who participate in such online behaviour as
    consensual image sharing, how can it be ensured that they are
    not inadvertently criminalised when legislation is enacted?
    What safeguards can be put in place?
    - Éibhear's comment :: No answer for this
15. Deputy Brendan Howlin’s Private Members Bill provides that
    those under 17 should not be fined/imprisoned but put into
    relevant education or supports. Would these supports be part of
    the same educational supports offered to all young
    people/schools or would they be a separate entity? Are current
    supports being utilised? Are there sufficient resources to
    provide for such a provision when enacted?
    - Éibhear's comment :: No answer for this
** CONSDONE Definition of communication in legislation
- Question 1 :: There are currently significant gaps in legislation
                with regard to harassment and newer, more modern
                forms of communication. Is there a need to expand
                the definition of ‘communications’ to include
                online and digital communications tools such as
                WhatsApp, Facebook, Snapchat, etc. when addressing
                crimes of bullying or harassment?
  + Answer :: Yes. However, it is important to consider the following:
    * Not all such tools are as large as, or have the human and
      financial resources of, the specific services referred
      to. Legislation that assumes that such communication can take
      place only through services that are as large and wealthy as
      these will stand a very good chance of restricting or
      limiting competition in these services' domains by imposing
      regulatory barriers to entry. I expand on this in the
      "Self-hosting" section of this submission.
    * Legislation should focus not on the tool, but on the
      behaviour. In the main, therefore, it's the behaviour of
      those performing the bullying or abuse that should be
      targeted and not the "tool" used as the communications
      medium. I expand on this in the "User behaviour" section of
      this submission.
- Question 2 :: What lessons can be learned from models used in
                other jurisdictions such as the UK, New Zealand,
                Australia and other European countries where
                legislation is now in place to address these
                issues? How do we establish an appropriate model
                without compromising free speech?
  + Answer :: The incentives need to be present to ensure that the
              balance is managed correctly. Any legislation, such
              as
              /FOSTA-SESTA/[fn:FOSTA-SESTA:https://en.wikipedia.org/wiki/Stop_Enabling_Sex_Traffickers_Act]
              in the US, that seeks merely to punish web sites,
              will do more harm than good[fn:SOSTAEffect:Lura
              Chamberlain, FOSTA: A Hostile Law with a Human Cost,
              87 Fordham L. Rev. 2171 (2019). Available at:
              https://ir.lawnet.fordham.edu/flr/vol87/iss5/13]. The
              incentive for US-based web site operators in this
              case is either *never* to host information for or by
              sex workers for fear of falling foul of the law, or
              to cease operations altogether. The result has been a
              human rights disaster, as sex workers, particularly
              women, are now at greater risk than before due to the
              failure of the law to consider the effect of a
              straight ban.

              The recently passed EU Copyright Directive mandates
              the filtering of user uploads based on prior notice
              that such uploads *may* be infringing copyright,
              subject to severe penalties, but requires mere
              respect for users' freedom of speech, with no
              penalties attaching to failing to do so. The
              incentive for the service operators here is to err on
              the side of suppressing material regardless of
              anyone's freedom of expression, as the consequences
              of not doing so could be catastrophic for the service
              operator.

              The proposal in the UK to apply a duty of care to
              service operators is also destined for failure, as a
              duty of care is a physical-world concept that has no
              suitable analogy in the context of internet services.

              Ironically, the likely best regulatory approach is
              one that online services currently operate under in
              the US and to a large degree in Europe: intermediary
              liability protection. All these services maintain
              terms and conditions ("Community Rules", "Code of
              Conduct", etc.), and confirmed violations of these
              result in sanctions on the users. However, where
              services aren't aware of violations, they are
              protected on the grounds that the objectionable
              behaviour is not that of the service operator, but of
              the user. In short, punish the user, not the service
              provider, unless -- of course -- the service provider
              is complicit.
- Question 3 :: How do we ensure that any legislation that is
                enacted is flexible enough to keep up with changing
                and advancing technologies, new apps and other
                online forums, including the more familiar social
                media sites?
  + Answer :: This is this submission's core concern. For
              legislation to focus on the technology and not on the
              behaviour, or on the service operator and not on the
              real offender, runs a real risk of damaging the human
              rights of totally innocent parties, as well as
              stifling innovation and consolidating the market
              positions of the major operators.
** CONSDONE Harassment, stalking & other forms of online abuse
- Question 4 :: Online harassment can take the form of
                non-consensual taking and distribution of intimate
                images or videos, otherwise known as ‘revenge
                porn’, ‘upskirting’, ‘downblousing’ and other forms
                of sharing of imagery online without consent. What
                approaches are taken to addressing these issues in
                other jurisdictions?
  + Answer :: This submission is not offering any answer to this
              question.
- Question 5 :: New offences are proposed to cover these issues in
                Deputy Brendan Howlin’s Private Members Bill on
                this subject. Is the creation of new offences
                necessary, or is existing legislation sufficient?
                Should other forms of image-sharing issues - such
                as exposure - also be addressed?
  + Answer :: This submission is not offering any answer to this
              question.
- Question 6 :: What kind of oversight and regulation of online
                service providers is possible/used in other
                jurisdictions? Currently, online providers are self
                regulated. Is a proactive, self-regulating approach
                from online companies to activities such as revenge
                porn and other forms of harassment preferable to
                the creation of more laws?
  + Answer :: If a measure of self-regulation to address these
              concerns is acceptable, then it would be necessary,
              for public-perception reasons, to be clear on what
              that means. /Self-regulation/ could mean a regime in
              which each service operator manages matters of
              harassment and harmful communications according to
              its own rules and processes; this is currently how
              the large service providers we're most familiar with
              operate. However, /self-regulation/ may also refer to
              regulation by a non-governmental, industry-funded
              body, following the model of the press council or the
              advertising standards authority, where rules and
              processes are agreed among the operators as a set of
              standards, and where decisions on compliance with
              these are made by that body.

              Aside from this comment on the term, what matters
              more is getting the competing rights correctly
              balanced, rather than the particular model of
              regulation that asserts that balance.

- Question 7 :: Is any data provided by online service providers in
                relation to the reporting or prevalence of
                activities such as upskirting/revenge
                porn/cyberbullying and other online behaviour that
                can be used to develop and draft future
                legislation?
  + Answer :: Each of the major sites prepares what are called
              "Transparency Reports". However, many of these
              reports are constrained by rules laid out by (in
              particular) the so-called "Intelligence Community" of
              the United States. Thus these reports are not as
              transparent as they could be.

              It should be a requirement for such services to issue
              a periodic report detailing the following statistics
              for each period:
              * Number of reported postings, broken down by nature
                of the complaint
              * Number of reports that were appealed to the
                service, broken down by the nature of the complaint
                and the basis of appeal
              * Number of appeals upheld, broken down by reason for
                appeal
              * Number of appeals rejected, broken down by reason
                for rejection
              * Number of complaints/appeals that were appealed
                further to the regulator or courts system.
- Question 8 :: To what extent are An Garda Síochána equipped and
                resourced to deal with the issues arising from
                harmful online communications such as these?
  + Answer :: This submission is not offering any answer to this
              question.
- Question 9 :: Should ‘cyberstalking’ be treated as a separate
                offence to online harassment? What constitutes
                stalking-type behaviour online? Is there a need to
                legislate specifically for this activity?
  + Answer :: This submission is not offering any answer to this
              question.
- Question 10 :: Based on the findings of other jurisdictions such
                 as in the UK, An Garda Síochána will require
                 consistent training in order to maintain an
                 appropriate level of knowledge with regard to
                 indictable behaviours. Are resources available for
                 this?
  + Answer :: This submission is not offering any answer to this
              question.
- Question 11 :: Fake accounts/troll accounts used to harass or
                 target others with abuse – what measures can be
                 taken in relation to these without affecting
                 freedom of expression?
  + Answer :: The assumption that an account that isn't clearly
              associated with a personal identity is "fake" needs
              to be challenged. It is the /behaviour/ of the
              account that needs to be considered. This is true of
              accounts that are associated with identifiable
              individuals as well as of pseudonymous
              accounts[fn:trolls:A well-known Irish public figure
              who offers commentary on many aspects of society
              frequently posts messages on Twitter designed to
              elicit angry responses. I describe this person as "a
              master of the false equivalence". This is classic
              online trolling behaviour. Similarly, on 18th
              September 2019, a prominent UK journalist tweeted
              personal details of a father who publicly challenged
              UK Prime Minister Boris Johnson regarding the state
              of the NHS. This was construed by many as deliberate
              trolling to inflict a measure of unofficial
              retribution on the man.].

              It should not be assumed that pseudonymous accounts
              are created in order for the users to escape legal
              consequences for criminal communications. There are
              many reasons for maintaining a pseudonymous presence
              online, some of which I have personally encountered
              being:
              - To protect against a physically abusive family
                member
              - To protect against an employer that monitors online
                activities
              - To engage online in a manner that deals with
                prejudices (e.g. many respond to women differently
                than to men, and to people of a different religion
                or skin colour than to those of the same religion
                or skin colour, etc.)
              - To protect against action from their own
                governments, whose laws are less respectful of
                civil rights than we would think Ireland's are.

              It should not be assumed that a pseudonymous account
              has been created for reasons of abuse or harmful
              communication. In fact, there's good reason to assume
              that the significant majority of pseudonymous
              accounts operate for completely innocent
              reasons[fn:realnames:Facebook excepted. However,
              Facebook's real-name policy is itself wrong, and does
              a great deal of damage to people who have good
              reasons for their names not to be associated with
              their online presences.].
- Question 12 :: Do other jurisdictions have statutory measures to
                 protect victim identities in cases of online
                 harassment being released online post-hearings,
                 etc?
  + Answer :: This submission is not offering any answer to this
              question.
** CONSDONE Harmful online behaviour and young people
- Question 13 :: How do we most appropriately regulate social media
                 platforms to prevent cyberbullying and
                 inappropriate sharing of personal images?
  + Answer :: I refer you to the details of this submission.
- Question 14 :: For young people who participate in such online
                 behaviour as consensual image sharing, how can it
                 be ensured that they are not inadvertently
                 criminalised when legislation is enacted? What
                 safeguards can be put in place?
  + Answer :: This submission is not offering any answer to this
              question.
- Question 15 :: Deputy Brendan Howlin’s Private Members Bill
                 provides that those under 17 should not be
                 fined/imprisoned but put into relevant education
                 or supports. Would these supports be part of the
                 same educational supports offered to all young
                 people/schools or would they be a separate entity?
                 Are current supports being utilised? Are there
                 sufficient resources to provide for such a
                 provision when enacted?
  + Answer :: This submission is not offering any answer to this
              question.

* CONSDONTDO Answers to consultation questions :noexport:
** CONSTODO Strand 1 -- National Legislative Proposal