More proof-reading

Éibhear Ó hAnluain 2019-09-19 19:51:39 +01:00
parent cd3e65d314
commit 6ed44e21c2


@@ -445,14 +445,14 @@

  relevant protocols.
- Due to the maturity and age of these protocols, software needed to use them is now abundant and trivially easy to get, install and run on any general-purpose computer. Such software is also very easy for moderately-skilled software engineers to develop.
- Neither the protocols that define the internet, nor the software that implements them, regards any computer as superior or inferior to any other computer. For this reason, there is no cost or capacity barrier to running an internet service: if you have the software and the internet connection, then you can expose an online service (a minimal sketch follows this list).
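
To make that point concrete, here is a minimal illustrative sketch of my own (not part of the original submission): using nothing but the standard library of a freely-available language such as Python, anyone can expose a working web service from an ordinary computer. The choice of Python, of =SimpleHTTPRequestHandler= and of port 8080 are incidental assumptions made only for the illustration.

#+BEGIN_SRC python
# Illustrative sketch: serve the files in the current directory over HTTP
# using only the Python standard library. Making this reachable from the
# wider internet needs nothing more than an internet connection (and,
# typically, a port-forwarding rule on the local router).
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Port 8080 is an arbitrary choice for this sketch.
HTTPServer(("0.0.0.0", 8080), SimpleHTTPRequestHandler).serve_forever()
#+END_SRC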

Clear examples from the past of how the accessibility of the internet technologies has benefited the world include the

@@ -500,14 +500,14 @@

a photographer, Jon Oringer, who developed the service as a means to make available 30,000 of his own photographs.

The ease with which internet technology can be accessed has been instrumental in the explosion of services that connect people with one another, and people with businesses.

It is critical to note that many of these technologies and services started out with an individual or small group developing an idea and showing that it could work *prior* to receiving the large capital investments that resulted in their current dominance.

All of the above technologies and services can be considered truly disruptive. In their respective domains, their arrivals resulted in

@@ -534,7 +534,9 @@

or instance. For administrators of instances, federation means that they can configure their instances according to their own preferences, rather than having to abide by the rules or technical implementation of someone else. For the ecosystem, federation means that if one node goes down or is attacked, the others can continue with a minimum of interruption (a rough sketch of this property follows).
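
As a rough illustration of that resilience (a sketch of my own, not drawn from any particular federated implementation, with made-up instance URLs), a peer or client that knows several instances simply skips one that is unreachable and carries on with the rest:

#+BEGIN_SRC python
# Illustrative only: poll a few hypothetical federated instances and
# carry on past any that are down. The URLs are invented for the sketch.
from urllib.request import urlopen
from urllib.error import URLError

instances = [
    "https://instance-a.example",
    "https://instance-b.example",
    "https://instance-c.example",
]

for url in instances:
    try:
        with urlopen(url, timeout=5) as response:
            print(url, "is up:", response.status)
    except (URLError, OSError):
        # One node being down or under attack does not stop
        # interaction with the others.
        print(url, "is unreachable; continuing with the rest")
#+END_SRC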

** CONSDONE Regulation of self-hosted services

@@ -543,14 +545,15 @@

regulations don't harm the desire of those who want to create their own services.

A regulation that imposes liability on a service-provider for someone else's behaviour is a regulation that can be adhered to only by organisations with large amounts of money to hand. For example, if the regulation were to impose liability on me for a posting made by someone else that appears on one of the services that I run (and that was likely originally posted *somewhere* else -- these are federated services, after all), I would have to shut the service down; I am not able to put in place the necessary technical or legal infrastructure that would mitigate my liability[fn:copyrightDirective:This assumes that my services aren't forced to shut down by the new EU Copyright Directive anyway]. Given that my services are intended to provide a positive

@@ -572,24 +575,24 @@

regulations have the effect[fn:unintended:unintended, one hopes] of harming such small operators, the result will not just be the loss of these services, but also the loss of the opportunity to make the Web richer, because those regulations will impose artificial barriers to entry. They will inhibit the development of ideas that pop into the heads of individuals, who would realise them with nothing more than a computer connected to the internet.

* CONSDONE Other considerations

While the main focus of this submission is to highlight the potential risk to self-hosters from regulations that neglect to consider the practice, I would like to take the opportunity to briefly raise some additional concerns.

** CONSDONE Abuse of the systems

To date, all systems that seek to protect others from harmful or otherwise objectionable material (e.g. copyright infringement, terrorism propaganda, etc.) have been easy to abuse. For example, in a recent court filing, Google claimed that 99.97% of copyright infringement notices it received from a single party in January 2017 were bogus[fn:googleTakedown:https://www.techdirt.com/articles/20170223/06160336772/google-report-9995-percent-dmca-takedown-notices-are-bot-generated-bullshit-buckshot.shtml]:

@@ -609,19 +612,16 @@

index.
#+END_QUOTE

Under the US Digital Millennium Copyright Act, there is no downside for a bad-faith actor seeking to take advantage of a system for suppressing information[fn:downside:The law contains a provision that claims of copyright ownership on the part of the claimant are to be made under penalty of perjury. However, that provision is very weak, and seems not to be a deterrent for a determined agent: https://torrentfreak.com/warner-bros-our-false-dmca-takedowns-are-not-a-crime-131115].

The GDPR's /Right to be Forgotten/ is also subject to abuse. An individual from Europe continues to force Google to exclude stories related to him from its search results. However appropriate this may be on the face of it, the stories this individual is now getting suppressed relate to his continued abuse of the /Right to be

@@ -637,46 +637,47 @@

need to.

In systems that facilitate censorship[fn:censorship:While seeking to achieve a valuable and socially important goal, legislation of this nature facilitates censorship: as a society, we should not be so squeamish about admitting this.], it is important to do more than merely assert that service providers should respect fundamental rights to expression and information. In a regime where sending an e-mail costs nearly nothing, where a service risks serious penalties (up to and including having to shut down), and where a claimant suffers nothing for abusive claims, abuse is guaranteed.

** CONSDONE Content Moderation

Much of the focus of legislative efforts to deal with harmful or objectionable material that appears on services that permit uploads from users is on what the service providers do about it. Many argue that they are not doing anything, or at least not enough.

However, this is an unfortunate mischaracterisation of the situation. For example, facebook employs -- either directly or through out-sourcing contracts -- many tens of thousands of "moderators", whose job is to decide whether or not to remove offensive material, and thereby whether or not to suppress someone's freedom of expression, based on a set of if-then-else questions. These questions are not easy (a simplified sketch of them follows the examples below):
- It's illegal in Germany to say anything that can be construed as glorifying the Holocaust. In the US it isn't. Facebook can suppress such information from users it believes are in Germany, but to do so for those in the US would be an illegal denial of free expression, regardless of how objectionable the material is. What is facebook to do with users in Germany who route their internet connections through the US? Facebook has no knowledge of this unusual routing, and to seek to learn about it could be a violation of the user's right to privacy. Should facebook be criminally liable for a German user seeing statements that are illegal in Germany?
- Consider the genocide of Armenian people in Turkey in 1915. In Turkey it is illegal to claim it happened. However, for a period between 2012 and 2017 it was illegal in France to claim it didn't happen. In most other countries, neither claim is illegal. What can a service like facebook do when faced with 3 options, 2 of which are mutually exclusive? Literally, they would be criminally liable both if they do /and/ if they don't[fn:dink:Prior to his assassination in Istanbul in 2007, Hrant Dink, an ethnic Armenian Turkish journalist who campaigned

@@ -685,39 +686,45 @@

contradictions with laws that criminalise statements of fact.]?
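
To illustrate what those if-then-else questions look like once written down, here is a deliberately simplified sketch of my own; it is a hypothetical construction for this submission, not any service's actual moderation policy, and even this handful of rules depends on facts -- such as the user's true jurisdiction -- that a service often cannot know without violating the user's privacy.

#+BEGIN_SRC python
# Hypothetical, highly simplified moderation rules based on the examples
# above. Real policies run to thousands of such branches, and the inputs
# (the user's true jurisdiction, the intent behind a posting) are exactly
# the things a service cannot reliably know.
def must_remove(jurisdiction: str, claim: str) -> bool:
    # Glorifying the Holocaust: illegal in Germany, lawful (however
    # objectionable) in the US.
    if claim == "glorifies the Holocaust":
        return jurisdiction == "DE"
    # The 1915 Armenian genocide: affirming it is illegal in Turkey;
    # denying it was illegal in France between 2012 and 2017.
    if claim == "the genocide happened":
        return jurisdiction == "TR"
    if claim == "the genocide did not happen":
        return jurisdiction == "FR"  # and only for 2012-2017
    return False
#+END_SRC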

Moderators have no more than a minute to determine whether a statement complies with the law or not, and this includes figuring out whether the posting meets the definition of abusive or harmful, and whether it is indeed intended to meet that definition. For example, consider an abusive tweet. Should the harmful, abusive tweet be removed? Who decides? What if the target of the abusive tweet wants that tweet to be retained, for, say, future evidence in a claim? What if the tweet was an attempt at abuse, but the target chose not to be affected by it? Should it stay up? Who decides? What if the target doesn't care, but others who see the tweet, and who are not the target of the abuse, are offended by it? Should it be taken down as abusive even though the target of the abuse doesn't care, or objects to its removal? Who would be criminally liable in these situations? What if the target of the abuse substantially quotes the abusive tweets? Is the target now to be considered an offender under a criminal liability regime when that person may be doing nothing other than /highlighting/ abuse?

All of these scenarios are valid and play out every day. Content moderators need to consider these and many more questions, but get very little time to do so. The result: a public perception, promoted by public figures, that these large services are doing nothing about abuse.
"Content moderation" is very hard, and is impossible at the scales "Content moderation" is very hard, and is impossible at the scales
that services like twitter or facebook operate in. When context is that services like twitter or facebook operate in. When context is
critical in deciding whether to decide someone is engaged in critical to decide that someone is engaged in harmful or abusive
harmful or abusive behaviour, it would be fundamentally unfair to behaviour, it would be fundamentally unfair to make a service
make a service criminally liable just because it made the wrong criminally liable just because it made the wrong decision as it
decision as it didn't have time to determine the full context, or didn't have time to determine the full context, or because it
because it misinterpreted or misunderstood the context. misinterpreted or misunderstood the context.

** CONSDONE User Behaviour

Many believe that the way to deal with abusive or harmful material online is to punish the services that host the material. This is reasonable if the material was placed onto the service by those who operate the service. It is also reasonable if the material is put there by users with the clear knowledge of the service operator, or by users following encouragement from the operators of the service.

However, these specific situations are rare in the world of normal online services[fn:criminal:Services that are dedicated to hosting

@@ -732,8 +739,7 @@

the communication is made. The idea that internet services are responsible for abusive communications is as difficult to understand as the idea that a table-saw manufacturer is responsible for a carpenter not wearing safety glasses.

Recent history has shown that the most effective ways to change behaviour are not necessarily punitive. It's hard to see how