Involving page owners in annotation

Dear all

In light of recent events referenced on the Hypothesis blog [0], we
launched a small research initiative to gather different points of view
about website publishers' and authors' consent to annotation.  Our goal
was to identify possible paths forward, taking into account the
perspectives of publishers, engineers, developers, and people working on
abuse and harassment issues on the web.

We posted this same summary on the Hypothesis blog today [1].

We interviewed 13 individuals on issues of author preferences, product
design for abuse prevention, content policy, and other related topics,
mostly in relation to the Hypothesis product.  The interviewees included
software developers, people with Trust & Safety experience at major
companies (Twitter, Pinterest, Facebook), Hypothesis users, product
managers and UX researchers, and individuals working specifically on
online comments and on populations vulnerable to online abuse.  The full
list can be found in this document [2].

We asked our interviewees to consider the goal of providing website
owners with more control over how annotation works with their content.
Some guiding questions we suggested:

* Should we let page owners register a preference that their content not
be annotated?

* Could users override that preference for reasons consistent with a set
of community guidelines (e.g. the content is in the public interest)?

* Should we do nothing?  That is, is the status quo acceptable: if you
don't want to see the annotations, don't enable the annotation layer.

* In your view, what’s the best approach?


Main takeaways:

Involve authors. Perhaps the most important takeaway is that the
annotation process should ideally include the author of the web content.
A few of the people interviewed expressed the need for a way to inform
the site owner or author that their website or content is being
annotated. There is disagreement about what role the author should play
in the overall moderation of discussion about their content.

“My first instinct is that I like the idea of a choice.  Of giving authors
the ability to engage in a conversation, or not. [...] There should be
exemptions for government websites, newspapers, corporations.[...] It’s not
easy to write a policy or an algorithm to determine who has power.” Larisa
Mann, Researcher

“I was in favor of all comments all the time, but that quickly fell apart.
Provide some kind of opt out. People want to have some privacy.  A degree
of privacy means have people not commenting on your content.” - Matt
Carroll, MIT Lab

“Worst case for me is people start using annotation as a way of having
comments [where comments have been disallowed] and back channels about
people who have tried to secure their presence online. ” Sydette Harris,
Coral Project



Opt-out of proxies.  For some of the people interviewed, having your
content "travel" with an additional layer of comments you don't control
is particularly disempowering. Many mentioned that they wouldn't mind
their content being annotated via a browser extension, but that having
their sites proxied was problematic. This second case was often seen as
"appropriating" content and re-circulating it without the author's
consent, in some cases (depending on the implementation) even re-hosting
the content on a different server. Proxies can also distort traffic
metrics and thus interfere with targeted advertising. Some worry that as
annotation services become more popular, the proxied links may receive
more traffic than the original content.
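
A hypothetical illustration of why proxying shifts traffic metrics (the
proxy host and URL scheme below are invented for this sketch, not
Hypothesis's actual implementation): a proxy serves the page from its
own domain, so the browser's request registers against the proxy's host
rather than the author's.

```python
from urllib.parse import quote, urlparse

# Hypothetical proxy host, used for illustration only.
PROXY_HOST = "via.example.org"

def proxied_url(original: str) -> str:
    """Wrap an original page URL in a proxy-style URL.

    The browser now requests PROXY_HOST instead of the original host,
    which is why the origin's traffic metrics no longer see the hit.
    """
    return f"https://{PROXY_HOST}/{quote(original, safe='')}"

original = "https://blog.example.com/post/42"
wrapped = proxied_url(original)

# The host actually contacted is the proxy's, not the author's.
print(urlparse(wrapped).netloc)   # via.example.org
print(urlparse(original).netloc)  # blog.example.com
```

Any analytics keyed to the original hostname sees none of the proxied
visits, which is the distortion interviewees pointed to.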

“At the point you become popular, many many people looking at the web will
see the annotations.  The fact that you’re not technically defacing a
website, at that point doesn’t make much of a difference.” - Peter Seibel,
Twitter Trust and Safety Engineering


Annotation opt-out.  Many interviewees expressed the need to provide a
way to opt out of annotation completely.  Ideally there would be some
granularity, for example a per-article way to disallow annotation.  Many
interviewees went further, suggesting that blogs and personal content
should be opt-in for annotation. The most vulnerable demographics would
prefer to have more control over what happens with their content.

“Personal blogs, personal info should be always opt-in. People should not
have to explain why they want out, they should just be able to stay out. To
have good will around this is important, because it eventually brings
better quality to annotation.  ” Sydette Harris, Coral Project

“From the perspective of the most targeted demographics, annotation should
be opt-in. It doesn’t resolve the issue, people will always be able to
annotate or comment on it elsewhere, but it doesn’t become another vector
for abuse” - Anika Gupta, Beyond Comments
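
If a per-page opt-out preference were adopted, one simple mechanism
would be a meta tag in the page itself, similar in spirit to robots
directives.  This is only a sketch: the tag name "annotation" and the
value "noannotate" below are invented for illustration, not an existing
standard or a Hypothesis feature.

```python
import re

# Hypothetical opt-out directive; the names are invented for this sketch.
OPT_OUT_RE = re.compile(
    r'<meta\s+name=["\']annotation["\']\s+content=["\']noannotate["\']',
    re.IGNORECASE,
)

def page_allows_annotation(html: str) -> bool:
    """Return False if the page declares the hypothetical opt-out tag."""
    return OPT_OUT_RE.search(html) is None

opted_out = '<html><head><meta name="annotation" content="noannotate"></head></html>'
neutral = "<html><head><title>A post</title></head></html>"

print(page_allows_annotation(opted_out))  # False
print(page_allows_annotation(neutral))    # True
```

An annotation client could check this flag before enabling its layer;
whether (and when) the flag may be overridden is exactly the policy
question discussed in the next section.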


Author preferences vs the Public Interest.  A possibility suggested by
some was to record the web page owner's preference, but leave it to the
annotation service to decide whether to honor it.  This would allow
users to override the preference when content is considered to be in the
public interest.  While this seemed like a good compromise to some,
others felt it would be misleading for website authors, since it would
be hard to communicate what level of control they really have over
whether their pages are annotated. It also places the policy burden of
determining what is in the public interest on the annotation service.

“[If you allow users to ignore the author’s preference] you’re saying “we
think you should have consent but not really”.  This scenario puts the
service in the position of having to deal with moral decisions, and the
human moderation component is hard to scale ” - Randi Lee Harper, Online
Abuse Prevention Initiative

“There’s the issue of Individual vs institutional content.  Government,
media, agencies should be fair game. Maybe creative work, meant to be put
out in the public sphere. The intent of the author is important.”  - Anika
Gupta, Beyond Comments


Other processes. A few interviewees directed us to review similar
decision processes, such as Google's work on the Right to be Forgotten
[3], which balances privacy and freedom of expression, and the Internet
Archive's Wayback Machine exclusion policy [4].


This is a summary of our conversations around the issue of author
preferences and consent only.  Other notes are available for those
interested in product design (for example, how best to design an abuse
reporting system) and service policy, and we may publish them later on
the blog.  If you have additional feedback on this topic, please contact
us at abuse-prevention@hypothes.is.

[0] https://hypothes.is/blog/preventing-abuse/
[1] https://hypothes.is/blog/involving-page-owners-in-annotation/
[2]
https://docs.google.com/document/d/1E1-lcRYZQ1zRWMG-wq0gNUI3CMK4HbxjUdWqY653iWk/edit
[3]
https://drive.google.com/a/hypothes.is/file/d/0B1UgZshetMd4cEI3SjlvV0hNbDA/view
[4]
http://web.archive.org/web/20140812200246/http://www2.sims.berkeley.edu/research/conferences/aps/removal-policy.html

Received on Tuesday, 24 May 2016 17:29:11 UTC