PING@TPAC 2015 report

Hello all,

Belatedly, here is the report from the PING meeting at TPAC 2015 in
Sapporo, which took place on Friday 30 October 2015. Thanks to all
attendees (~20 members and observers) and of course extra thanks to the
scribes: Hadley Beeman, Nick Doty, and Karen O'Donoghue.

Agenda: https://www.w3.org/wiki/Privacy/TPAC_2015_Agenda

Main items from agenda:

* Fingerprinting Guidance
Nick Doty presented the Fingerprinting Guidance document [1], which was in
the final stages of discussion prior to publication as a draft Group Note.
The document has been reviewed by both PING and the TAG, and has received
additional outside feedback. Some changes were made in response to the TAG
comments, particularly with regard to the TAG Finding on Unsanctioned Web
Tracking [2]. The fingerprinting document can definitely
still be built upon, and suggestions were made concerning concrete examples
and specific technical guidance for specification developers to mitigate
these issues. At this stage the consensus from the group (after two rounds
of approval on the mailing list) was that it was ready to be published as a
draft Group Note. (NB: This Group Note was published after the TPAC
meeting, on 24 November 2015 [3].)

* Privacy and Security Questionnaire
PING has been working on a Privacy and Security Questionnaire [4], which
has received a lot of recent work from Joe Hall and Greg Norcie (who were
unfortunately unable to attend TPAC). A similar document has been developed
through the TAG, in the form of a security and privacy self-questionnaire
[5]. One of the items to resolve was how these two documents should evolve,
and for what purposes (e.g., should they be merged in any way?). Discussion
on these documents indicated support for two different documents: a short,
high-level TAG document that is mostly for specification authors, with the
PING questionnaire being a more detailed document that can be used for
reviewing specifications (e.g., by PING reviewers) or for those looking for
more in-depth privacy guidance. Further feedback and contributions are
solicited, both for the TAG and PING questionnaires.

* Items PING can contribute to
A number of documents have been brought to PING for contributions; these
were discussed during the TPAC meeting (e.g., at the Web Application
Security Working Group meetings).

- Secure Contexts - W3C Editor’s Draft [5]
There was a review request for the section on "Risks associated with
non-secure contexts".
Preliminary discussion at the PING meeting highlighted the value of using
consistent definitions (e.g., passive and active network attacker), and the
ongoing challenge of ensuring users have a clear and reliable understanding
of what is happening to their data.

- Clear Site Data - W3C Editor’s Draft [6]
This document discusses a means for web developers to instruct a user agent
to clear a site’s locally stored data for a particular host. (One use case
would be as a means for a website to recover after an attack, to ensure
that clients did not retain non-trustworthy data.) Discussion identified
the possible user discomfort caused by unexpected data deletion (although
this mechanism does not permit, for example, deletion of history); the
question of whether locally stored items outside the origin's control
retain state (and thus ideally should also be cleared); and the question
of when permissions get cleared.
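
As a sketch of the mechanism (the exact header syntax has varied across
drafts, so treat the directive names below as illustrative rather than
normative), a server recovering from an attack might send something like:

```http
HTTP/1.1 200 OK
Content-Type: text/html
Clear-Site-Data: "cache", "cookies", "storage"
```

On receiving such a response, a conforming user agent would clear the named
categories of locally stored data for the response's origin.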

- Private browsing mode [7]
This topic was raised back in August [8], around the idea of clearing state
after visiting a website with sensitive content (i.e., trying to assist a
user in removing traces of a site visit). François Légaré gave a
presentation, which included a draft proposal put together for Firefox [9],
which outlines several privacy and security considerations and shows some
mockups of the user experience. The plan is to do more user research and
prototyping. Discussion indicated there is broad concern around private
browsing (across a wide range of contexts), and a concerted effort may be
needed to push work forward in this domain.

* Ongoing privacy issues for W3C work

- Service Workers and related issues
There was a short discussion around issues raised by Service Workers [10],
with a focus on the Web Background Synchronization spec [11]. The core idea
is to deal with syncing over unreliable networks by launching a service
worker event in the background to continue synchronization attempts. Two
privacy considerations had already been identified: location tracking
(e.g., revealing the client's IP address to the server after the user has
left the page) and history leaking (e.g., if the client changes network,
passive eavesdroppers on the new network may see fetch requests).
Discussion noted that this was similar to the Beacon spec [12], from the
Web Performance Working Group. Mike West
also noted he was working on a related (new) item, a sort of
"one-stop-shop" reporting mechanism [13], and was looking for feedback.
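
For flavor, the "defer work until connectivity returns" idea behind
background sync can be sketched in plain JavaScript. This is a stand-in
simulation with invented names (SyncQueue, register, setOnline); it is not
the actual service worker interface, which fires 'sync' events on a
registered worker:

```javascript
// Illustrative simulation of the background-sync retry idea.
// All names here are invented for the example.
class SyncQueue {
  constructor() {
    this.online = false;
    this.pending = new Map(); // tag -> deferred task
  }

  // Remember work to retry later (loosely analogous to registering a sync).
  register(tag, task) {
    this.pending.set(tag, task);
    this.flush();
  }

  // Connectivity change triggers another attempt.
  setOnline(online) {
    this.online = online;
    this.flush();
  }

  // Run (and clear) pending tasks only when connectivity is available,
  // mirroring how the user agent would wake the worker in the background.
  flush() {
    if (!this.online) return;
    for (const [tag, task] of this.pending) {
      task(tag);
      this.pending.delete(tag);
    }
  }
}
```

Because the queued task may fire long after the user has left the page,
the retry itself is what creates the location- and history-leak surface
discussed above.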

- Push API [14]
Related to the previous item are concerns around tracking in the background
- when there is no UI, how is the user aware of what is happening? Push
services, as presented in the API, cover such cases as "the webapp is not
currently active in a browser window...the user may close the webapp, but
still benefits from the webapp being able to be restarted when a push
message is received." Finding a good option here is a challenge. Nick
Doty suggested restrictions such as "only need visibility if you have
outgoing network activity", and pointed to work on asynchronous notice as
a "privacy pattern" design solution [15].

* AOB
David Singer initiated a discussion on sharing data with sites, trying to
find a middle ground between individual popups and over-broad consent. The
idea is to use a digitally signed file: a list of the things the site
needs <e.g., heart rate [and nothing else]>, with a promise to abide by
<privacy regulation X>, with the agreement good for <date range>. The
browser can then mediate the interaction, and there is a record of what
happened to data on which site; this allows a "promise violation" to be
identified if the site shares data beyond what was in the agreement.
Discussion brought up issues such as how the user negotiates the agreement
and how revocation of consent is managed (e.g., if the policy changes
after data has already been given away). This proposal may be developed
and discussed further.
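
Purely as an illustration of the shape such a file might take (the field
names below are invented; the source only sketches the idea), a signed
agreement could look something like:

```json
{
  "data-requested": ["heart-rate"],
  "complies-with": "<privacy regulation X>",
  "valid-for": "<date range>",
  "signature": "<site's digital signature over the fields above>"
}
```

The browser could store this alongside a log of disclosures, giving both
sides a record to check against if the site later uses data beyond what it
requested.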

[1] https://w3c.github.io/fingerprinting-guidance
[2] http://www.w3.org/2001/tag/doc/unsanctioned-tracking
[3] http://www.w3.org/TR/2015/NOTE-fingerprinting-guidance-20151124
[4] https://www.w3.org/wiki/Privacy_and_security_questionnaire
[5] https://w3c.github.io/webappsec-secure-contexts/
[6] http://w3c.github.io/webappsec-clear-site-data/
[7] https://lists.w3.org/Archives/Public/public-webappsec/2015Sep/0016.html
[8] https://lists.w3.org/Archives/Public/public-privacy/2015JulSep/0087.html
[9] https://wiki.mozilla.org/Security/Automatic_Private_Browsing_Upgrades
[10] https://slightlyoff.github.io/ServiceWorker/spec/service_worker/
[11] https://slightlyoff.github.io/BackgroundSync/spec/
[12] https://w3c.github.io/beacon/
[13] https://mikewest.github.io/error-reporting/
[14] https://w3c.github.io/push-api/
[15] http://privacypatterns.org/patterns/Asynchronous-notice

- Tara

Received on Thursday, 3 December 2015 08:11:34 UTC