RE: ping attribute again (was Re: Big PING Ideas)

Is there a use case for link tracking that does not involve a privacy threat?

 

Gathering aggregated statistics could be one; we could suggest ways to do that without “singling out” individual users.

 

The ping/beacon/XHR requests should carry no cookies by default, as Safari now ensures. ITP 2.2 also now restricts the entropy of the target URL by removing the query and fragment in some circumstances.

 

This still allows arbitrary high-entropy components in the path, but we could restrict that too. Say that, by default, ping/beacon/XHR could only point to fixed .well-known paths on the top-level domain, with cookies stripped. Anything else would require a user prompt similar to the one triggered by the Storage Access API.
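To make the proposed restriction concrete, here is a minimal sketch of what such a ping-URL check might look like. This is not any browser's actual implementation; the function name and the exact-host comparison are illustrative assumptions (a real implementation would compare registrable domains using the Public Suffix List).

```python
from urllib.parse import urlsplit, urlunsplit

def sanitize_ping_url(ping_url: str, document_host: str):
    """Sketch of the restriction proposed above: allow a ping URL only
    if it points at a fixed /.well-known/ path on the page's own host,
    with its query and fragment stripped (as ITP 2.2 already does in
    some circumstances). Returns the sanitized URL, or None if the
    ping would instead require a user prompt.

    Simplification: exact hostname comparison stands in for a real
    registrable-domain check against the Public Suffix List.
    """
    parts = urlsplit(ping_url)
    if parts.scheme not in ("http", "https"):
        return None
    if parts.hostname != document_host:
        return None
    if not parts.path.startswith("/.well-known/"):
        return None
    # Drop query and fragment to limit entropy in the reported URL.
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
```

Under this sketch, `sanitize_ping_url("https://example.com/.well-known/ping?id=123#x", "example.com")` yields `"https://example.com/.well-known/ping"`, while a cross-site tracker URL yields `None` and would fall through to the user prompt.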

 

From: Nick Doty <npdoty@ischool.berkeley.edu> 
Sent: 06 May 2019 20:56
To: public-privacy (W3C mailing list) <public-privacy@w3.org>
Cc: David Singer <singer@apple.com>; Rigo Wenning <rigo@w3.org>; Pete Snyder <psnyder@brave.com>
Subject: ping attribute again (was Re: Big PING Ideas)

 

There seems to be renewed discussion of the ping attribute topic, as Rigo mentioned and in some other fora, so I wanted to collect some of the relevant links and move informal discussion from Slack to an archival form here.

 

There appears to be interest in implementing the `ping` attribute for “hyperlink auditing” (link click tracking), with implementations from Google, Apple, and Microsoft, and some suggestion that Mozilla would implement it as well. This also got some news coverage when it appeared that Chromium/Chrome and Safari would remove user settings to disable those background pings.
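For anyone not familiar with the feature, a minimal illustration of the markup (the tracker URL is a placeholder):

```html
<a href="https://example.com/article"
   ping="https://tracker.example/click">Read the article</a>
```

When the link is followed, the browser sends a background POST request to each URL listed in `ping`, without delaying the navigation — which is exactly why it is attractive for link click tracking compared to redirect- or XHR-based schemes.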

 

As we've raised before, the HTML spec currently has normative requirements for user control and user transparency for the `ping` attribute, which have seen zero implementations (that I know of) since the feature was added to the spec in 2007. Here are my comments from May 2018 regarding our review of HTML 5.3 and the re-appearance of `ping`:

https://lists.w3.org/Archives/Public/public-html/2018May/0027.html

And the HTML group heard our concern about the lack of implementation of user transparency for this feature in a previous call in April 2017. That repeated a concern that was expressed in 2007, so it’s been a pretty long-lasting topic.

 

Concerning those implementations, here are the Chromium bugs for user control and user transparency:

https://bugs.chromium.org/p/chromium/issues/detail?id=935978

https://bugs.chromium.org/p/chromium/issues/detail?id=951611

 

And here are the Firefox bugs for turning the feature on by default and the bugs for user control and transparency (which are currently marked as blocking, which I’m heartened by, given that those are the normative requirements in the spec):

https://bugzilla.mozilla.org/show_bug.cgi?id=951104

https://bugzilla.mozilla.org/show_bug.cgi?id=1546198

https://bugzilla.mozilla.org/show_bug.cgi?id=401352

 

Apple’s John Wilander has posted WebKit’s explanation of why the feature is on by default and that they are removing any user controls: https://webkit.org/blog/8821/link-click-analytics-and-privacy/

I’m not sure the reasoning for enabling the feature by default explains why users must also lose the option to disable it, but it’s good to have the reasoning explicitly described.

On Apr 30, 2019, at 1:44 AM, Rigo Wenning <rigo@w3.org> wrote:

 

Concerning privacy by design, for the moment, I see the clear opposite happening:
https://html.spec.whatwg.org/multipage/links.html#hyperlink-auditing

What else would you need for perfect monitoring? Why would I do complex fingerprinting if I get all I want on a silver platter?

Note that this is NOT a W3C specification. 

 

I share your concern, but this isn’t a new design and there is very similar text regarding that feature in the W3C specification:

https://www.w3.org/TR/html53/links.html#hyperlink-auditing

The W3C HTML 5.3 spec currently recommends that authors affirmatively use this feature, and then has a warning box to point out that the privacy benefits (which were the original intent of this feature) “are still hypothetical” as no browser has implemented either transparency or user controls.

 

When we raised these issues with the W3C HTML folks, it was tracked in this issue:

https://github.com/w3c/html/issues/1456

That also got some discussion from the TAG, although the TAG seems to have lost track of what their particular comments were on the feature. I’m not convinced the HTML resolution of that issue is an improvement, though it at least explicitly notes the lack of implementations of its requirements.

 

There’s an opportunity for the `ping` attribute to be a privacy improvement on the current system of redirects and XHRs, but it’s only an improvement if the user transparency and user control features are actually implemented. If they’re not, then users have less awareness that link click tracking is happening, and sites that track clicks get a performance boost over the status quo ante, where they were a little slower than sites that don’t track clicks. Tying performance improvements to privacy features (like visibility and the option to disable) is one way to pave the way for better implementations of common practices.

 

I wanted to pass these links and thoughts along now in part because I can’t continue spending a lot of unpaid time on this topic. We should remember that it’s not sustainable in the long term for privacy reviews to depend on graduate student volunteers. To refer back to the original thread, sustainable funding models for this kind of work could be an important Big Idea for more systematic privacy design for the Web.

 

Cheers,

Nick

Received on Tuesday, 7 May 2019 08:01:36 UTC