Re: ISSUE-151 Re: Change proposal: new general principle for permitted uses

Shane,

 

My suggestion is an enhancement of Rigo's idea of checking whether the UGE
API is present. The third-party advertiser simply checking for the existence
of the API would work for, say, IE10, but not for IE11 with its requests
intercepted by a DNT-injecting router, proxy or other software component. It
would also require an extra round trip, as you point out.

 

The idea is that script in the first-party page detects contradictory DNT
signals and causes this to be communicated to third parties. The publisher
wants the advertising revenue, so if it detects DNT it will attempt to get
consent for its third parties using a UGE. If a script library finds that
DNT is set (using window.doNotTrack) and either (a) the API is not present
or (b) navigator.confirmSiteSpecificTrackingException returns false, it
creates a cookie, e.g. W3CTP=DNT=I, which is cloned into the third-party
advertiser's domain. This cookie is placed by the library on every execution
in that browser, so it does not need to be hardened, and subsequent requests
from the illicit browser to the third party will contain it. There may be an
edge case where it is not communicated in the first request to the third
party, but this will be rare. Even in the IE10 case this is better, because
there is no need for the third party to execute JS and wait for the response
in another transaction; the cookie will simply be present immediately in the
request header.
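
A minimal sketch of what such a library might do (property-bag members are
taken from the TPE draft and simplified here; cloning the cookie into the
third-party domain, e.g. via the ad tag or a pixel, is not shown):

    // Sketch only: detect a DNT signal that cannot be backed by a UGE and
    // flag it in the well-known cookie for third parties to pick up.
    function flagInvalidDNT(thirdPartyDomains) {
      var dnt = window.doNotTrack || navigator.doNotTrack;  // UA-side value
      if (dnt !== "1") {
        return;                          // only an explicit DNT:1 matters here
      }
      var confirmed = false;
      try {
        // Throws if the API is not implemented at all (cf. the IE11 aside
        // below); returns false if no exception has been granted.
        confirmed = navigator.confirmSiteSpecificTrackingException({
          arrayOfDomainStrings: thirdPartyDomains
        });
      } catch (e) {
        confirmed = false;               // case (a): API not present
      }
      if (!confirmed) {
        // Case (a) or (b): place the well-known cookie on every execution.
        document.cookie = "W3CTP=DNT=I; path=/";
      }
    }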

 

In general, the W3CTP cookie could in any case be hardened to the same
extent as the DNT header; that is one of the reasons it has a well-known
name.

 

Regarding the other proposal, I think cryptographically signing the DNT
header would be tricky because of the difficulty of keeping the signing
key(s) secret, so IMHO it is a non-starter.

 

As an aside, I have been testing the (illicit DNT) idea using IE11, but I
discovered that the only way to check for the existence of the API is to
execute a call and detect a thrown exception. So
confirmSiteSpecificTrackingException stores an "exception" if it is present
and throws an exception if it is not. This illustrates the confusion that
using the word "exception" is bound to cause.
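
Concretely, the only presence check that seems to work is a call wrapped in
try/catch (the argument is simplified; a real call would pass the property
bag defined in the TPE draft):

    // Detect the API by calling it and catching the (JavaScript) exception
    // thrown when it is not implemented.
    var apiPresent = true;
    try {
      navigator.confirmSiteSpecificTrackingException({});
    } catch (e) {
      apiPresent = false;
    }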

 

Mike

 

 

 

From: Shane Wiley <wileys@yahoo-inc.com> 

Date: Sat, 27 Jul 2013 22:24:46 +0000

To: Rigo Wenning <rigo@w3.org>, Chris Mejia <chris.mejia@iab.net> 

CC: "public-tracking@w3.org" <public-tracking@w3.org> 

Message-ID:
<DCCF036E573F0142BD90964789F720E31410F9E5@GQ1-MB01-02.y.corp.yahoo.com> 

 

Rigo,

 

So it appears we have several possible solutions emerging:

 

Solution 1:  UGE Check

 

    Desc - On each 3rd party call that receives a DNT:1, bounce a UGE check
           call to determine if the User Agent repeats with a DNT:1.

    Pros - If a 3rd party software/network solution is injecting the DNT
           signal, the UGE check will come back with a DNT:0 or DNT:<null>.
           The Server would honor the UGE check signal.

    Cons - All 3rd party calls would universally need to ping the UA before
           transmitting their call back to their servers. This could be
           expensive from a timing perspective - especially in an environment
           where we're already pressed for time. More testing needed.
         - Non-JS scenarios have no solution.
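
A rough sketch of how the bounce could look on the 3rd party side: the
server answers the DNT:1 request with a small script and reads the result
off the follow-up request (endpoint and parameter names are illustrative
only, not part of any spec):

    // Script returned by the 3rd party when the initial ad call carried DNT:1.
    // Header injection by a router or proxy cannot alter these UA-side values,
    // so the follow-up request tells the server what the UA really says.
    (function () {
      var uaDnt = window.doNotTrack || navigator.doNotTrack || "";
      var hasUge = false;
      try {
        hasUge = navigator.confirmSiteSpecificTrackingException({
          arrayOfDomainStrings: ["ads.example.com"]  // illustrative domain
        });
      } catch (e) {
        hasUge = false;                              // API absent or call failed
      }
      // Second transaction back to the 3rd party server - the extra round
      // trip noted under Cons.
      new Image().src = "https://ads.example.com/ugecheck?dnt=" +
        encodeURIComponent(uaDnt) + "&uge=" + (hasUge ? "1" : "0");
    }());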

 

Solution 2:  Signed/Keyed DNT

 

    Desc - The DNT header signal would be signed against a digital
           certificate.

    Pros - Removes the lazy non-compliant software package/network appliance
           from the problem.
         - Is the same approach industry would likely take if we discovered
           opt-out cookies were being turned on by default without user
           interaction.
         - Depending on approach, would likely work with non-JS environments
           as well.

    Cons - Not sure how to handle cert signing in this case in a manner that
           a 3rd party software package or network device wouldn't be able to
           thwart in an easy manner.
         - No trusted intermediary to validate the cert through.
         - If effective, 3rd party software could move to simply activating
           DNT:1 within the UA via a config file or macro activation.
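
For illustration only, verification on the server might look roughly like
the sketch below, assuming a hypothetical DNT-Signature request header and a
public key the verifier already trusts; how that key would be issued and
validated is exactly the open question listed under Cons:

    // Hypothetical check of a signed DNT header (Node.js sketch).  Header
    // name, signing scheme and key distribution are assumptions, not spec.
    var crypto = require("crypto");

    function dntSignatureValid(req, publicKeyPem) {
      var dnt = req.headers["dnt"];             // e.g. "1"
      var sig = req.headers["dnt-signature"];   // hypothetical header, base64
      if (!dnt || !sig) {
        return false;
      }
      var verifier = crypto.createVerify("RSA-SHA256");
      verifier.update(dnt);  // only the literal DNT value is signed here; a
                             // real scheme would need more to prevent replay
      return verifier.verify(publicKeyPem, sig, "base64");
    }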

 

Neither is a silver bullet, but both are definitely worth further
discussion. I didn't list Mike's cookie option, as I don't believe it's a
fair starting point to require validation through cookies, which can be
cleared en masse, whereas the DNT signal is more persistent. DNT validation
should be at parity with UA persistence.

 

- Shane

 

-----Original Message-----

From: Rigo Wenning [mailto:rigo@w3.org] 

Sent: Saturday, July 27, 2013 3:02 PM

To: Chris Mejia

Cc: Shane Wiley; public-tracking@w3.org

Subject: Re: ISSUE-151 Re: Change proposal: new general principle for
permitted uses

 

On Friday 26 July 2013 21:40:32 Chris Mejia wrote:

> Yes, W3C is responsible, it's your spec.  See "DNT user agent vetting 

> registry service" (above) for next steps on cleaning up the 

> marketplace mess that's been created.

 

A registry is certainly another valid way forward. But many people still do
not understand what claiming conformity really means in DNT. And that is
part of a discussion we still have to have. 

 

> 

> You wrote "If you can't distinguish between a browser and a router, I 

> wonder about the quality of all that tracking anyway."

> 

> Rigo, this is why you are a lawyer, and not a technologist.

> Technically speaking, we are not talking about distinguishing between 

> browsers and routers, we are talking about distinguishing between

> validly set DNT signals and ones that aren't.  You'd need to 

> understand how HTTP header injection works to fully appreciate the 

> technical problem. The best technologists on both sides of this debate 

> have not been able to reconcile this issue. Neither have the lawyers.

 

The lawyers will tell you that you need a rule: "You shall not inject false
headers or transport false or injected headers". That doesn't buy you
anything either. I suggested requiring the presence of the UGE testing bit.
In fact, you can already test whether you have an exception.

That test can be used to determine whether the signal is valid. 

> 

> You wrote "I do not believe, given the dynamics of the Web and the 

> Internet, that we can predict the percentage of DNT headers for the 

> next 3 years; let alone the percentage of valid DNT headers."

> 

 

[...]

> Please stop

> asserting that our technical and business concerns are trivial or ill

> informed-- they are not.  Most of your replies below are not helping 

> us get closer to a workable DNT solution-- you are only further 

> exacerbating our concerns.

 

Chris, re-read my reply to Shane. He is having a creative semantics party
by claiming an "opt-in regime". I would rather be interested in a less
distorted technical discussion. Shane is only mildly reflecting the
solutions that the browser makers have considered viable, especially the UGE
testing bit. And even I earned a lot of criticism by insisting on a "D"
signal that allows you to say that you do not accept a signal. The elements
are there...

 

You're a technician; what would be your answer? A registry is not bad, but
most web geeks hate registries. I suggested the UGE verification bit.
Perhaps not perfect, but viable. Is there a better solution? Shane suggested
that we could make the DNT signal signed. This could be over a fixed set of
elements, so that you don't need a key to verify the signature, just the
same elements.

 

--Rigo

Received on Sunday, 28 July 2013 10:08:49 UTC