[Bug 26332] Applications should only use EME APIs on secure origins (e.g. HTTPS)

https://www.w3.org/Bugs/Public/show_bug.cgi?id=26332

--- Comment #114 from Ryan Sleevi <sleevi@google.com> ---
(In reply to Jerry Smith from comment #113)
> It's difficult for me to see how a cryptographically secured identifier
> imposes a higher risk of identity tracking compared to cookies in general. 

1) As you are no doubt aware, the introduction of cryptography into an
ecosystem creates a new set of legal expectations and rights for users with
respect to privacy-preserving decisions. For example, in the US, the DMCA
restricts what actions a user can take, restrictions which do not intrinsically
apply to cookies in the same way.

2) As discussed, even if the minimum bar were "equivalent to cookies" (a meme
that is factually and demonstrably false, especially with respect to the
normative requirements of the spec), cookies themselves are NOT an acceptable
level of security/privacy in 2014. This is trivially demonstrated by
http://www.washingtonpost.com/blogs/the-switch/wp/2013/12/10/nsa-uses-google-cookies-to-pinpoint-targets-for-hacking/

> If anything, the steps required to retrieve the CDM identifier make it more
> difficult to abuse and less likely to be exploited. 

Are you suggesting that these steps are normatively required?

If not, then it fails to address the fundamental issue with the spec, and
instead relies on the good will, good behaviour, and good intentions of media
companies, ISPs, and user agents - an idealistic vision that has no basis in
reality, as demonstrated by the abundant evidence in this bug and related
threads.

> Browsers can further
> implement features to reset this identifier, and can allow users to disable
> the identifier in general, though with loss of EME functionality.

Surely you don't mean to argue that it meets the priority of constituencies to
make privacy and functionality mutually exclusive for users, simply because
meaningful user privacy is seen as financially troublesome for some site
operators?

That is, we have a clear proposal that can trivially meet many (though
understandably not all) privacy goals for users, in a way that doesn't make
functional limitation an intrinsic property.

> I agree with comments in this bug about reverting this change.  The
> conversation hadn't been concluded, and the consensus in the working group
> (if there was one) seemed to be opposition.  I don't believe it is our
> process to implement a controversial change and then debate whether it
> should be retained or not, especially following open discussion that did not
> support it.

There is clearly a shared sentiment by the W3C TAG, extensive contributions
from the community, and from UAs that there exist real and meaningful privacy
concerns. Comment #48 and Comment #70 collect just a small fraction of these
concerns. So yes, these are concerns that MUST be addressed if this spec is to
progress.

Currently, we have at least one concrete proposal to address them: normatively
requiring TLS. We can continue the discussion and look at other normative
requirements, both to address the set of privacy concerns as yet unaddressed
and to introduce normative requirements on CDMs that might alternatively
address the concerns regarding CDMs and privacy. However, there's clear
consensus that "doing nothing" is not acceptable, so doing nothing only serves
to show the broader community either that a) the WG is not taking privacy
seriously, or b) that members are hoping to delay such requirements past the
point of implementation, such that it becomes unviable in the market for any UA
to prioritize user privacy. No one is suggesting the current text is final -
but it does more to meaningfully address the concerns than nothing would, so it
is surely a step in the right direction.
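To make the proposal concrete: gating EME on TLS amounts to checking that the
page's origin is "secure" before exposing or using the API. The sketch below
is illustrative only - the function name and the exact scheme/host list are my
assumptions, not normative text from the EME spec or this proposal:

```javascript
// Hypothetical sketch of a "secure origin" check of the kind being proposed.
// A UA (or a defensive page script) would only allow EME use when this holds.
function isSecureOriginForEME(originUrl) {
  const url = new URL(originUrl);
  // TLS-protected schemes qualify.
  if (url.protocol === "https:" || url.protocol === "wss:") {
    return true;
  }
  // Loopback hosts are conventionally treated as secure for local development.
  return url.hostname === "localhost" || url.hostname === "127.0.0.1";
}

// A page might then gate its EME usage on this check, e.g.:
//   if (isSecureOriginForEME(location.href)) {
//     // safe to call navigator.requestMediaKeySystemAccess(...)
//   }
```

In a real UA the enforcement would live in the browser itself (the API simply
absent on insecure origins), not in page script; the helper above just makes
the shape of the requirement explicit.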

-- 
You are receiving this mail because:
You are the QA Contact for the bug.

Received on Monday, 27 October 2014 18:09:52 UTC