- From: cowwoc <cowwoc@bbs.darktech.org>
- Date: Thu, 28 Nov 2013 13:06:08 -0500
- To: public-media-capture@w3.org
- Message-ID: <52978610.5010008@bbs.darktech.org>
On 28/11/2013 1:03 PM, Harald Alvestrand wrote:
> Gili,
>
> can you point to an example of such a process in any other W3C API?
Hi Harald,
I'm probably the wrong person to ask. I've only got a couple of years'
experience with W3C APIs. Maybe someone else on this list could comment.
Gili
>
> On 11/28/2013 06:27 PM, cowwoc wrote:
>> Hi,
>>
>> I'd like to propose a high-level mechanism for dealing with
>> fingerprinting risks. Borrowing from HTTP caching terminology, I
>> propose declaring a new Javascript execution mode called "private".
>> Code executed in "private" mode would be required to adhere to the
>> following restrictions:
>>
>> * Any written data would be marked as "private". Data marked as
>> "private" may only be accessed under "private" mode. In other
>> words, privacy is contagious.
>> * Sensitive methods that may be used to leak data outside the UA
>> (e.g. outgoing network requests) MUST throw a permission denied
>> error.
>>
>> Here is a concrete example of how this may be used:
>>
>> A user invokes getUserMedia(filter, onSuccess), where filter and
>> onSuccess would be supplied by the user. The user invokes the
>> function in "normal" mode, but filter gets invoked in "private" mode.
>> Here is a sample filter:
>>
>> function filter(candidateDevice)
>> {
>>     // getResolutions() is assumed to return an array of
>>     // {width, height} objects for this example.
>>     var resolutions = candidateDevice.getResolutions();
>>     var ideal = {width: 1280, height: 720};
>>     // indexOf() compares objects by reference, so match by value:
>>     return resolutions.some(function (r) {
>>         return r.width === ideal.width && r.height === ideal.height;
>>     });
>> }
>>
>> In the above function, candidateDevice is marked as "private" by the
>> browser before passing it into the function. WebRTC would invoke
>> onSuccess in "normal" mode, passing it the first device accepted by
>> the filter. *NOTE*: the above definition of getUserMedia() is just an
>> example and is not part of this proposal.
>>
>> There are many ways a browser could implement this proposal. It could
>> mark data using a sticky "private" bit (as mentioned above). It could
>> "validate" user functions before invoking them, and throw a
>> permission denied error if they leak data. Any implementation that
>> prevents "private" data from leaking is deemed to be compliant.
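>> To make the "sticky private bit" idea concrete, here is one possible
>> sketch in plain JavaScript, using a Proxy as the taint wrapper. All
>> names in it (markPrivate, isPrivate, sendToServer) are illustrative
>> inventions for this sketch, not part of any real browser API:

```javascript
// Hypothetical sketch of the "sticky private bit" idea.
// A browser would do this natively; this only models the behavior.

const PRIVATE = Symbol("private");

// Wrap an object so it carries a "private" mark. Reads are still
// allowed inside "private" mode code; the mark travels with the object.
function markPrivate(obj) {
  return new Proxy(obj, {
    get(target, prop) {
      if (prop === PRIVATE) return true;
      return target[prop];
    }
  });
}

function isPrivate(obj) {
  return obj !== null && typeof obj === "object" && obj[PRIVATE] === true;
}

// Stand-in for a sensitive egress point (an outgoing network request).
// Per the proposal, it must throw a permission-denied error rather than
// let "private" data leave the UA.
function sendToServer(payload) {
  if (isPrivate(payload)) {
    throw new Error("permission denied: private data may not leave the UA");
  }
  return "sent";
}

const device = markPrivate({ label: "Front Camera" });
```

>> A full implementation would also have to taint data *derived* from
>> private objects (the "privacy is contagious" rule above); this sketch
>> only shows the check at the point of egress.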
>>
>> While this discussion used getUserMedia() as an example, I
>> believe that this mechanism could be used to tackle fingerprinting
>> risks across the entire WebRTC surface. Unlike other proposals, I
>> believe it does so without compromising the usability of the WebRTC
>> API and user interface.
>>
>> Let me know what you think.
>>
>> Gili
>
>
> --
> Surveillance is pervasive. Go Dark.
Received on Thursday, 28 November 2013 18:07:23 UTC