Re: Proposal: "private" execution mode

I'd be very hesitant to take on something like this, in our TF, at this 
time:

- Declaring a new JS execution model seems to be far outside our TF's 
scope. I find the idea very interesting, but I don't think it should be 
pursued in this context.
- We're trying to reach LC pretty soon, and doing something quite 
fundamental like this would take a long time.

(And if you think I am inconsistent in saying I like two of Jan-Ivar's 
proposals: one of them (use WebIDL) is a simplification that I think 
would make speccing and implementing easier and faster; the other is a 
really minor change.)

Stefan

On 28/11/13 18:29, cowwoc wrote:
> Hi,
>
> I'd like to propose a high-level mechanism for dealing with
> fingerprinting risks. Borrowing from HTTP caching terminology, I propose
> declaring a new JavaScript execution mode called "private". Code
> executed in "private" mode would be required to adhere to the following
> restrictions:
>
>   * Any written data would be marked as "private". Data marked as
>     "private" may only be accessed under "private" mode. In other words,
>     privacy is contagious.
>   * Sensitive methods that may be used to leak data outside the UA (e.g.
>     outgoing network requests) MUST throw a permission denied error.
>
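> The two rules could be sketched roughly as follows. This is a
> hypothetical illustration, not part of the proposal: markPrivate, read,
> and sendNetworkRequest are invented names, and a WeakSet plus a mode
> flag stand in for whatever taint tracking a UA would actually use.

```javascript
// Hypothetical sketch of the two "private"-mode rules, using a WeakSet
// as the sticky "private" bit and a flag for the current execution mode.
var privateData = new WeakSet(); // objects marked "private"
var inPrivateMode = false;       // current execution mode

// Rule 1 (write side): data written in private mode is marked "private".
function markPrivate(obj) {
  privateData.add(obj);
  return obj;
}

// Rule 1 (read side): "private" data is only accessible in private mode.
function read(obj) {
  if (privateData.has(obj) && !inPrivateMode) {
    throw new Error('PermissionDenied: private data outside private mode');
  }
  return obj;
}

// Rule 2: sinks that could leak data off the UA (e.g. outgoing network
// requests) must throw a permission denied error in private mode.
function sendNetworkRequest(payload) {
  if (inPrivateMode) {
    throw new Error('PermissionDenied: network access in private mode');
  }
  return 'sent'; // stand-in for an actual outgoing request
}
```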
> Here is a concrete example of how this may be used:
>
> A user invokes getUserMedia(filter, onSuccess), where filter and onSuccess
> would be supplied by the user. The user invokes the function in "normal"
> mode, but filter gets invoked in "private" mode. Here is a sample filter:
>
> function filter(candidateDevice)
> {
>    var resolutions = candidateDevice.getResolutions();
>    var idealResolution = {width: 1280, height: 720};
>    // indexOf() compares objects by reference, so match the fields instead
>    return resolutions.some(function (resolution) {
>       return resolution.width === idealResolution.width &&
>              resolution.height === idealResolution.height;
>    });
> }
>
> In the above function, candidateDevice is marked as "private" by the
> browser before passing it into the function. WebRTC would invoke
> onSuccess in "normal" mode, passing it the first device accepted by the
> filter. *NOTE*: the above definition of getUserMedia() is just an
> example and is not part of this proposal.
>
> There are many ways a browser could implement this proposal. It could
> mark data using a sticky "private" bit (as mentioned above). It could
> "validate" user functions before invoking them, and throw a permission
> denied error if they leak data. Any implementation that prevents
> "private" data from leaking is deemed to be compliant.
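> For illustration only, the sticky-bit variant could look something like
> the sketch below. The names asPrivate and isPrivate are invented, and a
> Proxy over plain objects stands in for whatever the UA would really do;
> the point is only that values derived from "private" data come out
> "private" too, so privacy stays contagious.

```javascript
// Hypothetical sketch of a sticky "private" bit: any value read through
// a private object is itself marked private.
var tainted = new WeakSet(); // proxies representing "private" values

function asPrivate(value) {
  // Primitives pass through; a real UA would have to track those as well.
  if (typeof value !== 'object' || value === null) {
    return value;
  }
  if (tainted.has(value)) {
    return value; // already a private wrapper
  }
  var proxy = new Proxy(value, {
    get: function (target, prop, receiver) {
      // Reads through a private object yield private results.
      return asPrivate(Reflect.get(target, prop, receiver));
    }
  });
  tainted.add(proxy);
  return proxy;
}

function isPrivate(value) {
  return typeof value === 'object' && value !== null && tainted.has(value);
}
```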
>
> While this discussion used getUserMedia() as an example, I believe
> that this mechanism could be used to tackle fingerprinting risks across
> the entire WebRTC surface. Unlike other proposals, I believe it does so
> without compromising the usability of the WebRTC API and user interface.
>
> Let me know what you think.
>
> Gili


Received on Friday, 29 November 2013 09:06:14 UTC