Fwd: gUM's optional constraints algorithm penalizes user-choice chasing best-match for webpage

This is a mediacapture concern, so I'm forwarding it on to the right list.


-------- Original Message --------
Subject: 	gUM's optional constraints algorithm penalizes user-choice chasing best-match for webpage
Resent-Date: 	Sat, 31 Aug 2013 07:44:29 +0000
Resent-From: 	public-webrtc@w3.org
Date: 	Sat, 31 Aug 2013 03:43:59 -0400
From: 	Jan-Ivar Bruaroey <jib@mozilla.com>
Organization: 	Mozilla
To: 	public-webrtc@w3.org <public-webrtc@w3.org>



In Firefox, we let users of gUM webpages choose which source to allow 
from a subset dictated by the webpage, or choose to allow none.

I believe this dictated subset is overly narrow when optional gUM 
constraints are in play. To illustrate:

Consider two phones A and B.

     Phone A has both a front camera and a back camera.
     Phone B has only a back camera.

     A webpage C passes this constraint: { optional: [{ facingMode: "user" }] }

Meaning the webpage prefers the front camera but will work with any camera.

Result:

     On Phone A, the webpage user may choose the front camera or nothing.
     On Phone B, the webpage user may choose the back camera or nothing.
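
A minimal sketch of the pass that produces this result, assuming a
simplified single-property matcher (applyOptional, matches, and the track
objects below are illustrative names, not spec ones):

    // Apply each optional constraint in order, skipping any that
    // would leave zero candidates ("remove-from-list" behavior).
    function applyOptional(tracks, optional) {
      let candidates = tracks;
      for (const c of optional) {
        const narrowed = candidates.filter(t => matches(t, c));
        if (narrowed.length > 0) candidates = narrowed;
        // else: a zero-reducing optional constraint is ignored
      }
      return candidates;
    }

    // Hypothetical matcher, handling only this example's one property.
    function matches(track, c) {
      return track.facingMode === c.facingMode;
    }

    const optional = [{ facingMode: "user" }];
    const phoneA = [
      { label: "front", facingMode: "user" },
      { label: "back", facingMode: "environment" },
    ];
    const phoneB = [{ label: "back", facingMode: "environment" }];

    applyOptional(phoneA, optional); // -> [front]; back is discarded
    applyOptional(phoneB, optional); // -> [back]; constraint skipped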

I think instead it should be:

     On Phone A, the webpage user may choose the front camera (preferred), back camera, or nothing.
     On Phone B, the webpage user may choose the back camera or nothing.

Reason: From a permissions standpoint, I argue the user has a right to 
withhold knowledge of Phone A's front camera, making Phone A 
indistinguishable from Phone B.

Benefit: Lets a webpage influence which source is the default without 
limiting choice.
  e.g. it lets pages on Firefox for Android default to the front camera 
without removing the back-camera option from the dropdown.

I believe a browser could implement this today without a webpage knowing 
(and remain black-box spec-compliant):

 1. Run the full set of tracks through the algorithm to arrive at the
    "preferred" set (like today).
 2. Run the discarded tracks through the algorithm again, but
    *individually*, and keep the ones that now make it through as
    "non-preferred".
 3. These tracks are valid from the webpage's point of view (it doesn't
    know the size of the set); see the sketch after this list.
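
Continuing the sketch above, the two passes might look like this 
(partitionTracks is a hypothetical name; the preferred/non-preferred 
split stays internal to the browser):

    // Pass 1: full set -> "preferred" (today's behavior).
    // Pass 2: each discarded track alone -> "non-preferred" survivors.
    function partitionTracks(tracks, optional) {
      const preferred = applyOptional(tracks, optional);
      const discarded = tracks.filter(t => !preferred.includes(t));
      const nonPreferred = discarded.filter(
        t => applyOptional([t], optional).length > 0
      );
      return { preferred, nonPreferred }; // both sets valid to offer
    }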

The reason this works is that our (unchanged) core "remove-from-list" 
algorithm skips any optional constraint that would reduce the candidate 
set to zero, which makes it more lenient the smaller the starting set is.
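
For instance, tracing both phones through the sketches above:

    partitionTracks(phoneA, optional);
    // Pass 1 over [front, back] keeps [front]; back is discarded.
    // Pass 2 runs [back] alone: facingMode:"user" would empty the set,
    // so the core skips it and back survives as "non-preferred".
    // -> { preferred: [front], nonPreferred: [back] }

    partitionTracks(phoneB, optional);
    // -> { preferred: [back], nonPreferred: [] }, same as today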

I'm curious what people think.

.: Jan-Ivar :.
