Re: Settings retrieval/application API Proposal (formerly: constraint modification API v3)

Travis,

overall I like this a lot. It is very clear which objects you can use to
change properties of the generated media ("MediaDevices"), which I think
is a good thing.

I guess some of the items in the PictureInfo dictionary would be
read-only (e.g. "facing"); I'm not sure how that is expressed.

(I also know that constraints are being documented in a separate 
registry; I personally have no strong view on the right thing to do in 
this respect - to me it matters more that we sort out when and how they 
can be applied and that there is a way to add more data as experience 
shows the need).

You do not discuss the relation between checking and setting properties
of a MediaDevice using this proposal and the application of constraints
(beyond "video") at getUserMedia time. To me it seems that they're
complementary rather than conflicting: constraints at getUserMedia time
help the UA order (and perhaps prune) the list of available devices
presented to the user in the dialog (if constraints are mandatory they
could even lead to a failure without involving the user).
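
A rough sketch of how I imagine the two combining (the constraint
syntax I use at getUserMedia time is only indicative; the second half
uses the videoDevice API from your proposal):

  // Constraints at gUM time steer which devices the UA offers/prunes
  // (exact constraint names are placeholders):
  navigator.getUserMedia(
      { video: { mandatory: { /* e.g. a minimum resolution */ } }, audio: true },
      gotMedia, failedToGetMedia);

  // ...while the settings API adjusts the device that was granted:
  function gotMedia(localStream) {
      localStream.videoDevice.changeSettings({ width: 1920, height: 1080 });
  }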

Stefan


On 08/27/2012 11:46 PM, Travis Leithead wrote:
> Based on the latest round of feedback on the prior proposal [1], I've
> further adjusted the constraint modification proposal. High-level
> changes are:
>
> 1. LocalMediaStream allows more direct access to device settings (now
>    optimizing around the 1-video/1-audio track per gUM request)
> 2. Track objects isolated from device objects for clarity and
>    separation of APIs
> 3. Specific settings proposed
> 4. Usage examples provided
>
>
> As mentioned in the previous proposal [1], the LocalMediaStream's
> changeable audio/videoTracks collections as currently derived from
> MediaStream make it challenging to keep track of the tracks that are
> supplied by a local device over time. In the prior proposal, I
> factored the local-device-supplying tracks into separate track lists
> for isolation. In this proposal, I take a slightly more aggressive
> approach to modifying the definition of a LocalMediaStream which
> further diverges it from its current definition, but which (I
> believe) aligns it more closely with the devices that are supplying
> its tracks. This approach was largely borrowed from Adam's comments
> [2].
>
> Despite these changes to the structure of LocalMediaStream, I still
> want it to behave semantically like MediaStream when used with
> URL.createObjectURL or when assigned to a video/audio element using a
> TBD property (see example #2 at the end of the proposal). To continue
> to have it work for this purpose, a new interface, AbstractMediaStream,
> is introduced:
>
> // +++New base class--this is what createObjectURL and other
> // APIs now accept to be inclusive of LocalMediaStreams as well
> // as other MediaStreams
> interface AbstractMediaStream {
>     readonly attribute DOMString label;
>     readonly attribute boolean ended;
>     attribute EventHandler onended;
> };
>
> // +++MediaStream now derives from the base class, adding
> // mutable track lists and an onstarted event (since these
> // objects can go from no tracks -> one-or-more tracks)
> [Constructor (optional (MediaStream? or MediaStreamTrackList or MediaStreamTrack[]) trackContainers)]
> interface MediaStream : AbstractMediaStream {
>     readonly attribute MediaStreamTrackList audioTracks;
>     readonly attribute MediaStreamTrackList videoTracks;
>     //+++ added for symmetry and for mutable track lists.
>     attribute EventHandler onstarted;
> };
>
> // +++Modified to include device-specific interfaces
> interface LocalMediaStream : AbstractMediaStream {
>     readonly attribute VideoDevice? videoDevice;
>     readonly attribute AudioDevice? audioDevice;
>     void stop();
> };
>
> A LocalMediaStream now has an active (or null) videoDevice,
> audioDevice or both, depending on what was requested from
> getUserMedia.
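>
> For example, a minimal sketch of checking which devices were granted
> (the names follow the interfaces above; the guards are just illustrative):
>
> function gotMedia(localStream) {
>     if (localStream.videoDevice) { /* a video device is active */ }
>     if (localStream.audioDevice) { /* an audio device is active */ }
> }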
>
> All Video/AudioDevice objects link to their associated track:
>
> // +++Settings for all device types
> interface MediaDevice {
>     // +++ the track object that this device is producing
>     readonly attribute MediaStreamTrack track;
> };
>
> And have a 'local' stop API and associated state (to stop just one of
> the devices):
>
> interface MediaDevice : EventListener {
>     readonly attribute MediaStreamTrack track;
>     // +++ stop only this device:
>     void stop();
>     // +++ get the on/off state of the device
>     readonly attribute boolean ended;
> };
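>
> For example, a sketch of stopping only the video device while the
> audio device keeps running (the stopVideoOnly name is just for
> illustration):
>
> function stopVideoOnly(localStream) {
>     if (localStream.videoDevice && !localStream.videoDevice.ended)
>         localStream.videoDevice.stop();
> }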
>
> All devices allow their settings to be inspected and changed.
> Application of settings is asynchronous, but inspection of settings
> can be synchronous (for convenience). The specific settings returned
> depend on whether the device is a VideoDevice instance or an
> AudioDevice instance. Events related to the changing of settings are
> provided as well.
>
> interface MediaDevice : EventListener {
>     readonly attribute MediaStreamTrack track;
>     void stop();
>     readonly attribute boolean ended;
>
>     // +++ get device settings
>     (VideoInfo or AudioInfo) getSettings();
>
>     //+++ settings application
>     void changeSettings(MediaTrackConstraints settings);
>
>     //+++ Async results notification from settings application
>     attribute EventHandler onsettingschanged;
>     attribute EventHandler onsettingserror;
> };
>
> Video devices, in particular, have the ability to [possibly] switch
> into "photo mode" to capture still images. The following are specific
> to VideoDevice objects and extend the API that Rich proposed. Since
> "photo mode" is often a distinct set of settings from regular video
> mode in a camera, there are separate settings and application of
> those settings just for taking pictures.
>
> //+++ New: audio Device interface (basically just a MediaDevice)
> interface AudioDevice : MediaDevice { };
>
> //+++ New: video device
> interface VideoDevice : MediaDevice {
>     //+++ Getting settings (possibly different from the video stream)
>     //    for pictures from this device
>     PictureInfo getPictureSettings();
>
>     //+++ Taking snapshots
>     void takePicture(optional PictureInfo pictureSettings);
>
>     //+++ Picture results
>     attribute EventHandler onpicture;
> };
>
> The proposed initial set of settings is below. The proposed settings
> are a combination of features proposed by Rich based on his
> customer's requests, as well as a set of functionality already
> supported by the Microsoft WinRT Camera API [3] (based on our own
> research and common cameras used in PCs).
>
> In the proposed view of the settings, where the setting is in a range
> (not an enum), there are no "isSupported" values. The expectation is
> that if one of these values that is exposed as a range is not
> supported, then the value is assigned a default value (e.g., 0), and
> the min and max range values are set to that same value. This is the
> same thing as saying that the feature is supported, but that the
> value cannot be changed (which is basically saying that the feature
> is unavailable).
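>
> For example, a sketch of detecting a non-adjustable setting under this
> convention (zoom is used here purely as an illustration):
>
> function canAdjustZoom(videoDevice) {
>     var s = videoDevice.getSettings();
>     return s.minZoom != s.maxZoom; // equal min/max means a fixed value
> }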
>
> //+++Video device settings
> dictionary PictureInfo : MediaTrackConstraintSet {
>     // Resolution:
>     unsigned long width;  unsigned long minWidth;  unsigned long maxWidth;
>     unsigned long height; unsigned long minHeight; unsigned long maxHeight;
>     // Aspect Ratio:
>     float horizontalAspectRatio; float minHorizontalAspectRatio; float maxHorizontalAspectRatio;
>     float verticalAspectRatio;   float minVerticalAspectRatio;   float maxVerticalAspectRatio;
>     // Rotation:
>     float rotation;
>     float minRotation; // if not supported, then min == max == current rotation value
>     float maxRotation;
>     // Zoom:
>     unsigned long zoom; // if not supported, then min == max == current zoom value
>     unsigned long minZoom; // e.g., lens supports 55 (mm) - 250 (mm) zoom
>     unsigned long maxZoom;
>     // Exposure:
>     unsigned long exposure; unsigned long minExposure; unsigned long maxExposure;
>     // Direction:
>     VideoFacingEnum facing;
>     // Focus:
>     VideoFocusModeEnum focusMode;
>     // Flash:
>     VideoFlashModeEnum flashMode;
> };
>
> //+++ Additional settings for video (extends picture)
> dictionary VideoInfo : PictureInfo {
>     // FPS:
>     float framesPerSecond; float minFramesPerSecond; float maxFramesPerSecond;
> };
>
> //+++Audio device settings
> dictionary AudioInfo : MediaTrackConstraintSet {
>     // Levels
>     unsigned long level; unsigned long minLevel; unsigned long maxLevel;
>     // Tone (bass/treble)
>     float bassTone;   float minBassTone;   float maxBassTone;
>     float trebleTone; float minTrebleTone; float maxTrebleTone;
> };
>
> The related enums are defined as:
>
> enum VideoFacingEnum { "unknown", "user", "environment" };
> enum VideoFocusModeEnum { "nofocus", "fixed", "auto", "continuous", "edof", "infinity", "macro" };
> enum VideoFlashModeEnum { "noflash", "auto", "off", "on", "red-eye", "torch" };
>
> The new Event types that support the "settingschanged" and
> "settingserror" events, as well as the "picture" event are defined
> below:
>
> //+++ New event for "settingschanged/settingserror"
> [Constructor(DOMString type, optional EventInit eventInitDict)]
> interface MediaSettingsEvent : Event { sequence<DOMString>
> getRelatedSettings(); // Returns an array of setting names that apply
> to this event. };
>
> //+++ New event for getting the picture results from 'takePicture'
> //    (returns raw bytes/non-encoded)
> [Constructor(DOMString type, optional PictureEventInit eventInitDict)]
> interface PictureEvent : Event {
>     readonly attribute ImageData data; // See Canvas spec for definition of ImageData
> };
>
> dictionary PictureEventInit : EventInit { ImageData data; };
>
> ////////////////////
>
> Some examples follow that illustrate how these changes will impact
> coding patterns:
>
> 1. Getting access to a video and/or audio device (if available) --
> scenario is unchanged:
>
> navigator.getUserMedia({audio: true, video: true}, gotMedia, failedToGetMedia);
>
> function gotMedia(localStream) { }
>
> 2. Previewing the local video/audio in HTML5 video tag -- scenario is
> unchanged:
>
> function gotMedia(localStream) {
>     // objectURL technique
>     document.querySelector("video").src = URL.createObjectURL(localStream, { autoRevoke: true });
>     // direct-assign technique
>     document.querySelector("video").streamSrc = localStream; // "streamSrc" is hypothetical and TBD at this time
> }
>
> 3. Applying resolution constraints
>
> function gotMedia(localStream) {
>     var settings = localStream.videoDevice.getSettings();
>     // Check for 1080p+ support
>     if ((settings.maxWidth >= 1920) && (settings.maxHeight >= 1080)) {
>         // See if I need to change the current settings...
>         if ((settings.width != 1920) && (settings.height != 1080)) {
>             settings.width = 1920;
>             settings.height = 1080;
>             localStream.videoDevice.onsettingserror = failureToComply;
>             localStream.videoDevice.changeSettings(settings);
>         }
>     } else
>         failureToComply();
> }
>
> function failureToComply(e) {
>     if (e)
>         console.error("Device failed to change " + e.getRelatedSettings());
>     else
>         console.error("Device doesn't support at least 1080p");
> }
>
> 4. Changing zoom in response to user input:
>
> function gotMedia(localStream) {
>     setupRange( localStream.videoDevice );
> }
>
> function setupRange(videoDevice) {
>     var cameraSettings = videoDevice.getSettings();
>     // Set HTML5 range control to min/max values of zoom
>     var zoomControl = document.querySelector("input[type=range]");
>     zoomControl.min = cameraSettings.minZoom;
>     zoomControl.max = cameraSettings.maxZoom;
>     zoomControl.device = videoDevice; // Store the device ref for later
>     zoomControl.onchange = applySettingChanges;
> }
>
> function applySettingChanges(e) {
>     e.target.device.changeSettings({ zoom: e.target.value });
> }
>
> 5. Adding the local media tracks into a new media stream
>
> function gotMedia(localStream) {
>     return new MediaStream([ localStream.videoDevice.track, localStream.audioDevice.track ]);
> }
>
> 6. Take a picture, show the picture in a canvas.
>
> function gotMedia(localStream) {
>     localStream.videoDevice.onpicture = showPicture;
>     // Turn on flash only for the snapshot...if available
>     var picSettings = localStream.videoDevice.getPictureSettings();
>     if (picSettings.flashMode != "noflash")
>         localStream.videoDevice.takePicture({ flashMode: "on" });
>     else {
>         console.info("Flash not available");
>         localStream.videoDevice.takePicture();
>     }
> }
>
> function showPicture(e) {
>     var ctx = document.querySelector("canvas").getContext("2d");
>     // e.data is the ImageData property of the PictureEvent interface.
>     ctx.canvas.width = e.data.width;
>     ctx.canvas.height = e.data.height;
>     ctx.putImageData(e.data, 0, 0);
>     // TODO: can get this picture as an encoded Blob via:
>     // ctx.canvas.toBlob(callbackFunction, "image/jpeg");
> }
>
> [1] http://lists.w3.org/Archives/Public/public-media-capture/2012Aug/0066.html
> [2] http://lists.w3.org/Archives/Public/public-media-capture/2012Aug/0095.html
> [3] http://msdn.microsoft.com/en-us/library/windows/apps/windows.media.devices.videodevicecontroller.aspx
>
>
>
>

Received on Monday, 3 September 2012 06:49:35 UTC