- From: Rich Tibbett <richt@opera.com>
- Date: Tue, 13 Dec 2011 00:58:07 +0100
- To: "public-media-capture@w3.org" <public-media-capture@w3.org>
Nine months ago we (Opera) were in the middle of some early experimental work implementing an early getUserMedia proposal. We found that we had to patch the then-current WHATWG <device> proposal with a number of additional API behaviors for when HTML media elements are hosting MediaStream objects. In our early specification we defined the term 'stream mode' for when such DOM elements are assigned Stream objects for the purposes of local media playback, and we documented the behavior of the DOM APIs for HTML media elements operating in stream mode in [1].

One of the key concepts we established is that MediaStream objects, when applied to audio/video elements, should act within a 'linear media timeline' model (also a defined term in [1]). Such a timeline is, for example, not seekable and has well-defined behavior when the video element is paused and resumed. For the DOM APIs on HTML media elements we define specific setter and getter behaviors that switch off anything that doesn't make sense in a linear media timeline model; for example, a developer cannot adjust the playbackRate of a live stream captured from a user's webcam. I think we need to review these behaviors and add them to the current specification to ensure we end up with consistent behavior across UAs implementing our future recommendation.

In the course of our early prototyping, and as part of the spec provided in [1], we also introduced the ability to assign Stream objects directly to the .src of a video or audio HTML element (e.g. video.src = stream). This was not included in the official WHATWG draft at the time (window.URL.createObjectURL didn't exist yet and no one really knew how else we could do it). Right now in the W3C proposal, developers are expected to indirectly mint a new temporary URL (via window.URL.createObjectURL) in order to assign a MediaStream to a video/audio element.

To go along with direct assignment of a Stream object to a video element, our work also defined how Stream objects that have been assigned to HTML media elements should be labeled if or when a developer attempts to resolve the src URL/URI. We settled on 'about:streamurl' as a reserved, though unresolvable, about: URI indicating that the media element is displaying/playing an object that implements the MediaStream interface (and hence that the media element is in stream mode and has all the behaviors of that mode as defined in [1]).

This direct object assignment requires less code to attach Stream objects to video/audio elements. It has worked well to date in all Opera implementations, and we'd like to apply this behavior to the W3C spec pending further discussion and feedback.

There are a number of other things we might want to discuss, such as what we should do if particular attributes (e.g. autoplay, loop) are included on an HTML media element declaration (though I'd assume they mean nothing in stream mode). The documentation in [1] should be a reasonable starting point for that discussion as well.

I notice that roc's MediaStream Processing API proposal [2] also touches on a few of these aspects to some degree, albeit in less formal prose right now. I suspect we're all thinking along the same lines and just need to define the exact terms and algorithms in our specification.

Any feedback would be appreciated. The main question is: should we add behaviors such as these to our specification or not?
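To make the 'stream mode' restrictions above concrete, here is a rough, non-normative sketch of how they would look to an author. The getUserMedia call shape is illustrative only (the exact signature is still in flux), and the commented outcomes reflect the behaviors proposed in [1], not any settled API:

  // Illustrative only: the getUserMedia signature below is an assumption,
  // and the commented outcomes describe the behaviors proposed in [1].
  navigator.getUserMedia({ video: true }, function (stream) {
    var video = document.querySelector('video');

    video.src = stream;        // direct assignment, as discussed above
    video.play();

    // In stream mode the element follows a linear media timeline:
    video.playbackRate = 2.0;  // proposed: ignored; rate stays at the default
    video.currentTime = 60;    // proposed: ignored; a live stream is not seekable

    video.pause();             // pausing stops rendering; a later play() resumes
    video.play();              // at the live point, not where playback paused
  }, function (error) {
    // Illustrative error handling only.
    console.log('getUserMedia failed: ' + error);
  });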
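And a sketch contrasting the two assignment styles under discussion. Again this is illustrative rather than proposed spec text; the attachStream name is mine, and the about:streamurl read-back reflects our proposal rather than the current W3C draft:

  // Illustrative helper (the name attachStream is just for this example).
  function attachStream(video, stream) {
    // Current W3C draft: mint a temporary URL for the MediaStream...
    video.src = window.URL.createObjectURL(stream);

    // ...versus the proposed direct assignment (as in our Opera builds),
    // which skips the createObjectURL step entirely:
    video.src = stream;

    // Under our proposal, resolving the src attribute after direct
    // assignment yields the reserved, unresolvable URI:
    console.log(video.src);   // "about:streamurl"

    video.play();
  }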
br/ Rich

[1] http://people.opera.com/richt/release/specs/device/o-device.html#stream-mode-behavior
[2] https://dvcs.w3.org/hg/audio/raw-file/tip/streams/StreamProcessing.html#mediastreams