
Re: <device> proposal (for video conferencing, etc)

From: Andrei Popescu <andreip@google.com>
Date: Wed, 16 Dec 2009 13:22:32 +0000
Message-ID: <708552fb0912160522h7796ff7i6064ba41c88e93f6@mail.gmail.com>
To: Ian Hickson <ian@hixie.ch>
Cc: public-device-apis@w3.org, Ben Murdoch <benm@google.com>
Hi Ian,

Thanks for writing this up; the proposal looks good. I do have a
couple of comments and questions:

This approach seems to assume that capturing of static
image/audio/video files can be handled via

<input type="file" accept="image/video/audio/etc">

while streaming of audio/video is handled by the <device> element and
the Stream interface.
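(For reference, the form-based capture path would look something like
the following — assuming the UA chooses to offer the camera as a
source in its file picker; the action URL and field name are of course
placeholders:)

```html
<!-- Capture-via-form sketch: the UA may let the user take a photo
     directly when the accepted type is an image -->
<form action="/upload" method="post" enctype="multipart/form-data">
  <input type="file" name="photo" accept="image/*">
  <input type="submit" value="Send photo">
</form>
```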

One reason for this separation is that we want to reserve the <input>
tag for form submission. This is fine but, at the same time, there are
use cases where an application may want a static image from the camera
without submitting any form. In such a case, aren't we misusing the
<input> tag after all?

So while we're adding the <device> element, how about extending it
with support for capturing entire media files (image/video/audio) in
addition to streaming? For example, we could have

-- for media files:

<device type="mediaFile" onchange="update(this.data)">

function update(file) {
  // file is an object that implements the File interface
}

-- for streaming:

<device type="mediaStream" onchange="update(this.data)">

function update(stream) {
  // stream is an object that implements the Stream interface
}
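Putting the two together, a page could use both types side by side —
purely a sketch of the proposed, not-yet-specified API (the <device>
element, this.data, stream.URL, and the handler names are all
assumptions from this thread and from Ian's draft):

```html
<!-- Hypothetical usage of the two proposed <device> types;
     nothing here is implemented in any UA yet -->
<device type="mediaFile" onchange="gotFile(this.data)">
<device type="mediaStream" onchange="gotStream(this.data)">
<video autoplay></video>
<script>
 function gotFile(file) {
   // file implements the File interface (name, size, etc.); it could
   // then be uploaded with XMLHttpRequest rather than a form submission
 }
 function gotStream(stream) {
   // stream implements the proposed Stream interface; show the local
   // camera output in the video element
   document.getElementsByTagName('video')[0].src = stream.URL;
 }
</script>
```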

What do you think?

On Wed, Dec 16, 2009 at 12:43 AM, Ian Hickson <ian@hixie.ch> wrote:
> It seems, though, that until we can figure out a codec that all the UAs
> are willing to implement, there's not much we can do to proceed on this,
> so I'm not sure where to go from here.

The codec issue aside, maybe we could add the mechanism for sending
the Stream from the UA? In your example, you have

<p>To start chatting, select a video camera: <device type=media
 onchange="update(this.data)">
 function update(stream) {
   document.getElementsByTagName('video')[0].src = stream.URL;
 }

If I understand the example correctly, the video element will show the
output of the user's camera (i.e. act as an embedded camera viewport).
To be able to implement video chat, we also need a way to see the
remote party, which means sending the Stream over to some server. I
think we should specify the mechanism for doing that (e.g.
WebSockets::send(Stream stream)).
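Concretely, I'm imagining something along these lines — entirely
hypothetical, since neither a send(Stream) overload nor any way of
receiving a remote stream exists in the WebSocket draft today:

```javascript
// Hypothetical WebSocket extension for streaming media, as suggested
// above. None of these APIs exist; this only sketches the mechanism.
var socket = new WebSocket('ws://chat.example.com/video');

function startSending(stream) {
  // stream implements the proposed Stream interface;
  // proposed overload: send a Stream, not just a string
  socket.send(stream);
}

socket.onmessage = function (event) {
  // the remote party's stream would arrive as some streamable payload
  // that a <video> element can consume (event.streamURL is made up)
  document.getElementsByTagName('video')[1].src = event.streamURL;
};
```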

Received on Wednesday, 16 December 2009 13:23:09 UTC
