
RE: revised recording proposal

From: Mandyam, Giridhar <mandyam@quicinc.com>
Date: Wed, 28 Nov 2012 19:31:36 +0000
To: Jim Barnett <Jim.Barnett@genesyslab.com>, Harald Alvestrand <harald@alvestrand.no>, "public-media-capture@w3.org" <public-media-capture@w3.org>
Message-ID: <CAC8DBE4E9704C41BCB290C2F3CC921A16355E66@nasanexd01h.na.qualcomm.com>
The scenario you mention (http://dvcs.w3.org/hg/dap/raw-file/tip/media-stream-capture/scenarios.html#find-the-ball-assignment-video-effects-and-upload-requirements) requires capture of the video, post-processing to track the ball, rendering of the post-processed video, and upload.  Based on my reading, the post-processing does not need to be done in real time.  Alice could record the 30-second file, read it using the File API, post-process the raw data, and then upload using her favorite method.  In this case, a timesliced blob would be leveraged as part of the File interface, but it would not have to be part of the MediaRecorder interface.
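As a rough sketch of that flow (recordClip and postProcess are hypothetical helper names; the stream and MIME type are assumptions, not anything specified in the proposal):

```javascript
// Record a fixed-length clip into a single Blob -- no timeslice needed.
function recordClip(stream, durationMs) {
  return new Promise((resolve, reject) => {
    const recorder = new MediaRecorder(stream);
    const chunks = [];
    recorder.ondataavailable = (e) => chunks.push(e.data);
    recorder.onstop = () =>
      resolve(new Blob(chunks, { type: recorder.mimeType }));
    recorder.onerror = (e) => reject(e.error);
    recorder.start();
    setTimeout(() => recorder.stop(), durationMs);
  });
}

// Read the recorded Blob's raw bytes via the File API and post-process them.
async function postProcess(blob) {
  const raw = await blob.arrayBuffer(); // Blob interface, File API spec
  // ... decode and track the ball frame-by-frame on the raw data here ...
  return new Blob([raw], { type: blob.type });
}
```

The upload step could then be any existing mechanism (e.g. XMLHttpRequest with the processed Blob as the body).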

In any case, I thought one of our goals was to avoid overloading the MediaRecorder object to enable different media-processing scenarios.

From: Jim Barnett [mailto:Jim.Barnett@genesyslab.com]
Sent: Wednesday, November 28, 2012 10:55 AM
To: Mandyam, Giridhar; Harald Alvestrand; public-media-capture@w3.org
Subject: RE: revised recording proposal

The most obvious use case for access to the raw data is 3.3, image processing for drawing a box around the bouncing ball.  The use case does not specifically mention access to the raw data, but I don't see how else you can do it.  This leads to requirements 6 and 7, which seem to me to require access to the timesliced Blob.

*  The UA must be able to extract image frames from video.
*  The UA must be able to insert image frames into a local video stream (or capture).
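For illustration, one way a page could satisfy these two requirements today is via <canvas>; this is a sketch under the assumption that videoEl is a playing <video> element and that canvas.captureStream() is available, neither of which the recording proposal itself defines:

```javascript
// Requirement 6: extract the current image frame from a video element.
function extractFrame(videoEl) {
  const canvas = document.createElement("canvas");
  canvas.width = videoEl.videoWidth;
  canvas.height = videoEl.videoHeight;
  const ctx = canvas.getContext("2d");
  ctx.drawImage(videoEl, 0, 0);
  return ctx.getImageData(0, 0, canvas.width, canvas.height);
}

// Requirement 7: paint each frame plus the tracking box back onto a canvas
// and expose it as a stream via captureStream() (an assumption here).
function annotateAndCapture(videoEl, box) {
  const canvas = document.createElement("canvas");
  canvas.width = videoEl.videoWidth;
  canvas.height = videoEl.videoHeight;
  const ctx = canvas.getContext("2d");
  function draw() {
    ctx.drawImage(videoEl, 0, 0);
    ctx.strokeStyle = "red";
    ctx.strokeRect(box.x, box.y, box.w, box.h);
    requestAnimationFrame(draw);
  }
  draw();
  return canvas.captureStream();
}
```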

- Jim

From: Mandyam, Giridhar [mailto:mandyam@quicinc.com]
Sent: Wednesday, November 28, 2012 1:49 PM
To: Harald Alvestrand; public-media-capture@w3.org
Subject: RE: revised recording proposal

> We're already depending on HTML5 (for <video>, among other things), and HTML5 in turn depends on Blob, so I don't think we're adding any risk here.

Fair enough, but the specification of <video> is sufficiently mature that support is currently nearly ubiquitous in commercially available browsers. This is not the case for the File API.  Regardless, there needs to be a normative reference in this document to the File API spec so that it is clear that we are not redefining Blob.

> Were you thinking of using the File interface (the same spec you referred to), or of something else?

I was thinking of leveraging File API, but having a File object returned (see http://www.w3.org/TR/FileAPI/#dfn-file) instead of a Blob.  Timeslicing would not be required in this case as far as I can tell.

My understanding, following the debate so far, is that there is a need to access the raw capture data for reliable streaming purposes (e.g. for voice recognition).  Receiving a blob from the recording method at timesliced intervals is therefore handy.  But reliable streaming and access to raw data from the capture stream do not seem to be explicitly required by http://dvcs.w3.org/hg/dap/raw-file/tip/media-stream-capture/scenarios.html.

The closest requirement that I see is "The UA must be able to send recorded media to one or more remote locations while recording is running".  It seems that one could satisfy that requirement without using the timesliced blob - simply send the same MediaStream that is being recorded (in this case, MediaRecorder.mediaStream) via an existing PeerConnection.
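A sketch of that alternative, assuming an already-negotiated RTCPeerConnection named pc (the track-based addTrack API shape is an assumption here, not something the recording proposal specifies):

```javascript
// Record locally while sending the very same MediaStream to a remote peer,
// with no timesliced Blob involved in the transmission path.
function recordAndStream(stream, pc) {
  // Send the live tracks over the existing PeerConnection.
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }
  // Record the same stream (MediaRecorder.mediaStream) locally in parallel.
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.start();
  return recorder; // caller stops it and assembles chunks into a Blob
}
```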

If I am incorrect in my interpretation, please let me know.

From: Harald Alvestrand [mailto:harald@alvestrand.no]
Sent: Wednesday, November 28, 2012 3:19 AM
To: public-media-capture@w3.org
Subject: Re: revised recording proposal

On 11/28/2012 05:08 AM, Mandyam, Giridhar wrote:
I'll echo what has been stated in that I also like where this is headed.

1. It has already been pointed out that an onrecordingerror event handler needs to be specified in the IDL.

2. timeSlice appears to be optional from the developer perspective, but would it be permissible for a minimum timeSlice to be used from the UA perspective?  In other words, when the developer does not specify a timeSlice, can the UA still return data at pre-set intervals?
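For concreteness, the two start() modes under discussion might look like this sketch (startRecording is a hypothetical helper; the stream is assumed):

```javascript
// With a developer-supplied timeSlice, dataavailable fires periodically;
// without one, the question is whether the UA may still surface data at
// intervals of its own choosing.
function startRecording(stream, timeSliceMs) {
  const recorder = new MediaRecorder(stream);
  recorder.ondataavailable = (e) => {
    // e.data is a Blob covering roughly one slice (or whatever the UA chose)
    console.log("chunk:", e.data.size, "bytes");
  };
  if (timeSliceMs !== undefined) {
    recorder.start(timeSliceMs); // developer-requested slicing
  } else {
    recorder.start();            // UA decides when to deliver data
  }
  return recorder;
}
```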

3. To clarify, is the returned Blob consistent with the definition in http://www.w3.org/TR/FileAPI/#dfn-Blob?  If so, we run the risk of a dependency on a spec that has not progressed far along the standardization track.
We're already depending on HTML5 (for <video>, among other things), and HTML5 in turn depends on Blob, so I don't think we're adding any risk here.

4. If we are in fact staying consistent with the Blob definition in the File API, then it would seem to me that the Formats dictionary as defined should be consistent with how content types are indicated with Blobs.  Media type as defined in RFC 2046 should be sufficient in this case, and is leveraged for the current definition of Blob.type.

5. Can you (or anyone else on the mailing list) articulate why a Blob is returned versus an ArrayBuffer?
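One practical answer is that a Blob need not be resident in memory, and either representation converts cheaply to the other; a sketch (blobToBytes and bytesToBlob are hypothetical helpers; blob.arrayBuffer() is the promise-based route, FileReader being the 2012-era equivalent):

```javascript
// Blob -> raw bytes, for scripts that really need an ArrayBuffer.
async function blobToBytes(blob) {
  const buf = await blob.arrayBuffer();
  return new Uint8Array(buf);
}

// Raw bytes -> Blob, the reverse wrapping, with an RFC 2046 media type.
function bytesToBlob(bytes, type) {
  return new Blob([bytes], { type });
}
```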

6. That being said, I think at least for Version 1 of this spec the group should consider having the recording directly write to a file (a la Android MediaRecorder - http://developer.android.com/reference/android/media/MediaRecorder.html or iOS AVAudioRecorder - http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVAudioRecorder_ClassReference/Reference/Reference.html) and move a Blob-based recording interface to Version 2.
Were you thinking of using the File interface (the same spec you referred to), or of something else?

JavaScript in general is isolated from the file system.

Received on Wednesday, 28 November 2012 19:32:06 UTC
