- From: Stefan Hakansson LK <stefan.lk.hakansson@ericsson.com>
- Date: Thu, 23 Aug 2012 14:25:24 +0200
- To: "public-media-capture@w3.org" <public-media-capture@w3.org>
Back in July ([1]) Jim added requirements to the scenarios document.
Jim, many thanks for doing that, and apologies for not giving any
feedback until now.
I think this looks generally good. I have some comments on some of the
requirements, and propose that some requirements in the webrtc req
document should perhaps also be added here. See below.
Br,
Stefan
Some comments on the requirements in the current draft
===================================================================
PERMISSIONS
3. The UA must request the user's permission before sending or
receiving a media stream to or from another user.
-- I am not sure this is a relevant requirement. The model, as I've
understood it, is that the user is asked for permission only to access
user media. From there the user has to trust the app; it could send a
media stream to a peer - or it could record to a file and send it
anywhere.
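Just to illustrate the model I mean, here is a rough sketch (my own, not
from the draft; it uses today's promise-based getUserMedia and addTrack
where the current drafts have callbacks and addStream): the only prompt is
the getUserMedia one, and once the page holds the stream nothing stops it
from sending it on.

    async function startSending(pc: RTCPeerConnection): Promise<void> {
      // The user is asked for device access here, and only here.
      const stream = await navigator.mediaDevices.getUserMedia({
        audio: true,
        video: true,
      });
      for (const track of stream.getTracks()) {
        pc.addTrack(track, stream); // no further prompt before media leaves the page
      }
    }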
LOCAL MEDIA
2. The UA must be able to provide a visual display of the
properties of the sound captured from a microphone (volume in this
case).
-- In the webrtc/rtcweb use-case+req document [2] there is an associated
requirement "F15: The browser MUST be able to change the level in audio
streams." and API req A15 saying that the app must be able to control
how the browser changes the level. Is that something we should add?
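For reference, a rough sketch of how an app could derive a level to
display, assuming the promise form of getUserMedia and the Web Audio API's
AnalyserNode (the function name and callback are made up for the example):

    async function meterMicrophone(onLevel: (rms: number) => void): Promise<void> {
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
      const ctx = new AudioContext();
      const analyser = ctx.createAnalyser();
      ctx.createMediaStreamSource(stream).connect(analyser);

      const samples = new Float32Array(analyser.fftSize);
      const tick = () => {
        analyser.getFloatTimeDomainData(samples);
        let sum = 0;
        for (const s of samples) sum += s * s;    // sum of squares
        onLevel(Math.sqrt(sum / samples.length)); // RMS ~ current volume
        requestAnimationFrame(tick);
      };
      tick();
    }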
5. The UA must be able to continue sending and/or capturing media
while the tab is in the background.
-- This applies to receiving and rendering as well (if you're in a
video chat, you would not want the incoming audio to stop playing when
you switch tabs).
6. The UA must be able to extract image frames from video.
7. The UA must be able to insert image frames into a local video
stream (or capture).
-- For the above two, have we at all discussed how to solve them? The
text in the use case mentions the canvas element (to draw a box around
the blue ball) but how would you go from that to a video stream?
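The extraction half (requirement 6) is at least doable with a canvas; a
rough sketch below (the box coordinates are made up). It is requirement 7
- getting from the canvas back to a video stream - that I don't see a
mechanism for.

    function grabAnnotatedFrame(video: HTMLVideoElement): HTMLCanvasElement {
      const canvas = document.createElement("canvas");
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      const ctx = canvas.getContext("2d")!;
      ctx.drawImage(video, 0, 0);          // copy the current video frame
      ctx.strokeStyle = "blue";
      ctx.lineWidth = 4;
      ctx.strokeRect(40, 40, 120, 120);    // e.g. a box around the tracked ball
      return canvas;
    }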
8. The UA must support the use of the local screen/display as a
video source.
-- Agree, but it should also be possible to use a recorded video (the
user should be able to trick the app by selecting a file - that has been
recorded - as the video source in the getUserMedia dialogue).
The UA must allow the user to pause or stop media streams via UXes
(and not just the buttons on the underlying hardware.)
-- There are two UXes: one is the browser chrome, the other is what the
app provides. Both should be usable for "pause" (if the app has a pause
button), but the chrome method must override the app's.
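For the app-provided pause, something like flipping
MediaStreamTrack.enabled on the local tracks would do (a sketch, assuming
the getTracks() accessor); the chrome-level control would have to take
precedence over this:

    function setAppPaused(stream: MediaStream, paused: boolean): void {
      for (const track of stream.getTracks()) {
        track.enabled = !paused; // a disabled track renders as silence/black
      }
    }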
REMOTE MEDIA
1. The UA must be able to transmit media to one or more remote
sites and to receive media from them.
-- A nit: is "sites" the right word? It makes me think of things like
origin, rather than a "peer" browser.
2. The UA must be able to offer a preview of audio and video media
received from a remote site.
-- What does "preview" mean in this context?
5. The UA must be able to send or receive a still image over a
video stream.
-- I can see this coming out of scenario 2.5, but would it not be more
natural to send the actual picture to all participants for display using
http or ws?
7. Ability for the user to simply drag an image over an area of the
website, so that the image is sent to all of the other users
-- Again, something that can easily be accomplished even without
MediaStreams or webrtc - I don't think we should add it
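To illustrate the alternative: a rough sketch where the dropped image is
simply relayed to the other participants over a WebSocket (the relay
server is hypothetical), with no MediaStream involved:

    function shareDroppedImage(dropZone: HTMLElement, socket: WebSocket): void {
      dropZone.addEventListener("dragover", (e) => e.preventDefault());
      dropZone.addEventListener("drop", (e) => {
        e.preventDefault();
        const file = e.dataTransfer?.files[0];
        if (file && file.type.startsWith("image/")) {
          socket.send(file); // binary frame; the server relays it to the others
        }
      });
    }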
MEDIA CAPTURE
5. The UA must enable the Application to set size constraints and
time limits on media capture.
-- Do we really want the app to be able to define constraints in MBs?
7. The UA must enable the Application to use device properties,
such as battery level, to determine when to terminate media capture.
-- At least battery level seems out of scope for this TF - isn't that
DAP turf?
Requirements in the webrtc req doc [2] that might make sense to add in
this document
=======================================================================
F8/A11: The UA must detect when a remote stream is not received any more
(and inform the application)
F9: Echo handling must be supported by UA
F10: Support synchronous playout of audio and video
F18: Support playout of other audio at the same time as an audio stream
is played
A19: Support for handling general audio differently from speech (e.g.
switching off noise reduction)
[1]
http://lists.w3.org/Archives/Public/public-media-capture/2012Jul/0000.html
[2]
http://datatracker.ietf.org/doc/draft-ietf-rtcweb-use-cases-and-requirements/?include_text=1