- From: Olivier Thereaux <olivier.thereaux@bbc.co.uk>
- Date: Wed, 07 Dec 2011 10:54:41 +0000
- To: public-webrtc@w3.org
- CC: public-audio@w3.org
- Message-ID: <4EDF45F1.5000402@bbc.co.uk>
Hello WebRTC WG,
From the record of the joint session between WebRTC and Audio groups at
TPAC, I see that our groups were to make sure the requirements from
WebRTC are taken into account by the Audio WG:
[[ The WebRTC WG will send requirements to the Audio WG to ensure they
get properly addressed by the Audio WG. ]] --
http://www.w3.org/2011/04/webrtc/wiki/Santa_Clara_F2F_Summary#Audio_WG
Given the requirements document published here:
http://tools.ietf.org/html/draft-ietf-rtcweb-use-cases-and-requirements-06
My understanding is that the applicable requirements for the
Audio WG are A8, A13, A14, A15 and A16, and that, to a lesser
extent, F5, F6, F9, F13, F14 and F18 are also relevant (see the
expanded list below). Could you confirm that this is a reasonable
assessment, and point out any requirements I may have missed that
the Audio WG should look at?
On a side note, I would suggest systematically spelling out the
acronyms used in the use cases and requirements document. That
would make it easier to read and understand for the uninitiated.
Thank you,
Olivier
F5 The browser MUST be able to render good quality
audio and video even in the presence of reasonable
levels of jitter and packet losses.
TBD: What is a reasonable level?
----------------------------------------------------------------
F6 The browser MUST be able to handle high loss and
jitter levels in a graceful way.
----------------------------------------------------------------
F9 When there are both incoming and outgoing audio
streams, echo cancellation MUST be made available to
avoid disturbing echo during conversation.
QUESTION: How much control should be left to the
web application?
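(Purely as an illustration of where application control could live,
a sketch assuming the capture API exposes an "echoCancellation"
audio constraint; that constraint name is my assumption, not
something the requirements document defines:)

  // Sketch: ask for capture with echo cancellation switched on.
  async function captureWithEchoCancellation(): Promise<MediaStream> {
    return navigator.mediaDevices.getUserMedia({
      audio: { echoCancellation: true }, // hypothetical in this context
    });
  }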
----------------------------------------------------------------
F13 The browser MUST be able to apply spatialization
effects to audio streams.
----------------------------------------------------------------
F14 The browser MUST be able to measure the level
in audio streams.
----------------------------------------------------------------
F15 The browser MUST be able to change the level
in audio streams.
----------------------------------------------------------------
F16 The browser MUST be able to render several
concurrent video streams.
----------------------------------------------------------------
F17 The browser MUST be able to mix several
audio streams.
----------------------------------------------------------------
F18 The browser MUST be able to process and mix
sound objects (media that is retrieved from another
source than the established media stream(s) with the
peer(s)) with audio streams.
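(A sketch of how I imagine this working with the Web Audio API
draft: a sound object fetched over HTTP, decoded, and mixed with an
incoming peer stream; the URL and stream variables are placeholders:)

  // Mix a separately fetched sound object with a peer audio stream.
  const ctx = new AudioContext();

  async function mixSoundObject(peerStream: MediaStream, url: string) {
    const data = await fetch(url).then((r) => r.arrayBuffer());
    const buffer = await ctx.decodeAudioData(data);

    const objectSource = ctx.createBufferSource(); // the "sound object"
    objectSource.buffer = buffer;
    const peerSource = ctx.createMediaStreamSource(peerStream);

    // Both sources feed the same destination, where they are mixed.
    objectSource.connect(ctx.destination);
    peerSource.connect(ctx.destination);
    objectSource.start();
  }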
----------------------------------------------------------------
A8 The Web API MUST provide means for the web
application to mute/unmute a stream or stream
component(s). When a stream is sent to a peer, the
mute status must be preserved in the stream
received by the peer.
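(Illustrative sketch only: muting by disabling a stream's audio
tracks, which are then rendered as silence; whether this is the
mechanism the Web API finally exposes is one of the open points:)

  // Mute or unmute every audio component of a stream.
  function setAudioMuted(stream: MediaStream, muted: boolean): void {
    for (const track of stream.getAudioTracks()) {
      track.enabled = !muted; // a disabled audio track plays as silence
    }
  }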
----------------------------------------------------------------
A13 The Web API MUST provide means for the web
application to apply spatialization effects to
audio streams.
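(A sketch of how this could map onto the Web Audio API draft,
assuming remote streams can be used as audio sources and panned
with a PannerNode:)

  // Place a remote stream's audio to the listener's left.
  const ctx = new AudioContext();

  function spatialize(peerStream: MediaStream): void {
    const source = ctx.createMediaStreamSource(peerStream);
    const panner = ctx.createPanner();
    panner.positionX.value = -1; // to the left of the listener
    panner.positionY.value = 0;
    panner.positionZ.value = 0;
    source.connect(panner);
    panner.connect(ctx.destination);
  }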
----------------------------------------------------------------
A14 The Web API MUST provide means for the web
application to detect the level in audio
streams.
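(Sketch only, assuming an AnalyserNode can be attached to a remote
stream; "level" here is a simple RMS over a time-domain window:)

  // Attach a level meter to a stream; poll the returned function.
  const ctx = new AudioContext();

  function attachLevelMeter(peerStream: MediaStream): () => number {
    const analyser = ctx.createAnalyser();
    analyser.fftSize = 2048;
    ctx.createMediaStreamSource(peerStream).connect(analyser);

    const samples = new Float32Array(analyser.fftSize);
    return () => {
      analyser.getFloatTimeDomainData(samples);
      const sumSquares = samples.reduce((sum, s) => sum + s * s, 0);
      return Math.sqrt(sumSquares / samples.length); // 0 = silence
    };
  }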
----------------------------------------------------------------
A15 The Web API MUST provide means for the web
application to adjust the level in audio
streams.
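(Sketch only: routing a stream through a GainNode so the application
can adjust its level; 1.0 leaves the level unchanged, 0.0 is silent:)

  // Give the application a volume control for a stream.
  const ctx = new AudioContext();

  function attachVolumeControl(peerStream: MediaStream): GainNode {
    const gain = ctx.createGain();
    ctx.createMediaStreamSource(peerStream).connect(gain);
    gain.connect(ctx.destination);
    return gain; // later: gain.gain.value = 0.5;
  }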
----------------------------------------------------------------
A16 The Web API MUST provide means for the web
application to mix audio streams.
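(Sketch only, assuming streams can be routed into a
MediaStreamAudioDestinationNode; whether WebRTC streams can travel
through the audio graph this way is one of the points to confirm:)

  // Mix several incoming streams into one audio stream.
  const ctx = new AudioContext();

  function mixStreams(streams: MediaStream[]): MediaStream {
    const mix = ctx.createMediaStreamDestination();
    for (const stream of streams) {
      ctx.createMediaStreamSource(stream).connect(mix);
    }
    return mix.stream; // e.g. for playback or sending to a peer
  }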
Received on Wednesday, 7 December 2011 10:55:34 UTC