Reviewing the Web Audio API

I've spent some time looking at the Web Audio API [1], since the Audio 
WG has put out a call for review [2].

As a starting point I used the related reqs from our use-case and req 
document [3]:

F13             The browser MUST be able to apply spatialization
                    effects to audio streams.

F14             The browser MUST be able to measure the level
                    in audio streams.

F15             The browser MUST be able to change the level
                    in audio streams.

with the accompanying API reqs:

A13             The Web API MUST provide means for the web
                    application to apply spatialization effects to
                    audio streams.
A14             The Web API MUST provide means for the web
                    application to detect the level in audio
                    streams.
A15             The Web API MUST provide means for the web
                    application to adjust the level in audio
                    streams.
A16             The Web API MUST provide means for the web
                    application to mix audio streams.

Looking at the Web Audio API, and combining it with the MediaStream 
concept we use, I have come to the following understanding:

1) To make the audio track(s) of MediaStream(s) available to the Web 
Audio processing blocks, the MediaElementAudioSourceNode interface 
would be used (see the sketch after this list).

2) Once that is done, the audio is available to the Web Audio API 
toolbox, and anything we have requirements on can be done.

3) When the processing has been done (panning, level measurement, level 
adjustment, mixing), the audio would be played using the 
AudioDestinationNode interface.
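
To make this concrete, here is a rough, untested sketch of how I 
understand the routing would fit together. The createGainNode() 
spelling (I gather a rename to createGain() has been discussed) and the 
createObjectURL() step for attaching a MediaStream to a media element 
are my assumptions, not things the draft spells out:

  var context = new AudioContext();  // webkitAudioContext in today's Chrome builds

  // 1) Get the MediaStream's audio into the graph: attach the stream
  //    to a media element (assumption on my part), then tap the
  //    element with a MediaElementAudioSourceNode.
  var audio = new Audio();
  audio.src = URL.createObjectURL(stream);  // stream: e.g. received over a PeerConnection
  audio.play();
  var source = context.createMediaElementSource(audio);

  // 2) The toolbox, mapped to our API reqs:
  var gain = context.createGainNode();      // A15: adjust the level
  gain.gain.value = 0.5;                    //      e.g. halve it
  var panner = context.createPanner();      // A13: spatialization
  panner.setPosition(1, 0, 0);              //      e.g. place the voice to the right
  var analyser = context.createAnalyser();  // A14: detect the level (polled below)

  source.connect(gain);
  gain.connect(panner);
  panner.connect(analyser);

  // A16: mixing needs no dedicated node; connecting a second source
  // to the same input sums the signals:
  //   otherSource.connect(gain);

  // 3) Play out through the context's destination:
  analyser.connect(context.destination);

  // Level detection: poll time-domain samples from the analyser.
  var samples = new Uint8Array(analyser.fftSize);
  setInterval(function () {
    analyser.getByteTimeDomainData(samples);
    // derive a peak or RMS figure from 'samples' here
  }, 100);

(If I have misread how a MediaStream gets into a media element, the 
rest of the graph should be unaffected.)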

What is unclear to me at present is how synchronization would work. So 
far we have been discussing under the assumption that all tracks in a 
MediaStream are kept in sync; but what happens when the audio tracks 
are routed through another set of tools, and not played by the same 
(video) element as the video?
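
In code, the situation I am worried about would look something like 
this (reusing the context from the sketch above; the element id is 
made up):

  var v = document.getElementById('remoteView');  // <video> rendering the stream
  v.src = URL.createObjectURL(stream);            // tracks rendered together, in sync
  var tap = context.createMediaElementSource(v);  // audio is rerouted into the graph
  tap.connect(context.destination);               // possibly via a long processing chain
  // Is the element expected to hold back its video frames to match
  // whatever latency the graph adds, or do the tracks simply drift?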

Another takeaway is that the processing can only happen in the browser 
that is going to play the audio, since there is no way to go from an 
AudioNode back to a MediaStream or MediaStreamTrack.

Has anyone else looked into the Web Audio API? Any other conclusions?

I think we should provide feedback from this WG, as we have some 
relevant reqs.

Br,
Stefan


[1] http://www.w3.org/TR/2012/WD-webaudio-20120315/
[2] http://lists.w3.org/Archives/Public/public-webrtc/2012Mar/0072.html
[3] http://datatracker.ietf.org/doc/draft-ietf-rtcweb-use-cases-and-requirements/?include_text=1
