
Re: Web Audio API / WebRTC Integration

From: Robert O'Callahan <robert@ocallahan.org>
Date: Tue, 18 Oct 2011 01:05:53 +1300
Message-ID: <CAOp6jLbXg0=0bpWc7P02c+Bo1S5SUDBG0y5PAE5MeYrFyCTYbg@mail.gmail.com>
To: Alistair MacDonald <al@signedon.com>
Cc: Chris Rogers <crogers@google.com>, public-audio@w3.org
On Sun, Oct 16, 2011 at 5:52 PM, Alistair MacDonald <al@signedon.com> wrote:

> Group: if you are attending the AudioWG Telecon this Tuesday, please read
> through the links Chris posted, this is a great starting point for
> Agenda-Item-3 (Integration with other audio APIs).
> [1]
> http://hg.mozilla.org/users/rocallahan_mozilla.com/specs/raw-file/tip/StreamProcessing/StreamProcessing.html#examples
> [2]
> https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/webrtc-integration.html
> ROC will you be available on Tuesday? I would be very interested to get
> your feedback on this.

Yes, I should be able to join.

In example 2, something needs to provide flow control so that if one or both
of foo.webm and back.webm stalls (e.g. due to network conditions), the
output stalls too, and both streams later resume playing in sync.
StreamProcessing defines a "blocked" state for streams (or rather, reuses
it, since it's also present in other MediaStreams proposals) and sets
requirements on blocking states to ensure the desired behavior. I haven't
seen anything similar in the Web Audio proposal. (I don't think it would be
trivial to add this to Web Audio; for example you might have a video stream
being consumed by multiple AudioContexts, which would then need to
coordinate their blocking states.)
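Roughly, the rule I have in mind amounts to something like the following
(illustration only; the names are mine, not from either draft):

```javascript
// Illustrative only: one way to state the flow-control rule for a node
// that mixes several input streams. The mixed output must block while any
// live (non-ended) input is blocked, so that when a stalled input
// recovers, the inputs are still in sync with each other.
function outputBlocked(inputs) {
  return inputs.some(function (input) {
    return !input.ended && input.blocked;
  });
}
```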

With the suggested code for example 7, there will be an arbitrary delay
between in1 ending and in2 starting to play, which will likely result in an
audio and/or video glitch.
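Concretely, the pattern example 7 implies looks something like this (the
wiring here is my sketch, not the example's actual code):

```javascript
// Sketch of a script-driven handoff between two media elements
// (in1/in2 as in example 7). Starting in2 from an "ended" handler
// runs on the main thread, so in2.play() fires some arbitrary time
// after in1's last sample actually played -> an audible/visible gap.
function handOff(in1, in2) {
  in1.addEventListener("ended", function () {
    in2.play();
  });
}
```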

The code in example 8 passes parameters to HTMLMediaElement.pause()/play().
In HTML5 those methods don't take any parameters.
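For reference, both methods are parameterless in HTML5, so anything like
media.play(when) has to be expressed some other way (snippet is mine, not
the example's code):

```javascript
// HTMLMediaElement.play() and pause() take no arguments in HTML5.
// To start playback from a given offset, seek first, then issue a
// plain parameterless play():
function playAt(media, offsetSeconds) {
  media.currentTime = offsetSeconds; // seek...
  media.play();                      // ...then play() with no arguments
}
```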

Examples 2 and 9 (any JS audio processing, really) would benefit from a
Worker-based API, so that contention for the HTML5 event loop needn't
disrupt audio processing.
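For what it's worth, the shape I have in mind follows the StreamProcessing
draft's worker callback (the event and field names below are approximate;
only the sample arithmetic is meant literally):

```javascript
// The per-buffer work is pure arithmetic on sample arrays, so it can run
// entirely inside a Worker, untouched by main-thread event-loop contention.
function applyGain(input, gain) {
  var output = new Float32Array(input.length);
  for (var i = 0; i < input.length; i++) {
    output[i] = input[i] * gain;
  }
  return output;
}

// Worker-side, something along these lines (names approximate):
// onprocessmedia = function (event) {
//   event.writeAudio(applyGain(event.audioSamples, 0.5));
// };
```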

As I mentioned in my other email, in many of these examples having to bridge
Web Audio and MediaStream APIs adds unnecessary complexity. From the
author's point of view there are unnecessary API calls, and two
variables/objects representing the same thing.

"If we claim to be without sin, we deceive ourselves and the truth is not in
us. If we confess our sins, he is faithful and just and will forgive us our
sins and purify us from all unrighteousness. If we claim we have not sinned,
we make him out to be a liar and his word is not in us." [1 John 1:8-10]
Received on Monday, 17 October 2011 12:06:24 UTC
