
Re: Reviewing the Web Audio API (from webrtc)

From: Chris Rogers <crogers@google.com>
Date: Wed, 28 Mar 2012 22:32:40 -0700
Message-ID: <CA+EzO0=ZMsyv=RzBGMzSu+UhKaHvs=ZYd_kidTONPT4mqbCLgg@mail.gmail.com>
To: robert@ocallahan.org
Cc: "Wei, James" <james.wei@intel.com>, "public-audio@w3.org" <public-audio@w3.org>, public-webrtc@w3.org
On Wed, Mar 28, 2012 at 10:10 PM, Robert O'Callahan <robert@ocallahan.org> wrote:

> On Thu, Mar 29, 2012 at 3:33 PM, Chris Rogers <crogers@google.com> wrote:
>
>> HTMLMediaElements already have a mechanism for synchronization using the
>> HTML5 MediaController API.  Live stream (live camera/mic or remote peers)
>> MediaStreams would maintain synchronization (I assume you mean audio/video
>> sync in this case).  The Web Audio API would just be used to apply effects,
>> not changing the synchronization.
>
>
> Can you explain how audio/video sync would account for the latency
> introduced by Web Audio processing? Have you found a way to do this
> automatically?
>

None of the built-in Web Audio processing algorithms have any appreciable
latency that would perceptibly affect audio/video sync.  We're talking
about 3ms or less here.  In terms of irritation, network latency is of
vastly more concern for WebRTC applications.
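(For context on the "3ms or less" figure: the Web Audio API processes audio in fixed blocks of 128 sample-frames, so one block of added buffering at a typical 44.1 kHz hardware rate works out to just under 3 ms. A quick back-of-the-envelope check:

```javascript
// Latency contributed by one Web Audio render quantum.
// The Web Audio processing graph operates on fixed blocks of 128 sample-frames.
const renderQuantumFrames = 128;
const sampleRateHz = 44100; // typical hardware sample rate

const latencyMs = (renderQuantumFrames / sampleRateHz) * 1000;
console.log(latencyMs.toFixed(2) + " ms"); // ≈ 2.90 ms, consistent with "3ms or less"
```

At a 48 kHz rate the same block is even shorter, about 2.67 ms.)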

Chris
Received on Thursday, 29 March 2012 05:33:12 GMT
