On Wed, Mar 28, 2012 at 10:10 PM, Robert O'Callahan <robert@ocallahan.org> wrote:
> On Thu, Mar 29, 2012 at 3:33 PM, Chris Rogers <crogers@google.com> wrote:
>
>> HTMLMediaElements already have a mechanism for synchronization using the
>> HTML5 MediaController API. Live stream (live camera/mic or remote peers)
>> MediaStreams would maintain synchronization (I assume you mean audio/video
>> sync in this case). The Web Audio API would just be used to apply effects,
>> not changing the synchronization.
>
>
> Can you explain how audio/video sync would account for the latency
> introduced by Web Audio processing? Have you found a way to do this
> automatically?
>
None of the built-in Web Audio processing algorithms introduces enough
latency to perceptibly affect audio/video sync. We're talking about 3ms or
less here. In terms of irritation, network latency is of vastly more
concern for WebRTC applications.
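
For a sense of scale: the Web Audio API processes audio in fixed 128-frame
render quanta, so the latency one processing block adds is just block size
divided by sample rate. A quick sketch (the function name is mine, just for
illustration):

```javascript
// Latency in milliseconds contributed by one fixed-size processing block.
// Web Audio's render quantum is 128 frames.
function blockLatencyMs(blockSize, sampleRate) {
  return (blockSize / sampleRate) * 1000;
}

console.log(blockLatencyMs(128, 44100).toFixed(2)); // "2.90"
console.log(blockLatencyMs(128, 48000).toFixed(2)); // "2.67"
```

At both common sample rates, a block of processing stays under 3ms.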
Chris