On Wed, Mar 28, 2012 at 10:10 PM, Robert O'Callahan <robert@ocallahan.org> wrote:

> On Thu, Mar 29, 2012 at 3:33 PM, Chris Rogers <crogers@google.com> wrote:
>
>> HTMLMediaElements already have a mechanism for synchronization using the
>> HTML5 MediaController API. Live stream (live camera/mic or remote peers)
>> MediaStreams would maintain synchronization (I assume you mean audio/video
>> sync in this case). The Web Audio API would just be used to apply effects,
>> not changing the synchronization.
>
> Can you explain how audio/video sync would account for the latency
> introduced by Web Audio processing? Have you found a way to do this
> automatically?

None of the built-in Web Audio processing algorithms have any appreciable latency which would perceptibly affect audio/video sync. We're talking about 3 ms or less here. In terms of irritation, network latency is of vastly more concern for WebRTC applications.

Chris

Received on Thursday, 29 March 2012 05:33:12 UTC