- From: Olivier Thereaux <notifications@github.com>
- Date: Wed, 11 Sep 2013 07:28:11 -0700
- To: WebAudio/web-audio-api <web-audio-api@noreply.github.com>
> Originally reported on W3C Bugzilla [ISSUE-17347](https://www.w3.org/Bugs/Public/show_bug.cgi?id=17347) Tue, 05 Jun 2012 11:39:14 GMT
> Reported by Philip Jägenstedt
> Assigned to

Audio-ISSUE-56 (HTMLMediaElementSync): HTMLMediaElement synchronisation [Web Audio API]

http://www.w3.org/2011/audio/track/issues/56

Raised by: Philip Jägenstedt
On product: Web Audio API

It appears as though once audio data has left the HTMLMediaElement (via MediaElementAudioSourceNode), there is no way to filter that audio and play it back in sync with other audio or video streams. Since the timestamps are not propagated, it does not appear possible to add effects at a particular point in the media resource timeline. For example, audio descriptions (voice synthesis of extra text cues for the visually impaired) require ducking the main audio and mixing in additional audio at a specific time.

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/78
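
A minimal sketch of the ducking scenario the report describes, assuming an `<audio id="main">` element and a pre-decoded description `AudioBuffer` (both names are illustrative, not from the report). The manual mapping from `mediaEl.currentTime` to `ctx.currentTime` is exactly the fragile step the issue is about, since no media timestamps travel through the graph:

```js
const ctx = new AudioContext();
const mediaEl = document.getElementById('main');        // hypothetical <audio> element
const source = ctx.createMediaElementSource(mediaEl);

// Gain node used to duck the main programme audio under the description.
const duckGain = ctx.createGain();
source.connect(duckGain);
duckGain.connect(ctx.destination);

// Play a pre-rendered description buffer at a given point in the media
// resource timeline, ducking the main audio around it.
function scheduleDescription(buffer, mediaTime, duckLevel = 0.25) {
  // Workaround (assumption, not part of the API): the graph carries no media
  // timestamps, so we map mediaEl.currentTime onto ctx.currentTime ourselves.
  // This drifts whenever the element pauses, seeks, or changes playbackRate.
  const delay = Math.max(0, mediaTime - mediaEl.currentTime);
  const startAt = ctx.currentTime + delay;

  const voice = ctx.createBufferSource();
  voice.buffer = buffer;
  voice.connect(ctx.destination);
  voice.start(startAt);

  // Duck the main audio for the duration of the description, then restore it.
  duckGain.gain.setValueAtTime(1, startAt - 0.1);
  duckGain.gain.linearRampToValueAtTime(duckLevel, startAt);
  duckGain.gain.setValueAtTime(duckLevel, startAt + buffer.duration);
  duckGain.gain.linearRampToValueAtTime(1, startAt + buffer.duration + 0.1);
}
```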
Received on Wednesday, 11 September 2013 14:29:55 UTC