- From: Olivier Thereaux <notifications@github.com>
- Date: Wed, 11 Sep 2013 07:28:36 -0700
- To: WebAudio/web-audio-api <web-audio-api@noreply.github.com>
> Originally reported on W3C Bugzilla [ISSUE-17346](https://www.w3.org/Bugs/Public/show_bug.cgi?id=17346) Tue, 05 Jun 2012 11:38:36 GMT
> Reported by Philip Jägenstedt
> Assigned to

Audio-ISSUE-55 (HTMLMediaElementIntegration): HTMLMediaElement integration [Web Audio API]
http://www.w3.org/2011/audio/track/issues/55

Raised by: Philip Jägenstedt
On product: Web Audio API

https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#MediaElementAudioSourceNode
https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#AudioElementIntegration

The section "Integration with the audio and video elements" should be merged into the definition of MediaElementAudioSourceNode. Unfortunately, the combination of the two still leaves the behavior completely undefined, with only an example given and no normative requirements for implementations.

What happens when an HTMLMediaElement

* has readyState HAVE_NOTHING?
* is paused?
* is seeking?
* has no audio channels?
* switches the active audio track using the AudioTrack.enabled interface?
* is muted?
* has volume < 1?

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/145
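
For reference, a minimal sketch of the setup these questions apply to, assuming a page that already contains an `<audio>` element; the selector, the mute/volume/pause toggles, and the direct connection to the destination are illustrative assumptions, not part of the spec text under discussion:

```typescript
// Route an existing <audio> element's output into the Web Audio graph.
// The issue asks the spec to define what the MediaElementAudioSourceNode
// outputs while the element is in each of the states listed above.
const ctx = new AudioContext();
const mediaEl = document.querySelector("audio"); // assumed to exist in the page

if (mediaEl) {
  // Create the node under discussion and connect it to the speakers.
  const source = ctx.createMediaElementSource(mediaEl);
  source.connect(ctx.destination);

  // Element states whose effect on the node's output is currently undefined:
  mediaEl.muted = true;   // does the node output silence?
  mediaEl.volume = 0.5;   // is the element's volume applied before the node?
  mediaEl.pause();        // does the node output silence while paused or seeking?
}
```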
Received on Wednesday, 11 September 2013 14:29:01 UTC