Re: Exclusive access to audio hardware

On Tue, May 8, 2012 at 9:10 AM, Jer Noble <jer.noble@apple.com> wrote:

> Hi all,
>
> I'd like to bring up a conceptual problem with the WebAudio API as it
> currently stands.  The WebAudio API generally assumes non-exclusive access
> to audio hardware.  Any number of AudioContexts can operate simultaneously
> along with any number of non-browser audio hardware clients, with the only
> constraint being CPU speed.
>
> However, there may be platforms that want to adopt the API but either
> cannot support non-exclusive access to audio hardware (a hardware
> limitation, say), do not allow non-exclusive access (a software
> limitation), or allow both exclusive and non-exclusive access.
>
> This introduces at least three edge cases:
>
>
>    - Failure to receive access:
>       - A script creates a WebAudio graph, but another client on the
>       system has non-interruptible, exclusive access to the audio hardware. (E.g.
>       on a mobile phone during a call.)
>    - Interruption of access:
>       - A running WebAudio graph has its audio hardware access revoked
>       when a higher priority, exclusive access client requests it. (E.g. a mobile
>       phone receives a phone call while browsing.)
>    - Resumption of access:
>       - A previously exclusive client releases its lock on the audio
>       hardware, and the WebAudio graph can now safely resume. (E.g. the phone
>       call ends.)
>
>
> As it stands, there is no mechanism to relay information about these
> states back to script authors.  So I propose the following additions to
> the spec (a rough usage sketch follows the list):
>
>
>    - Add the AudioContext.startRendering() method to the spec.
>       - Currently, AudioContext.startRendering() exists in the WebKit
>       implementation, but is intended for offline use.  However, due to the
>       edge cases above, rendering may fail to start, may be interrupted, or
>       may need to be restarted.
>       - It would be called automatically when the first AudioNode is
>       added to an AudioContext, but that call may fail.
>    - Add a new AudioContext.renderState property to the spec.
>       - The property would have the following possible values:
>          - RENDERING - The AudioContext successfully requested access to
>          the audio hardware and is rendering.
>          - INTERRUPTED - Another audio hardware client has been granted
>          exclusive access to the audio hardware.
>          - IDLE - The client which had been previously granted exclusive
>          access has released the hardware.
>       - The state would default to IDLE.  Calling
>       AudioContext.startRendering() would move the renderState from IDLE to
>       RENDERING, and if the audio hardware was unavailable for any reason, the
>       state would asynchronously move from RENDERING to INTERRUPTED.  Once that
>       exclusive access was finished, the state would asynchronously move from
>       INTERRUPTED to IDLE.
>    - Add a new simple event, onrenderstatechange, which is fired at the
>    AudioContext when its renderState property changes.
>
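> A rough sketch of the intended usage (assuming the state values are
> exposed as constants on the AudioContext; the exact shape of the event
> handler is open to discussion):
>
>     var context = new AudioContext();
>
>     context.onrenderstatechange = function() {
>         switch (context.renderState) {
>         case context.RENDERING:
>             // Access was granted; the graph is producing audio.
>             break;
>         case context.INTERRUPTED:
>             // Another client took exclusive access; pause UI, etc.
>             break;
>         case context.IDLE:
>             // The exclusive client released the hardware; try to resume.
>             context.startRendering();
>             break;
>         }
>     };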
>
> The end result would be that, for platforms with universal non-exclusive
> access to audio hardware, nothing would change.  Rendering would start
> automatically once nodes were added to an AudioContext.  However, script
> authors would be able to detect and handle interruptions on other, more
> limited platforms.
>
> Thoughts?
>

Hi Jer, I'm not opposed to something like the proposed .renderState
attribute and the event for state changes.  I'm still trying to understand
whether we need startRendering() or not.  I understand that in some cases,
when a context is first created, it may not be able to start playing right
away, because the device might be in the middle of a phone call, for
example.  I'm just wondering if there are any alternatives to a
startRendering() method.  Couldn't the developer query the .renderState
attribute early on to know whether rendering will be able to start just
after the AudioContext has been created (something like the sketch below)?
Maybe the startRendering() method is the best choice.  I'm just trying to
get a better understanding of why it's needed.
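
For example, a page might check the attribute right after construction,
something like this (this assumes the attribute reflects hardware
availability at creation time, rather than always defaulting to IDLE;
buildGraph is a hypothetical helper):

    var context = new AudioContext();
    if (context.renderState == context.INTERRUPTED) {
        // The hardware is already claimed exclusively (say, a phone
        // call is in progress); wait until it is released.
        context.onrenderstatechange = function() {
            if (context.renderState == context.IDLE)
                buildGraph(context);
        };
    } else {
        // The hardware appears available; build the graph right away.
        buildGraph(context);
    }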

Chris


>
> -Jer
>

Received on Tuesday, 8 May 2012 18:31:50 UTC