Re: AudioWorker proposal

I agree we don't want to use the word "Event" anywhere unless it's an
actual Event, and that it's best not to thrash the function signature.

However, having looked at current best practice (e.g. the ServiceWorker
API) and run it past Alex and Domenic: Event is still commonly used, just
defined on the WorkerGlobalScope.  Their take is that the synchronous usage
here is not out of line, and that we should continue to use Events and
event-handling semantics.  If we ran into problems, we'd drop back to a
callback on the WorkerGlobalScope (taking a single object that collects the
i/o buffers, etc.), so it would look nearly identical; a sketch of that
fallback shape follows.
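
For illustration, that fallback might look something like this in the
worker (a sketch only; the audioprocess name and the data object's members
are placeholders, mirroring what Joe suggests below):

// Hypothetical fallback (not proposed IDL): a plain callback on the
// WorkerGlobalScope, taking one object that collects the i/o buffers.
audioprocess = function (data) {
  // data.playbackTime, data.inputBuffer and data.outputBuffer would carry
  // the same members as AudioProcessingEvent, without the Event machinery.
  var input = data.inputBuffer.getChannelData(0);
  var output = data.outputBuffer.getChannelData(0);
  output.set(input);  // trivial pass-through (assumes equal lengths)
};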

I think we SHOULD, therefore, continue to use AudioProcessingEvent, and
stick with the second (event-based) IDL from my earlier message, quoted below:

[Constructor(DOMString scriptURL, optional unsigned long bufferSize)]
interface AudioWorker : Worker {
};
interface AudioWorkerGlobalScope : DedicatedWorkerGlobalScope {
    attribute EventHandler onaudioprocess;
    readonly attribute float sampleRate;
};
interface ScriptProcessorNode : AudioNode {
    readonly attribute double latency;
    readonly attribute AudioWorker worker;
};
interface AudioProcessingEvent : Event {
    readonly    attribute double      playbackTime;
    readonly    attribute AudioBuffer inputBuffer;
    readonly    attribute AudioBuffer outputBuffer;
};

partial interface AudioContext {
    ScriptProcessorNode createScriptProcessor(
        DOMString scriptURL,
        optional unsigned long bufferSize = 0,
        optional unsigned long numberOfInputChannels = 2,
        optional unsigned long numberOfOutputChannels = 2);
};
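
To make this concrete, here is a minimal sketch of a worker script under
this IDL.  The gain parameter and the message shape are invented for
illustration, and matching input/output channel counts are assumed:

// audio-worker.js: runs in AudioWorkerGlobalScope (sketch)
var gain = 1.0;  // hypothetical parameter, set from the main thread

onmessage = function (e) {
  // Parameter changes still arrive via postMessage, not via the event.
  if (e.data.gain !== undefined)
    gain = e.data.gain;
};

onaudioprocess = function (e) {
  // e is the AudioProcessingEvent above; it and its buffers are reused
  // across calls, so copy anything you need to keep beyond this call.
  for (var ch = 0; ch < e.outputBuffer.numberOfChannels; ch++) {
    var input = e.inputBuffer.getChannelData(ch);
    var output = e.outputBuffer.getChannelData(ch);
    for (var i = 0; i < output.length; i++)
      output[i] = input[i] * gain;
  }
};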


The interesting semantic changes:
1) The Event and its AudioBuffer objects (and their underlying
ArrayBuffers) would be specified as reused from call to call, for
performance reasons (avoiding garbage collection).  Because everything
stays within the audio thread, this is no longer a potential source of
race conditions.
2) I'm STRONGLY tempted to eliminate the bufferSize altogether and let the
system dictate it.  The extra buffering should no longer be necessary, and
the system's block size would then be the only source of latency.  (The
sketch below omits bufferSize accordingly.)
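
For completeness, the main-thread side might then look like this (a
sketch; assumes an existing AudioContext named context, with the script
URL and gain message matching the invented worker sketch above):

// Main thread (sketch): no bufferSize argument; the system dictates it.
var node = context.createScriptProcessor('audio-worker.js');
node.connect(context.destination);

// Parameters travel over postMessage, not through the event:
node.worker.postMessage({ gain: 0.5 });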


On Fri, Jul 25, 2014 at 7:36 AM, Joseph Berkovitz <joe@noteflight.com>
wrote:

> Chris,
>
> Thanks so much for clarifying.
>
> I think we’re on the same page here. The only additional suggestion I’m
> making (recapping from the beginning of this thread) is that if we don’t
> use AudioProcessingEvent (and I now think it would be clearest *not* to, so
> there is no confusion about the difference in mechanisms) let’s use some
> other single object type to collect the information passed into and out of
> the AudioWorker so that we do not have a function signature that needs to
> evolve over time. In keeping with what else you said, perhaps let’s not
> call that type an Event either, since that confuses things and suggests
> asynchronicity (and furthermore implies onXXX() and
> addEventListener(“XXX”,…)). This new type will be an object that
> communicates information between the audio thread and an AudioWorker,
> period.
>
> How does that sound to you and other folks on this thread?
>
>
> .            .       .    .  . ...Joe
>
> *Joe Berkovitz*
> President
>
> *Noteflight LLC*
> Boston, Mass.
> phone: +1 978 314 6271
> www.noteflight.com
> "Your music, everywhere"
>
>
>
> On Jul 21, 2014, at 5:08 PM, Chris Wilson <cwilso@google.com> wrote:
>
> Hey Joe-
>
> What I'd been intending was that the AudioProcessingEvent (and associated
> audioprocess event) would go away, because events are typically
> asynchronous - i.e. "audioprocess" is essentially a function that's called
> to process audio, much as the onaudioprocess event handler used to be
> called in standard scenarios, but it's called directly from the audio
> process when it needs data.
>
> Upon further reflection, perhaps we don't need to be that independent, as
> it's the firing of events that is typically asynchronous (via PostMessage).
>  The primary freedom I want to make sure we're obtaining is that the audio
> processing function can be called synchronously, and we should NOT need to
> create a new set of i/o objects for every call (to eliminate/reduce garbage
> collection in this thread).  I was always on the fence about using an Event
> here anyway, because I don't think multiple handlers is a good pattern, but
> I'm somewhat ambivalent and open to influence.  Please note, though, that
> this event is coming from the audio engine, and is delivered to the Worker;
> you wouldn't be able to use Event to pass parameters from the main JS
> thread to the worker thread; you'll still need to use postMessage for
> setting up and varying parameters, etc.
>
> So: either we use the synchronous callback mechanism, as detailed
> previously in this thread modulo the error I made in not separating input
> and output buffers :):
>
> [Constructor(DOMString scriptURL, optional unsigned long bufferSize)]
> interface AudioWorker : Worker {
> };
> callback AudioProcessCallback = void (double playbackTime, AudioBuffer inputBuffer, AudioBuffer outputBuffer);
> interface AudioWorkerGlobalScope : DedicatedWorkerGlobalScope {
>     attribute AudioProcessCallback audioprocess;
>     readonly attribute float sampleRate;
> };
> interface ScriptProcessorNode : AudioNode {
>     readonly attribute double latency;
>     readonly attribute AudioWorker worker;
> };
> partial interface AudioContext {
>     ScriptProcessorNode createScriptProcessor(
>         DOMString scriptURL,
>         optional unsigned long bufferSize = 0,
>         optional unsigned long numberOfInputChannels = 2,
>         optional unsigned long numberOfOutputChannels = 2);
> };
>
>
>
> , or we keep the event-based system (although as the event is firing from
> the audio system inside the same thread, it would in practice be called
> directly):
>
> [Constructor(DOMString scriptURL, optional unsigned long bufferSize)]
> interface AudioWorker : Worker {
> };
> interface AudioWorkerGlobalScope : DedicatedWorkerGlobalScope {
>     attribute EventHandler onaudioprocess;
>     readonly attribute float sampleRate;
> };
> interface ScriptProcessorNode : AudioNode {
>     readonly attribute double latency;
>     readonly attribute AudioWorker worker;
> };
> interface AudioProcessingEvent : Event {
>     readonly    attribute double      playbackTime;
>     readonly    attribute AudioBuffer inputBuffer;
>     readonly    attribute AudioBuffer outputBuffer;
> };
> partial interface AudioContext {
>     ScriptProcessorNode createScriptProcessor(
>         DOMString scriptURL,
>         optional unsigned long bufferSize = 0,
>         optional unsigned long numberOfInputChannels = 2,
>         optional unsigned long numberOfOutputChannels = 2);
> };
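>
> In worker code the two alternatives would differ only superficially;
> under the callback form, for example (a sketch, assuming the IDL above):
>
> audioprocess = function (playbackTime, inputBuffer, outputBuffer) {
>   // Called directly by the audio engine; copy input to output per channel.
>   for (var ch = 0; ch < outputBuffer.numberOfChannels; ch++)
>     outputBuffer.getChannelData(ch).set(inputBuffer.getChannelData(ch));
> };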
>
>
>
> On Mon, Jul 21, 2014 at 1:00 PM, Joseph Berkovitz <joe@noteflight.com>
> wrote:
>
>> Chris,
>>
>> I understand that postMessage and onmessage are available to any Worker
>> subclass. What I am looking for is an explanation of the
>> AudioProcessCallback type and “audioprocess” attribute in the global
>> namespace, cited in the IDL that you posted in issue 113 on March 26.
>>
>> If you’re just saying we’re going to be using onaudioprocess and
>> AudioProcessingEvent for these new-style workers, then I don’t see how the
>> proposed IDL fits into that. And if it’s obsolete, that’s OK.
>>
>> …Joe
>>
>>
>> On Jul 18, 2014, at 3:27 PM, Chris Wilson <cwilso@google.com> wrote:
>>
>> No, since AudioWorker is derived from Worker, it responds to (and has
>> access to) .postMessage and .onmessage - enabling you to send and receive
>> messages between the main thread and the audio worker, even structured
>> objects (if they're clonable). (See
>> https://developer.mozilla.org/en-US/docs/Web/API/Worker.postMessage.)
>>  The isolation of everything else is a feature.  You could, for example,
>> transfer an ArrayBuffer, so you could implement convolution or something
>> else that requires a data array - but you don't need to worry about race
>> conditions, because it's message-based and transferable or clonable only.
>>
>> In this way, we can also not worry about the neutering/reuse/new'ing of
>> ArrayBuffers for the onaudioprocess calls - because they're not across
>> threads - and this is a desirable optimization, since we don't need to
>> thrash memory (by alloc'ing new objects for every audioprocess call).
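>>
>> As a sketch of that pattern (the message shape and the audioWorker
>> variable are invented here), transferring an impulse response to the
>> worker:
>>
>> // main thread (sketch)
>> var ir = new Float32Array(44100);  // ...fill with impulse response
>> audioWorker.postMessage({ type: 'impulse', buf: ir.buffer },
>>                         [ir.buffer]);  // transferred, not copied
>>
>> // worker side: no race, because the sender's copy is neutered
>> onmessage = function (e) {
>>   if (e.data.type === 'impulse') {
>>     var samples = new Float32Array(e.data.buf);
>>     // ...use samples, e.g. for convolution
>>   }
>> };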
>>
>>
>>
>>
>> On Fri, Jul 18, 2014 at 12:05 PM, Joseph Berkovitz <joe@noteflight.com>
>> wrote:
>>
>>> Thanks, Chris. I think you are saying the worker script will effectively
>>> do something like this:
>>>
>>> // Assign global audio processing callback for this worker
>>> audioprocess = function(playbackTime, buffer) {
>>>   //…handle input passed via the buffer parameter
>>> };
>>>
>>> The one concern I see with this is that it may make it harder to pass a
>>> flexible bunch of information to the worker — this information is currently
>>> burned into the AudioProcessCallback parameter list as a playback time and
>>> a buffer. Even if we don’t use an AudioProcessingEvent, wouldn’t some kind
>>> of single object (AudioProcessingData?) holding the worker’s input and
>>> output data be easier to evolve over time than a parameter list, if there
>>> are changes or alternatives down the road?
>>>
>>> Also, I’m wondering why the sampleRate is provided in the global scope
>>> and not as a parameter to the callback (or, if you accept my idea, an
>>> attribute of some AudioProcessingData object)?
>>>
>>> …Joe
>>>
>>> On Jul 18, 2014, at 12:57 PM, Chris Wilson <cwilso@google.com> wrote:
>>>
>>> Yes, that's somewhat correct.  The only bit that isn't correct is that
>>> it probably shouldn't really be an AudioProcessingEvent - because since the
>>> goal is to be synchronous, it should be called directly, not posted or
>>> fired.  Something like (totally off the top of my head):
>>>
>>> [Constructor(DOMString scriptURL, optional unsigned long bufferSize)]
>>> interface AudioWorker : Worker {
>>> };
>>> callback AudioProcessCallback = void (double playbackTime, AudioBuffer buffer);
>>> interface AudioWorkerGlobalScope : DedicatedWorkerGlobalScope {
>>>     attribute AudioProcessCallback audioprocess;
>>>     readonly attribute float sampleRate;
>>> };
>>> interface ScriptProcessorNode : AudioNode {
>>>     readonly attribute double latency;
>>>     readonly attribute AudioWorker worker;
>>> };
>>> partial interface AudioContext {
>>>     ScriptProcessorNode createScriptProcessor(
>>>         DOMString scriptURL,
>>>         optional unsigned long bufferSize = 0,
>>>         optional unsigned long numberOfInputChannels = 2,
>>>         optional unsigned long numberOfOutputChannels = 2);
>>> };
>>>
>>>
>>>
>>> On Thu, Jul 17, 2014 at 11:28 AM, Joseph Berkovitz <joe@noteflight.com>
>>> wrote:
>>>
>>>> Hi Chris,
>>>>
>>>> I remember the issue (and contributed some comments to it at the time)
>>>> but it’s difficult to determine the exact proposal that was discussed at
>>>> last week’s meeting, since a number of variants are proposed in github
>>>> issue 113 with different folks suggesting this flavor or that flavor.
>>>>
>>>> From the fact that Olivier called this “synchronous”, I'm guessing that
>>>> you were discussing an AudioWorker object that executes directly in the
>>>> audio thread and doesn’t have to deal with message-passing. I infer that an
>>>> AudioWorker obtains its input (and output?) audio buffer directly from an
>>>> AudioProcessingEvent passed to the worker’s “onaudioprocess” function
>>>> defined in its AudioWorkerGlobalScope. This was proposed in your comment of
>>>> March 26, I think.
>>>>
>>>> Is that understanding correct? Were further changes to this approach
>>>> discussed?
>>>>
>>>> …Joe
>>>>
>>>> On Jul 17, 2014, at 1:03 PM, Chris Wilson <cwilso@google.com> wrote:
>>>>
>>>> It's mostly written up in the issue (
>>>> https://github.com/WebAudio/web-audio-api/issues/113), but I realized
>>>> on the call that I hadn't yet written out an IDL for how the synchronous
>>>> version would work.
>>>>
>>>>
>>>> On Thu, Jul 17, 2014 at 9:53 AM, Paul Adenot <paul@paul.cx> wrote:
>>>>
>>>>>  On Thu, Jul 17, 2014, at 06:43 PM, Joseph Berkovitz wrote:
>>>>>
>>>>> Hi Olivier,
>>>>>
>>>>> On Jul 11, 2014, at 6:16 AM, Olivier Thereaux <
>>>>> olivier.thereaux@bbc.co.uk> wrote:
>>>>>
>>>>>
>>>>> * Progress of scriptprocessornodes in workers issue
>>>>> We discussed a proposal for a synchronous model for the node. There
>>>>> will also be a Call for Consensus soon on the possibility of breaking
>>>>> changes for the main thread ScriptProcessorNode.
>>>>>
>>>>>
>>>>>
>>>>> Is there a writeup of the proposal that was discussed?
>>>>>
>>>>>
>>>>> Not as far as I know. We kind of agreed on a super high level, but
>>>>> nothing very solid yet.
>>>>>
>>>>> Paul.
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>>          .            .       .    .  . ...Joe
>>>>
>>>> *Joe Berkovitz*
>>>> President
>>>>
>>>> *Noteflight LLC*
>>>> Boston, Mass. phone: +1 978 314 6271
>>>>       www.noteflight.com
>>>> "Your music, everywhere"
>>>>
>>>>
>>>
>>>
>>>
>>
>>          .            .       .    .  . ...Joe
>>
>> *Joe Berkovitz*
>> President
>>
>> *Noteflight LLC*
>> Boston, Mass. phone: +1 978 314 6271
>>       www.noteflight.com
>> "Your music, everywhere"
>>
>>
>
>

Received on Friday, 25 July 2014 16:19:59 UTC