
Re: Comments on AudioWorklet examples

From: Hongchan Choi <hongchan@google.com>
Date: Wed, 01 Jun 2016 23:17:25 +0000
Message-ID: <CAGJqXNt0wA9s+Z0zrk138X9GcbJq=tCa5CfZ_mHV5_rpi=o-kQ@mail.gmail.com>
To: Joe Berkovitz <joe@noteflight.com>, Audio Working Group <public-audio@w3.org>
Just to brainstorm, I am imagining something like this: (sorry, it is
extremely simplified)

/* AudioWorkletGlobalScope */
registerAudioWorkletNode('foo',
  class extends AudioWorkletNode {
    get parameterDescriptors () {}
    constructor (options) {
      super(options);
    }
    onmessage () {}
  },
  class extends AudioWorkletProcessor {
    process (inputs, outputs, parameters) {}
    onmessage () {}
  }
);

AudioWorkletGlobalScope does not necessarily represent the audio rendering
thread. It's just a special scope to define a worklet node.
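
To make the two-layer idea concrete, here is a runnable sketch in plain
JavaScript. None of this is the real Web Audio API; every class and name
below is an illustrative stand-in for the split between a deferrable
main-thread handle and a render-side `process()` callback:

```javascript
// Illustrative stand-ins, NOT the real Web Audio API: a main-thread node
// handle that owns configuration, and a processor whose process() is the
// only part the (mock) render loop ever touches.

class FooProcessor {
  constructor(options) {
    this.gain = options.gain !== undefined ? options.gain : 1.0;
  }
  // Called by the render loop; must not block.
  process(inputs, outputs) {
    for (let i = 0; i < inputs[0].length; i++) {
      outputs[0][i] = inputs[0][i] * this.gain;
    }
    return true; // keep the processor alive
  }
  onmessage(data) {
    if ('gain' in data) this.gain = data.gain;
  }
}

class FooNode {
  constructor(options = {}) {
    // Instantiation happens on the "main thread": the processor is created
    // synchronously here, so its state is visible immediately afterwards.
    this.processor = new FooProcessor(options);
  }
  // Deferrable main-thread work: forward a message to the processor.
  postMessage(data) {
    this.processor.onmessage(data);
  }
}

// Mock render quantum: 4 samples instead of 128 for brevity.
const node = new FooNode({ gain: 2 });
const input = [[0.1, 0.2, 0.3, 0.4]];
const output = [[0, 0, 0, 0]];
node.processor.process(input, output);
console.log(output[0]); // prints [ 0.2, 0.4, 0.6, 0.8 ]
```

In the real API the processor would live in the worklet scope and messages
would cross a thread boundary asynchronously; the direct call here only
models the ownership split, not the threading.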

-Hongchan

On Wed, Jun 1, 2016 at 3:19 PM Hongchan Choi <hongchan@google.com> wrote:

> Great! I am glad we found the source of confusion.
>
> Yes, I think the instantiation procedure of AWN/AWP must be identical to
> that of regular AudioNodes. In one way or another, the Web Audio API
> implementation has two layers for a node: the main thread representation
> and the small surface exposed to the audio render thread. Everything that
> belongs to the main thread is 'deferrable', while the `process()` method is
> non-blocking and accessed by the audio thread. The instantiation of a node
> falls into the former category.
>
> Once again - I really want to avoid defining a custom node by writing a
> pair of scripts, but perhaps that's the only sane way of resolving this. We
> need to discuss this for at least a few more days. I don't think I can come
> up with a solid alternative by tomorrow's call.
>
> Thanks for the insight, Joe - it's been really helpful for me!
>
> -Hongchan
>
> On Wed, Jun 1, 2016 at 3:05 PM Joe Berkovitz <joe@noteflight.com> wrote:
>
>> Hi Hongchan,
>>
>> You replied privately to me -- I'm CCing the group on this response since
>> it feels like we're just continuing the same thread that's of interest to
>> everyone, which I hope is OK.
>>
>>
>>> If we are only seeking the functionality of the API design, the
>>> initialization or the visibility should not be an issue at all. We can be
>>> as explicit and verbose as possible (whichever way you want to break this
>>> down - subclassing, another main-thread registration). However, I would
>>> like us to spend more time making the structure concise and intuitive,
>>> rather than tossing the burden of complexity onto developers. This is me
>>> being hand-wavy, but I hope you can agree with this.
>>>
>>
>> I totally like intuitive and concise. I also think that developers
>> creating custom nodes will be pleased to wrap an AudioWN in their own nodes
>> to hide the "workletness". But I think we can have it both ways without
>> sacrificing anything good.
>>
>> Maybe we just need to talk this through on tomorrow's call!
>>
>>
>>> > This class is defined, registered and instantiated on the audio
>>> thread.
>>>
>>> This is not true. The instantiation happens on the main thread (even
>>> for regular audio nodes). Only the `process()` method runs on the audio
>>> thread. I guess this was the point of confusion, and I am glad you raised
>>> the question. I think we definitely have more to think about.
>>>
>>
>> Just to make sure: I was talking about AudioWorkletProcessor
>> instantiation, not AudioWorkletNode instantiation.
>>
>> This is big news to me (if I understand correctly). And, actually, good
>> news. So AudioWorkletProcessors (not Nodes) are instantiated on the main
>> thread, but in a special global scope? I guess that would resolve this
>> issue I'm having with exposing parameters right after instantiation. That
>> means that when you synchronously create a Node, the Processor can also be
>> created synchronously (and its param descriptors examined) before the Node
>> is handed back.
>>
>> My misunderstanding shows that there is lots of room for interpretation
>> in these examples, I guess.
>>
>>
>>> > If we want these parameters to be instantly visible on a
>>> synchronously instantiated AudioWorkletNode of the corresponding kind, then
>>> param descriptors must be noted at the time that the Processor is
>>> registered (by calling registerAudioWorkletProcessor) and communicated back
>>> to somewhere that is immediately accessible from the main thread.
>>>
>>> I am not sure why the param descriptors must be available at the time of
>>> registration. As I mentioned above, the instantiation should happen
>>> on the main thread. So after the construction of an AWN, the parameters
>>> within will be available immediately.
>>>
>>
>> I was thinking that at least one AWP (not an AWN) had to be constructed
>> in advance, in order to determine the param descriptors. But if an AWP can
>> be instantiated synchronously with the construction of its corresponding
>> AWN, then the problem I've been describing goes away.
>>
>> ...Joe
>>
>
Received on Wednesday, 1 June 2016 23:18:03 UTC
