W3C home > Mailing lists > Public > public-audio@w3.org > July to September 2013

Re: Behavior of source nodes on connect/disconnect

From: Robert O'Callahan <robert@ocallahan.org>
Date: Fri, 13 Sep 2013 12:35:14 -0700
Message-ID: <CAOp6jLa=gpaZ6dyKBEkYNdP_Rdu4gqByy0AYA1SxU-ghQr9Ryw@mail.gmail.com>
To: Katelyn Gadd <kg@luminance.org>
Cc: Karl Tomlinson <karlt+public-audio@karlt.net>, Jussi Kalliokoski <jussi.kalliokoski@gmail.com>, Chris Wilson <cwilso@google.com>, Raymond Toy <rtoy@google.com>, "public-audio@w3.org" <public-audio@w3.org>
On Fri, Sep 13, 2013 at 4:19 AM, K. Gadd <kg@luminance.org> wrote:

> I don't really see how you can fix it in this case other than by making an
> AudioNode + ScriptProcessorNode pair an uncollectable cycle that leaks
> forever if you release it.

That's what we do.

> That seems kinda awful. On the other hand, maybe it's an unusual enough
> use case that it's okay for it to work that way? Alternately you can just
> tolerate GC visibility and see if anyone pitches a fit about it... I don't
> really think it would be usable for any sort of attack.

To me the biggest problem with GC observability is that it can make app
behavior unintentionally depend on GC timing. And that is something that we
never ever want to have to standardize!!!
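A tiny plain-JS sketch of the hazard (hypothetical names; not the real API): if collecting an unreferenced-but-connected ScriptProcessorNode silenced its onaudioprocess callback, the audible output would depend on when GC happened to run.

```javascript
// Model of "GC-observable" audio: the node is unreferenced but not yet
// collected, and collection (if observable) would silence it mid-stream.
function render(blocks, gcAtBlock) {
  const out = [];
  let alive = true; // node still processing, pending collection
  for (let i = 0; i < blocks; i++) {
    if (i === gcAtBlock) alive = false; // GC runs here, node collected
    out.push(alive ? "tone" : "silence");
  }
  return out;
}

// Same program, different GC timing, different audible result:
console.log(render(4, 1)); // ["tone","silence","silence","silence"]
console.log(render(4, 3)); // ["tone","tone","tone","silence"]
```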

> These behaviors regarding detached nodes and time are surprising to me - I
> remember seeing slight mentions of them when I last read over the spec, but
> perhaps they should be called out more clearly in an appendix of some sort
> that describes the set of behaviors like this (AudioBufferSourceNodes
> playing while disconnected, audioprocess events firing while disconnected,
> etc.)
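To illustrate the first of those (a plain-JS model, not the real API): a started AudioBufferSourceNode's position in its buffer is a function of context time alone, so disconnecting its output has no effect on the timeline.

```javascript
// Model: playback position depends only on time, not on connection state.
function playbackOffset(startTime, currentTime) {
  return Math.max(0, currentTime - startTime);
}

let connected = true;
const startTime = 1.0;
console.log(playbackOffset(startTime, 1.5)); // 0.5 s into the buffer
connected = false; // disconnecting changes nothing below:
console.log(playbackOffset(startTime, 2.0)); // 1.0 s, still advancing
```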

The status quo is that the spec does not say output connection state
affects node processing, therefore it doesn't. It does not make sense to
cram every "X does not affect Y" pair into the spec. Where particular pairs
cause confusion, non-normative notes can be added, but normative text is
not needed.

> The intent is that these behaviors are normally not observed by
> developers, because all of a user's interactions with the audio API occur
> during a single event loop turn, right? If an event loop turn lasts too
> long will they suddenly become observable, or does mixing stop/continue
> using old state? That is, if I disconnect an AudioBufferSourceNode, then
> spin (blocking the content thread) for 500ms, is the disconnection
> observable by someone listening to mixer output, or does it not become
> 'real' until the next event loop turn? To make this contrived case more
> real, if I were to connect a new AudioBufferSourceNode and then immediately
> block the content thread doing some sort of synchronous operation (asm.js
> compile, sync xhr, sync buffer decode, etc) would the user be able to hear
> the new AudioBufferSourceNode while the content thread is stalled? Or would
> the connection not materialize until the thread recovers, at which point
> the entire buffer may have completed its (imaginary) playback? Can GC
> pauses cause the same sort of problem and result in audio glitching?

In Gecko all changes to the Web Audio graph take effect atomically at the
next stable state (i.e. before the next turn of the event loop), as far as
audio processing is concerned.

I think this is important and worth specifying but it's a different issue
to this thread.
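A minimal plain-JS model of that rule (class and method names are hypothetical): graph mutations made during a turn are queued, and the audio thread sees them land all at once at the stable state, never mid-turn.

```javascript
// Model of Gecko's behavior: graph changes commit atomically at the
// next stable state, as far as audio processing is concerned.
class GraphModel {
  constructor() {
    this.committed = new Set(); // what the audio thread "hears"
    this.pending = [];          // changes made during this event-loop turn
  }
  connect(node)    { this.pending.push({ op: "connect", node }); }
  disconnect(node) { this.pending.push({ op: "disconnect", node }); }
  // Runs once per turn, at the stable state: all queued changes land together.
  stableState() {
    for (const { op, node } of this.pending) {
      if (op === "connect") this.committed.add(node);
      else this.committed.delete(node);
    }
    this.pending = [];
  }
  audible(node) { return this.committed.has(node); }
}

const g = new GraphModel();
g.connect("source");
// Mid-turn (e.g. while the content thread blocks for 500 ms),
// the audio thread still renders the old graph:
console.log(g.audible("source")); // false
g.stableState();                  // turn ends
console.log(g.audible("source")); // true
```

So in Katelyn's example, the new AudioBufferSourceNode would not become audible while the content thread is stalled; the connection takes effect when the turn completes.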

Received on Friday, 13 September 2013 19:35:43 UTC
