W3C home > Mailing lists > Public > public-audio@w3.org > July to September 2012

Re: Reflections on writing a sequencer

From: Raymond Toy <rtoy@google.com>
Date: Thu, 26 Jul 2012 16:24:30 -0700
Message-ID: <CAE3TgXGF2afn1FfBhYJ7=C9i7eZcf3KXvF-j6U_W8oPY=+ivNg@mail.gmail.com>
To: r baxter <baxrob@gmail.com>
Cc: Stuart Memo <stuartmemo@gmail.com>, Adam Goode <agoode@google.com>, Peter van der Noord <peterdunord@gmail.com>, public-audio@w3.org
On Thu, Jul 26, 2012 at 4:06 PM, r baxter <baxrob@gmail.com> wrote:

> Aha, yeah that's what I thought.  :-)
>
> How about playbackTime?
>

A peek at the code in chrome indicates that playbackTime isn't available
yet.

Filing a bug would probably raise the priority.

Ray


>
> On Thu, Jul 26, 2012 at 4:04 PM, Stuart Memo <stuartmemo@gmail.com> wrote:
> > Sorry, I actually meant that the other way round! Dev and canary - yes.
> >
> >
> > On 26 July 2012 23:58, Stuart Memo <stuartmemo@gmail.com> wrote:
> >>>
> >>> Can anyone tell me if AudioProcessingEvent.playbackTime or
> >>> Oscillator.noteOn are implemented in the Chrome beta, dev, or canary
> >>> channels?  I like to play with this stuff empirically, and neither of
> >>> those exist in Chrome stable...
> >>
> >>
> >> Oscillator.noteOn is only available in the stable and beta releases I
> >> believe. In dev and canary an oscillator runs without noteOn starting
> it.
> >> You can simply use the following to stop it throwing any errors...
> >>
> >> if (typeof oscillator.noteOn !== 'undefined') {
> >>     oscillator.noteOn(0);
> >> }
> >>
> >> Hope that helps!
> >>
> >> - Stuart
> >>
> >> On 26 July 2012 23:45, r baxter <baxrob@gmail.com> wrote:
> >>>
> >>> Can anyone tell me if AudioProcessingEvent.playbackTime or
> >>> Oscillator.noteOn are implemented in the Chrome beta, dev, or canary
> >>> channels?  I like to play with this stuff empirically, and neither of
> >>> those exist in Chrome stable...
> >>>
> >>> By my testing, AudioProccessingEvents are sample accurate relative to
> >>> one another (see: http://jsfiddle.net/eZPJh/), and I think this is the
> >>> intent of the spec draft (says, bufferSize "controls how frequently
> >>> the onaudioprocess event handler is called and how many sample-frames
> >>> need to be processed").
> >>>
> >>> If I'm right about this, I think it's preferable to polling in a
> >>> busy-loop with either setTimeout/setInterval (ugly ~+/-50ms slop in my
> >>> experience - really bad / error-prone for audio), or
> >>> requestAnimationFrame (presumably sample-accurate, but a more complex
> >>> idiom, and outside of the audio API).  ... A drawback to this would be
> >>> forcing the use of jsNode, which seems like a leap if one just wants
> >>> to schedule start/stop of oscillators/audioBuffers and parameter
> >>> automations.
> >>>
> >>> Frankly ... if I had my druthers, I'd like to be able to do something
> >>> like this:
> >>> var scheduler = ctx.createAudioScheduler(schedulingRate, callback);
> >>> and then use ... roughly:
> >>> var eventList = [...]; // list of (event procedure, relative event
> time)
> >>> tuples
> >>> function callback(evt) {
> >>>     for (var i in eventList) {
> >>>         var eventTime = eventList[i].time + evt.playbackTime;
> >>>         ctx.callbackAtTime(eventList[i].proc, eventTime);
> >>>     }
> >>> }
> >>> ...
> >>> scheduler.start();
> >>> scheduler.pause();
> >>> scheduler.reset();
> >>> ... etc
> >>>
> >>> I realize that this is arguably a crazy suggestion, but it could
> >>> afford arbitrary nesting of event schedules.
> >>>
> >>> Thoughts?  Curses?
> >>>
> >>> Cheers,
> >>> Roby
> >>>
> >>> On Thu, Jul 26, 2012 at 8:02 AM, Adam Goode <agoode@google.com> wrote:
> >>> > I think you would do node.noteOn(e.playbackTime +
> >>> > (samplesWrittenThisCallback / sampleRate)).
> >>> >
> >>> >
> >>> > On Thu, Jul 26, 2012 at 10:56 AM, Peter van der Noord
> >>> > <peterdunord@gmail.com> wrote:
> >>> >>
> >>> >> can you give an example?
> >>> >>
> >>> >> Let's say i am in my buffer-write loop (in response to an
> >>> >> AudioProcessingEvent), and at a certain point in that loop (i may or
> >>> >> may not
> >>> >> have written a number of values already) i want to call note-on on
> >>> >> another
> >>> >> node to be fired exactly at the same time that the buffervalue i'm
> >>> >> writing
> >>> >> (or about to write) would reach the soundcard. how would that work?
> >>> >>
> >>> >> at least, that's what i understand i can do then...?
> >>> >>
> >>> >>>
> >>> >>> An AudioProcessingEvent exposes the exact time of the audio to be
> >>> >>> generated in the sample stream as the "playbackTime" attribute.
>  Not
> >>> >>> that
> >>> >>> this makes callbacks any more useful as a source of exact timing,
> but
> >>> >>> it
> >>> >>> does mean that there is no need to keep track of time in separate
> >>> >>> variables.
> >>> >
> >>> >
> >>>
> >>
> >
>
>
Received on Thursday, 26 July 2012 23:24:59 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Thursday, 26 July 2012 23:24:59 GMT