W3C home > Mailing lists > Public > public-audio@w3.org > July to September 2013

Fwd: Sites using webkitAudioContext only

From: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Date: Thu, 4 Jul 2013 12:32:21 +0300
Message-ID: <CAJhzemUeD4AKUM6U37iPiEwFWxwd3QeRkfvaYwxwOuK3K9rqig@mail.gmail.com>
To: "public-audio@w3.org" <public-audio@w3.org>
Oops, we accidentally went offlist with Roc:

---------- Forwarded message ----------
From: Jussi Kalliokoski <jussi.kalliokoski@gmail.com>
Date: Wed, Jul 3, 2013 at 7:44 PM
Subject: Re: Sites using webkitAudioContext only
To: Robert O'Callahan <robert@ocallahan.org>


I liked your Media Streams Processing API proposal, but obviously we won't
be going back to that, so here is my list of the corrections we need to
make, at the very least:

AudioContext#createBuffer():
 * new AudioBuffer(sequence<Float32Array> data, sampleRate). This will
avoid the synchronous memory allocation so authors can even offload the
creation of the actual buffers to Web Workers. It also helps avoid an extra
copy if you already have the data when you create the buffer.
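To sketch what that constructor could look like, here's a toy stand-in
(ProposedAudioBuffer is a class I made up for illustration, not the real
interface) showing the key property: the buffer adopts the arrays you hand
it instead of allocating and copying.

```javascript
// Minimal stand-in for the proposed constructor semantics.
// ProposedAudioBuffer is hypothetical, not the spec's AudioBuffer.
class ProposedAudioBuffer {
  constructor(channelData, sampleRate) {
    // Adopt the caller's Float32Arrays rather than allocating + copying;
    // they could have been produced anywhere, e.g. in a Web Worker.
    this.sampleRate = sampleRate;
    this.numberOfChannels = channelData.length;
    this.length = channelData[0].length;
    this._channels = channelData;
  }
  getChannelData(channel) {
    return this._channels[channel];
  }
}

// Usage: wrap already-built channel data without an extra copy.
const left = new Float32Array([0, 0.5, 1]);
const right = new Float32Array([1, 0.5, 0]);
const buffer = new ProposedAudioBuffer([left, right], 44100);
```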

AudioContext#decodeAudioData():
 * At the very least this should return a promise instead of taking the
callbacks as arguments.
 * Is there some way we could integrate this into AudioElement? e.g.
Promise AudioElement#decode(). This would make the normal asset-loading
pipeline simpler as well.
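The promise shape is something authors can already approximate today by
wrapping the callback form; a small sketch (the wrapper name is my own,
and `context` is anything exposing the callback-based decodeAudioData):

```javascript
// Wrap the callback-style decodeAudioData(data, onSuccess, onError)
// in a promise, which is roughly what the corrected API would return.
function decodeAudioDataAsync(context, encodedData) {
  return new Promise((resolve, reject) => {
    context.decodeAudioData(encodedData, resolve, reject);
  });
}

// Usage:
// const audioBuffer = await decodeAudioDataAsync(audioContext, arrayBuffer);
```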

AudioNodes:
 * Use constructors.
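To illustrate the contrast (GainNodeSketch is a hypothetical stand-in
class, not the real GainNode interface):

```javascript
// Constructor-style node creation, as opposed to the factory style
// (context.createGain()). GainNodeSketch is hypothetical.
class GainNodeSketch {
  constructor(context, options = {}) {
    this.context = context;
    // Options let authors configure the node at construction time.
    this.gain = { value: options.gain !== undefined ? options.gain : 1 };
  }
}

// Factory style (current spec):  const gain = context.createGain();
// Constructor style (proposed):
const fakeContext = {};
const gain = new GainNodeSketch(fakeContext, { gain: 0.5 });
```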

AudioContext#destination:
 * Use AudioElement here as well? Either assign a MediaStream as the
`src`, or even better, make AudioElement and MediaStream valid connection
targets, e.g. myGainNode.connect(myAudioElement), and there we have all
that's required to make an audio stream audible. The AudioElement would
act as the sink here, so if for example pause() is called, it would stop
pulling in content. (That would satisfy the much-requested pause
functionality as well.)
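A toy model of that sink idea, under the assumption that the element
drives the pulls (every class and method name here is hypothetical, just
to show how pause() would naturally stop the graph):

```javascript
// Hypothetical pull-based model: the element is the sink and only
// pulls from its connected source while playing.
class SourceSketch {
  pull(frames) {
    return new Float32Array(frames); // silence, for illustration
  }
}

class AudioElementSketch {
  constructor() {
    this.paused = true;
    this._source = null;
  }
  connectFrom(node) { this._source = node; } // myGainNode.connect(element)
  play() { this.paused = false; }
  pause() { this.paused = true; }
  // Driven by the audio clock; a paused element stops pulling content.
  renderQuantum(frames) {
    if (this.paused || !this._source) return null;
    return this._source.pull(frames);
  }
}

const element = new AudioElementSketch();
element.connectFrom(new SourceSketch());
element.play();
const rendered = element.renderQuantum(128); // a Float32Array of 128 frames
element.pause();
const afterPause = element.renderQuantum(128); // null: nothing is pulled
```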

AudioContext#listener:
 * First of all, I don't think spatialization should be part of the first
release, but aside from that, either:
 * Each spatialization node has its own listener, or
 * AudioElement and MediaStream have a listener associated with them.

AudioContext#createPeriodicWave():
 * Use a constructor.
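Same shape as above: instead of context.createPeriodicWave(real, imag), a
constructor taking the coefficient arrays directly (PeriodicWaveSketch is
a made-up stand-in, not the spec interface):

```javascript
// Hypothetical constructor form of createPeriodicWave(real, imag).
class PeriodicWaveSketch {
  constructor(real, imag) {
    if (real.length !== imag.length) {
      throw new RangeError("real and imag must have the same length");
    }
    this.real = real; // cosine terms
    this.imag = imag; // sine terms
  }
}

const wave = new PeriodicWaveSketch(
  new Float32Array([0, 1]),
  new Float32Array([0, 0])
);
```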

Cheers,
Jussi


On Wed, Jul 3, 2013 at 11:04 AM, Robert O'Callahan <robert@ocallahan.org> wrote:

> On Wed, Jul 3, 2013 at 7:39 PM, Jussi Kalliokoski <
> jussi.kalliokoski@gmail.com> wrote:
>
>> 1. Web Audio API is not webby: I agree, and have been ranting about this
>> enough already. Let's fix this.
>>
>
> Do you have any particular proposals in mind?
>
> Rob
Received on Thursday, 4 July 2013 09:32:48 UTC
