Re: Some thoughts

On 12/03/2010 03:52 PM, Jussi Kalliokoski wrote:
> Hey guys,
>
> Lately I've invested really a lot of time and thought to the research of
> our Web Audio Api, and there are some things that are disturbing me, so
> I thought I'd bring them up. But first I'd like to say I really really
> appreciate the work that's been done here, it's awesome and I actually
> hope you'll prove my points to be wrong.
>
> So, I'll just start listing things:
>
> First: Synchronization. Say I have an AudioParam that is being modulated
> with an AudioCurve. Cool. What if I want to add a UI that controls it?
> According to the specification it would seem to me that it would
> actually change the parameter only every time the AudioContext asks for
> a buffer, so if I for example move a slider, the value changes get
> stacked up to the next buffer change, which would introduce audible
> edges in the parameter change, if I change the cutoff parameter slowly,
> ie. Am I mistaken? This is also a big concern for midi events, which
> brings me to my next point:
>
> MIDI. Yes, we can always create VMKBDs or cool touch interfaces, but if
> this is to be used in music production, MIDI is a must.
This is a good question. It is not quite clear to me what is the
target of the becoming Web Audio API (whatever it will look like).
Do we want to be able to implement DAWs?
I think the first step would be to bring some audio handling
capability to web and then in the later versions of the
spec new features could be added.


> I understand we
> cannot achieve support for external MIDI devices, but it's going to come
> sooner or later. So I'm saying, let's not make the system crippled from
> the beginning, we've seen too many examples of that in the audio area. I
> think we want this to be as ready to the future possibilities as is
> possible, that, for me, means implementing built-in support for MIDI
> events, even though we can't yet receive them. This would help with the
> fact that there are in-browser VMKBD implementations and MIDI file
> readers, and so that the support would already be there when we have the
> actual devices too, not forcing the developers to change the whole
> architechture they built on the last time (Like VST and DirectX have
> done).
>
> Third thing is that now we have MODULES that are connectible, however
> the ideal situation, IMO would be that we don't connect modules, we
> connect ports, just like in analog audio. Say there are three port
> types, Audio, Midi and Param and these all have outputs and inputs which
> can be connected. This for me, is a much more flexible and modular
> environment, which I think is something that we should achieve with our
> work. You can see what I mean by visiting my Modular Synth project at
> http://niiden.com/jstmodular/ (FF4 only).
>
> I know all this seems a little bit late after all the hard and great
> work Chris has done and everyone here has agreed upon, but I really
> resist the idea of making a system that is already... outdated (sorry)
> on it's release.
>
> Best Regards
> Jussi Kalliokoski
>
> P.S. Please don't hate me for this, I felt like I had to bring this
> up. :/ I would hope this is regarded as constructive criticism, and a
> place for further discussion.
>
Criticism is very welcome.
And note, there are just proposal APIs (webkit and mozilla), not even
W3C drafts.
Everything may still change.



-Olli

Received on Friday, 3 December 2010 14:04:48 UTC