Re: Some thoughts

Hi Jussi,

I agree with much of what you have to say (having built a modular
synth project of my own, though not in JS) and have raised a number
of the same points, minus the emphasis on MIDI support.

Since then, however, I've become convinced that it's best to leave
room for expansion in most of these areas, in order to get a stable
specification out there and accepted. In the meantime, it is possible
to code up an approach that provides smooth parameter changes across
buffer fills (a sketch follows below), and it's possible to code
adapters that bridge between MIDI and the current API. For what it's
worth, my opinion is that making the Audio and Param connection
approach simpler and more consistent is the highest priority right now.
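
For example, here's a minimal sketch of the smoothing idea, assuming
AudioParam offers value-scheduling methods along these lines (the
method names are my assumption, not necessarily the current draft):

    // Instead of writing param.value directly from a UI handler (which
    // only takes effect at the next buffer fill), schedule a short
    // linear ramp so the change is interpolated sample by sample.
    function setParamSmoothly(context, param, target, rampSeconds) {
      var now = context.currentTime;
      param.cancelScheduledValues(now);  // drop any pending automation
      param.setValueAtTime(param.value, now);
      param.linearRampToValueAtTime(target, now + (rampSeconds || 0.02));
    }

    // e.g. from a slider handler:
    // setParamSmoothly(context, filter.frequency, parseFloat(slider.value), 0.02);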

All the best,

... .  .    .       Joe

Joe Berkovitz
President
Noteflight LLC
160 Sidney St, Cambridge, MA 02139
phone: +1 978 314 6271
www.noteflight.com


On Dec 3, 2010, at 8:52 AM, Jussi Kalliokoski wrote:

> Hey guys,
>
> Lately I've invested a lot of time and thought in research on our
> Web Audio API, and there are some things that are bothering me, so
> I thought I'd bring them up. But first I'd like to say I really
> appreciate the work that's been done here; it's awesome, and I
> actually hope you'll prove my points wrong.
>
> So, I'll just start listing things:
>
> First: Synchronization. Say I have an AudioParam that is being
> modulated with an AudioCurve. Cool. But what if I want to add a UI
> that controls it? From the specification it seems to me that the
> parameter would actually change only each time the AudioContext
> asks for a buffer, so if I, for example, move a slider, the value
> changes get collapsed into a single jump at the next buffer fill,
> which would introduce audible stepping if I change, say, a cutoff
> parameter slowly. Am I mistaken? A sketch of the situation I mean
> follows; this is also a big concern for MIDI events, which brings
> me to my next point.
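>
> To make the concern concrete, here is the obvious slider handler
> (hypothetical node and param names, just for illustration):
>
>     // Every value written between two buffer callbacks collapses into
>     // a single jump at the next buffer boundary, so a slow slider drag
>     // becomes a staircase: audible zipper noise.
>     slider.addEventListener('change', function () {
>       filter.frequency.value = parseFloat(slider.value);
>     }, false);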
>
> MIDI. Yes, we can always create VMKBDs or cool touch interfaces,
> but if this is to be used in music production, MIDI is a must. I
> understand we cannot yet achieve support for external MIDI devices,
> but it's going to come sooner or later. So I'm saying, let's not
> cripple the system from the beginning; we've seen too many examples
> of that in the audio world. I think we want this to be as ready for
> future possibilities as possible, and that, for me, means
> implementing built-in support for MIDI events, even though we can't
> yet receive them from hardware. There are already in-browser VMKBD
> implementations and MIDI file readers that could use it, and the
> support would then already be there when we get the actual devices,
> instead of forcing developers to change the whole architecture they
> built on, as VST and DirectX have done. A sketch of what I mean
> follows.
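>
> As a sketch of what built-in MIDI events could look like (all names
> here are hypothetical; nothing like this is in the draft yet):
>
>     // A note-on arriving from a VMKBD or a MIDI file reader is handled
>     // as a raw 3-byte message and routed into the graph with a context
>     // timestamp, so it can eventually be made sample-accurate.
>     function handleMidiMessage(context, voice, bytes) {
>       var status = bytes[0] & 0xF0;
>       if (status === 0x90 && bytes[2] > 0) {  // note on, velocity > 0
>         var freq = 440 * Math.pow(2, (bytes[1] - 69) / 12);
>         voice.noteOn(context.currentTime, freq, bytes[2] / 127);
>       }
>     }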
>
> The third thing is that right now we have MODULES that are
> connectible; the ideal situation, IMO, would be that we don't
> connect modules, we connect ports, just like in analog audio. Say
> there are three port types (Audio, MIDI and Param), each with
> outputs and inputs that can be connected. This, for me, is a much
> more flexible and modular environment, and I think it's something
> we should aim for in our work. You can see what I mean by visiting
> my Modular Synth project at http://niiden.com/jstmodular/ (FF4
> only), or in the rough sketch below.
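>
> A rough sketch of the port idea (a purely hypothetical API, just to
> make the proposal concrete):
>
>     // Modules expose typed ports; connections are made between ports
>     // of a matching type, never between whole modules.
>     var osc    = new Module('Oscillator');  // ports: audioOut, freqIn (Param)
>     var filter = new Module('LowPass');     // ports: audioIn, audioOut, cutoffIn (Param)
>     var lfo    = new Module('LFO');         // ports: paramOut
>
>     osc.ports.audioOut.connect(filter.ports.audioIn);    // Audio -> Audio
>     lfo.ports.paramOut.connect(filter.ports.cutoffIn);   // Param -> Param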
>
> I know all this seems a little late after all the hard and great
> work Chris has done and everyone here has agreed upon, but I really
> resist the idea of shipping a system that is already... outdated
> (sorry) on its release.
>
> Best Regards
> Jussi Kalliokoski
>
> P.S. Please don't hate me for this; I felt I had to bring it
> up. :/ I hope this is taken as constructive criticism and as a
> starting point for further discussion.
>
>

Received on Friday, 3 December 2010 14:31:00 UTC