- From: Chris Wilson <cwilso@google.com>
- Date: Thu, 19 Jan 2012 14:34:25 -0800
- To: Joseph Berkovitz <joe@noteflight.com>
- Cc: public-audio@w3.org
- Message-ID: <CAJK2wqWyTo2Kp9PzKYQ6GgavZveFSQSD7M691=h9sWPsvdGOVw@mail.gmail.com>
I would suggest expanding UC_6's bullet: "novel music creation environments, e.g. beat grids or 'virtual instruments' interpreting touch gestures in real time to control sound generation" to more clearly cover implementing LIVE virtual instruments driven in real time by controllers, whether touch, mouse gestures, other inputs (e.g. via the Gamepad API), or direct controller input to the platform (i.e. MIDI inputs). As software synthesizers have been around for many years, this isn't particularly novel. Perhaps: "The implementation of virtual instruments, both live (e.g. driven by real-time inputs) and pre-programmed (beat grids or rendering of Standard MIDI Files)."

I'm particularly concerned because the further description seems slanted toward non-dynamically controlled synthesis - or am I reading too much into the suggestion of a 5-second synthesis window?

-Chris

On Wed, Jan 18, 2012 at 9:47 PM, Joseph Berkovitz <joe@noteflight.com> wrote:

> I have posted the new use case 6 to the wiki for virtual instrument
> synthesis, and revised UC 5 a bit. Please see:
>
> http://www.w3.org/2011/audio/wiki/Use_Cases_and_Requirements#UC_6:_synthesis_of_a_virtual_music_instrument
> http://www.w3.org/2011/audio/wiki/Use_Cases_and_Requirements#UC_5:_writing_music_on_the_web
>
> I imagine we are all very busy between now and the F2F, but feedback is
> of course extremely welcome.
>
> Best,
>
> ... . . . Joe
>
> *Joe Berkovitz*
> President
>
> *Noteflight LLC*
> 84 Hamilton St, Cambridge, MA 02139
> phone: +1 978 314 6271
> www.noteflight.com
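P.S. To make the "live" vs. "pre-programmed" distinction concrete, below is a rough sketch of the kind of instrument I mean: sound is synthesized the instant an input event arrives, rather than being rendered ahead of time into a buffer. This assumes the AudioContext/OscillatorNode surface of the current drafts; the method names, envelope values, and pitch mapping are illustrative only, not normative.

```typescript
// Sketch of a live virtual instrument: each pointer event starts a note
// immediately, so latency is bounded by the audio callback, not by any
// pre-rendering window. Assumes the standard Web Audio API surface
// (AudioContext, OscillatorNode, GainNode); details are illustrative.

const ctx = new AudioContext();

function playNote(frequency: number): void {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();

  osc.frequency.value = frequency;
  osc.connect(gain);
  gain.connect(ctx.destination);

  // Simple attack/release envelope so the note doesn't click.
  const now = ctx.currentTime;
  gain.gain.setValueAtTime(0, now);
  gain.gain.linearRampToValueAtTime(0.8, now + 0.01);
  gain.gain.exponentialRampToValueAtTime(0.001, now + 1.0);

  osc.start(now);
  osc.stop(now + 1.0);
}

// Map horizontal pointer position to one octave above A4 (440-880 Hz).
document.addEventListener("pointerdown", (e: PointerEvent) => {
  void ctx.resume(); // contexts may start suspended until a user gesture
  playNote(440 * Math.pow(2, e.clientX / window.innerWidth));
});
```

The same handler could just as easily be fed by Gamepad polling or (eventually) MIDI input; the point is that the synthesis graph is built and started per-event, not batch-rendered.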
Received on Thursday, 19 January 2012 22:34:55 UTC