- From: Marcus Geelnard <mage@opera.com>
- Date: Tue, 24 Jul 2012 14:53:34 +0200
- To: "Jussi Kalliokoski" <jussi.kalliokoski@gmail.com>, "Matthew Paradis" <matthew.paradis@bbc.co.uk>
- Cc: public-audio@w3.org
Hi Matt!

Thanks for the feedback!

On 2012-07-24 14:36:40, Matthew Paradis <matthew.paradis@bbc.co.uk> wrote:

> Some high level feedback...
>
> This fills in a lot of the holes in the web audio api for more advanced
> developers. The majority of users may not be interested in these
> functions

I think the natural outcome of shipping a DSP API like this would be that advanced developers write (more or less) advanced signal synthesis/processing nodes using this functionality, and then "less advanced" developers re-use that work (e.g. as JS libs). Just look at the plethora of VST plugins and the like.

> In an ideal world I would like to see this functionality included in the
> web audio api and not developed separately. Having multiple apis that
> provide the building blocks for sound generation and processing is messy
> and confusing (in my opinion).

I tend to lean that way too. The functionality should be possible to access without using the AudioContext, but without tying the DSP API to the Audio API there is a significant risk that, in order to support non-Audio API use cases, it would have to be extended to the point where it becomes hugely bloated, impractical to implement, and superseded by WebCL and friends before it gets any traction anyway.

/Marcus

--
Marcus Geelnard
Core Graphics Developer
Opera Software ASA
Received on Tuesday, 24 July 2012 12:54:11 UTC
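A minimal sketch of the layering Marcus describes, using hypothetical names (`mul`, `TremoloProcessor`) that are not part of the DSP API proposal or the Web Audio API; the stand-in `mul` only illustrates the kind of primitive such an API would expose, not its actual interface:

```typescript
// Layer 1: a low-level primitive an "advanced" developer would normally get
// from a DSP API (hypothetical stand-in implementation shown here).
function mul(out: Float32Array, a: Float32Array, b: Float32Array): void {
  for (let i = 0; i < out.length; i++) {
    out[i] = a[i] * b[i];
  }
}

// Layer 2: a reusable processing block built on top of the primitive,
// the kind of thing that could be published and shared as a JS lib.
class TremoloProcessor {
  private phase = 0;
  constructor(private rateHz: number, private sampleRate: number) {}

  process(out: Float32Array, input: Float32Array): void {
    // Build a low-frequency modulation envelope...
    const lfo = new Float32Array(input.length);
    for (let i = 0; i < lfo.length; i++) {
      lfo[i] = 0.5 + 0.5 * Math.sin(this.phase);
      this.phase += (2 * Math.PI * this.rateHz) / this.sampleRate;
    }
    // ...and apply it sample-wise with the DSP primitive.
    mul(out, input, lfo);
  }
}

// Layer 3: a "less advanced" developer only touches the packaged block.
const tremolo = new TremoloProcessor(5, 44100);
const inBuf = new Float32Array(128).fill(1);
const outBuf = new Float32Array(128);
tremolo.process(outBuf, inBuf);
```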