- From: Marcus Geelnard <mage@opera.com>
- Date: Wed, 23 May 2012 10:17:05 +0200
- To: robert@ocallahan.org, "Chris Wilson" <cwilso@google.com>
- Cc: "Colin Clark" <colinbdclark@gmail.com>, "Chris Rogers" <crogers@google.com>, "Jussi Kalliokoski" <jussi.kalliokoski@gmail.com>, public-audio@w3.org, "Alistair MacDonald" <al@signedon.com>
On 2012-05-23 01:46:06, Chris Wilson <cwilso@google.com> wrote:

> One question - "exposing the behaviour of built-in AudioNodes in a manner
> that authors of JavaScriptAudioNodes can harness" sounds like subclassing
> those nodes to me, which isn't the same thing as providing only
> lower-level libraries (like FFT) and asking developers to do the hook-up
> in JS nodes. What's the desire here?

I think the cleanest and most useful approach would be to provide functions/classes independent of the Audio API, so that you can use them in any way you want, including applications other than audio. For instance, compare this to how typed arrays originally emerged from WebGL (they were a requirement for making WebGL work) but have since found widespread use in many other applications.

/Marcus
Received on Wednesday, 23 May 2012 08:18:02 UTC