- From: <bugzilla@jessica.w3.org>
- Date: Tue, 11 Dec 2012 21:21:20 +0000
- To: public-audio@w3.org
https://www.w3.org/Bugs/Public/show_bug.cgi?id=17335

--- Comment #15 from Chris Rogers <crogers@google.com> ---

(In reply to comment #14)
> Is there anything against the idea of having separate interpolator objects?
> I start to realize that a lot of the time you want to make a choice *if* and
> *how* you want to interpolate. So why not have interpolation as a separate
> module that you can plug in front of an AudioParam.
> Or is this a crazy idea? :)

I don't think there's any simple way to generically abstract an interpolator object to work at the level of the modules in the Web Audio API. Marcus has suggested a much lower-level approach with his math library, but that assumes a processing model very different from the "fire and forget" model we have here.

Even if there were a way to simply create an interpolator object and somehow attach it to nodes (which I don't think there is), for the 99.99% case developers don't want to have to worry about such low-level details for things like "play a sound now". I've tried to design the AudioNodes so that they all have reasonable default behavior, trading off quality against performance. An attribute for interpolation quality seems like a simple way to extend that default behavior without requiring developers to deal with interpolator objects all the time.

-- 
You are receiving this mail because:
You are on the CC list for the bug.
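For concreteness, here is a minimal sketch of the two styles under discussion: the existing "fire and forget" pattern with built-in defaults, and a node-level quality attribute. The `interpolation` property and its `'low' | 'medium' | 'high'` values are hypothetical, illustrating the proposal only; they are not part of the Web Audio API.

```ts
const ctx = new AudioContext();

// Today: reasonable defaults, no interpolator objects needed.
// AudioParam automation already picks the interpolation curve for you.
const osc = ctx.createOscillator();
osc.frequency.setValueAtTime(440, ctx.currentTime);
osc.frequency.linearRampToValueAtTime(880, ctx.currentTime + 1.0);
osc.connect(ctx.destination);
osc.start();                        // "play sound now"
osc.stop(ctx.currentTime + 1.0);

// Hypothetical extension (the attribute suggested in this comment):
// a single quality knob on the node, keeping the fire-and-forget model.
interface InterpolatingSourceNode extends AudioBufferSourceNode {
  interpolation?: 'low' | 'medium' | 'high'; // assumed attribute, not in the spec
}

const src = ctx.createBufferSource() as InterpolatingSourceNode;
src.interpolation = 'high';         // one line instead of wiring up an interpolator object
// src.buffer = decodedBuffer;      // buffer assignment omitted here
src.connect(ctx.destination);
src.start();
```

The common path stays at a few lines, while the quality attribute is available for developers who care about resampling fidelity.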
Received on Tuesday, 11 December 2012 21:21:22 UTC