Re: [web-audio-api] (setValueCurveAtTime): AudioParam.setValueCurveAtTime (#131)

> [Original comment](https://www.w3.org/Bugs/Public/show_bug.cgi?id=17335#5) by Marcus Geelnard (Opera) on W3C Bugzilla. Thu, 06 Dec 2012 08:38:35 GMT

(In reply to [comment #5](#issuecomment-24244407))
> Since it is an audio rate controller you should always see it as a signal
> and apply signal theory.

True, but in this case I think the real use case is quite low-frequency signals (like various forms of ramps that run for at least 20 ms or so). For those scenarios, band-limiting should not be necessary. As long as the spec mandates a specific interpolation method (e.g. nearest, linear or cubic spline), the user knows what to expect and will not try to do other things with it (like modulating a signal with a high-frequency waveform).
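To make the difference concrete, here is a minimal sketch (not from the spec; the function names are hypothetical) of how an implementation might expand a curve passed to setValueCurveAtTime into audio-rate samples with two of the candidate interpolation methods:

```typescript
// Hypothetical sketch: expanding a value curve to audio-rate samples.
// The spec text under discussion has not yet fixed which method to use.

// Nearest-neighbour: each output sample snaps to the closest curve point,
// producing a stepped result.
function nearestInterp(curve: Float32Array, outLength: number): Float32Array {
  const out = new Float32Array(outLength);
  for (let i = 0; i < outLength; i++) {
    const pos = (i / (outLength - 1)) * (curve.length - 1);
    out[i] = curve[Math.round(pos)];
  }
  return out;
}

// Linear: each output sample is a weighted mix of the two surrounding curve
// points, producing a piecewise-linear result.
function linearInterp(curve: Float32Array, outLength: number): Float32Array {
  const out = new Float32Array(outLength);
  for (let i = 0; i < outLength; i++) {
    const pos = (i / (outLength - 1)) * (curve.length - 1);
    const i0 = Math.floor(pos);
    const i1 = Math.min(i0 + 1, curve.length - 1);
    const frac = pos - i0;
    out[i] = curve[i0] * (1 - frac) + curve[i1] * frac;
  }
  return out;
}
```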

Also, I think it's important that all implementations behave identically here, because different interpolation and filtering methods can lead to quite different results. For example, a 5-second fade-out would sound quite different with nearest interpolation than with cubic spline interpolation. In that respect, a simpler and more performance-friendly solution (like nearest or linear interpolation) is better, because it's easier to mandate for all implementations.
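For reference, a fade-out of that kind would typically be driven like this (a sketch only; the coarse curve values are made up). With so few curve points, the audible result depends heavily on how the implementation interpolates between them:

```typescript
// 5-second fade-out via setValueCurveAtTime on a GainNode's gain AudioParam.
const ctx = new AudioContext();
const gain = ctx.createGain();
gain.connect(ctx.destination);

// Coarse fade curve from full level down to silence (illustrative values).
const fadeCurve = new Float32Array([1.0, 0.7, 0.4, 0.2, 0.05, 0.0]);
gain.gain.setValueCurveAtTime(fadeCurve, ctx.currentTime, 5.0);
```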

---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-audio-api/issues/131#issuecomment-24244415

Received on Wednesday, 11 September 2013 14:35:59 UTC