Re: Questions about AudioParam

Hi Ehsan, happy to answer your questions:

On Wed, Oct 24, 2012 at 3:45 PM, Ehsan Akhgari <ehsan.akhgari@gmail.com> wrote:

> I'm implementing AudioParam in Gecko, and one of the things that I'm
> missing is what the point behind the minValue and maxValue properties is.
> The spec explicitly says that those bounds values should not be enforced
> when setting the value property, and WebKit doesn't seem to enforce them
> anywhere at all.  Are they intended to be merely informational?  In which
> case, should we consider removing them?  Or are implementers supposed to
> enforce them in the automation methods?
>

Well, they're really merely informational.  We can discuss removing them on
the list.  I think they do provide *some* value, but implementors don't
really need to do anything at all with that information other than expose
the read-only values...
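
As a rough sketch of what "merely informational" means in practice (this
assumes a GainNode's gain param; the actual minValue/maxValue numbers are
implementation-defined), the bounds can be read, but per the spec text
quoted above, setting .value outside them is not clamped and does not throw:

  // Sketch only: bounds are informational, per the spec text quoted above.
  const ctx = new AudioContext();
  const gain = ctx.createGain();

  console.log(gain.gain.minValue, gain.gain.maxValue); // read-only bounds

  // Not enforced: accepted as-is, with no clamping and no exception.
  gain.gain.value = 10;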


>
> Another question that I have is with regard to the semantics of the value
> setter.  I'm not exactly sure how I should reconcile the fact that the
> author can set AudioParam.value on an AudioParam object which has a number
> of events set up.  Should that value override the value that the automation
> introduces for any given time?  (The spec seems to suggest that by saying
> "If the .value attribute is set after any automation events have been
> scheduled, then these events will be removed.")  However, WebKit seems to
> use AudioParam.value to override defaultValue: it initializes an m_value
> member to defaultValue when the AudioParam object gets created and then
> uses m_value as the default value from then on.  This effectively means
> that if the author sets the value property on an AudioParam, that value
> will be used wherever the default value is needed, and
> AudioParam.defaultValue will always return the initial default value.  If
> that is the desired semantics, we should clarify the spec here.  Also, I'm
> not sure that having the defaultValue getter would make much sense in that
> case, since the default value would effectively be AudioParam.value at the
> time the AudioParam object first gets created.
>
>
Basically, setting the .value attribute when there are timeline events is
an anomaly which isn't really useful or fully specified.  One option is to
simply ignore the value (and not throw an exception).  We can discuss that
on the list, but that's what I would do, even if WebKit is not doing exactly
this.  I'm not aware of any developer code that sets .value while there are
timeline events.
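
To make that concrete, here's a sketch of the behavior I'm suggesting (an
assumption about possible future behavior, not what WebKit does today):

  const ctx = new AudioContext();
  const gain = ctx.createGain();

  gain.gain.setValueAtTime(0.0, ctx.currentTime);
  gain.gain.linearRampToValueAtTime(1.0, ctx.currentTime + 2);

  // Under the option described above, this assignment would simply be
  // ignored because timeline events are already scheduled -- no exception,
  // and the scheduled ramp keeps running.
  gain.gain.value = 0.5;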


> The last question that I have is about the computedValue property.  It's
> not entirely clear to me whether reading it should calculate the current
> automation value based on the current time of the AudioContext.  That would
> probably introduce the need for some locking if audio is being processed on
> a background thread, which would be a little unfortunate.  Also, it's not
> clear to me what should be calculated as the second contributing component
> if the AudioParam object is not connected to a node.
>
>
WebKit hasn't yet implemented this attribute, but I'm thinking that each
AudioParam will keep track of the latest .computedValue, which is computed
every render quantum in the audio thread.  Then the main JS thread can
simply read the latest value of this attribute.  WebKit *is* basically
computing this value every render quantum, but doesn't store it away to be
read by the attribute...
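
A rough sketch of that bookkeeping (engine-side pseudocode written as
TypeScript for readability; the hook name and the intrinsic/input split are
made up for illustration, and a real implementation would need an atomic or
otherwise thread-safe store for the hand-off between threads):

  class AudioParamImpl {
    private latestComputedValue = 0;  // written by the audio thread

    // Hypothetical hook: called once per render quantum (audio thread).
    onRenderQuantum(intrinsicValue: number, inputMix: number): void {
      // intrinsicValue: .value or the current automation curve value
      // inputMix: mix of any AudioNode outputs connected to the param
      this.latestComputedValue = intrinsicValue + inputMix;
    }

    // The .computedValue getter on the main JS thread just returns the
    // most recently stored value, so no per-read locking is required.
    get computedValue(): number {
      return this.latestComputedValue;
    }
  }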

The second contributing component would be 0 if there are no AudioNodes
connected to the AudioParam -- it's basically considered neutral silence.
I kind of imply that in my wording by saying that it mixes "audio data from
any AudioNode output connected to", so if there are *no* connections then
the mix is 0 -- dead silence.
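
So, for example (a hypothetical setup just to illustrate the two cases):

  const ctx = new AudioContext();
  const gain = ctx.createGain();

  // Nothing connected to gain.gain: the second component is 0, so the
  // computed value is just the intrinsic (value/automation) value.

  // With an LFO connected to the param, the second component is the mixed
  // audio from that connection:
  const lfo = ctx.createOscillator();
  const lfoGain = ctx.createGain();
  lfo.connect(lfoGain);
  lfoGain.connect(gain.gain);
  lfo.start();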

Cheers,
Chris


> Thanks!
> --
> Ehsan
> <http://ehsanakhgari.org/>
>

Received on Wednesday, 24 October 2012 23:26:46 UTC