Re: Requirements for Web audio APIs

On Mon, May 23, 2011 at 1:55 PM, Chris Rogers <crogers@google.com> wrote:

> On Sun, May 22, 2011 at 6:11 PM, Robert O'Callahan <robert@ocallahan.org> wrote:
>
>> On Mon, May 23, 2011 at 12:11 PM, Chris Rogers <crogers@google.com> wrote:
>>
>>> On Sun, May 22, 2011 at 2:39 PM, Robert O'Callahan <robert@ocallahan.org> wrote:
>>>
>>>> On Sat, May 21, 2011 at 7:17 AM, Chris Rogers <crogers@google.com> wrote:
>>>>
>>>>> On Thu, May 19, 2011 at 2:58 AM, Robert O'Callahan <robert@ocallahan.org> wrote:
>>>>>
>>>>>> My concern is that having multiple abstractions representing streams
>>>>>> of media data --- AudioNodes and Streams --- would be redundant.
>>>>>>
>>>>>
>>>>> Agreed, there's a need to look at this carefully.  It might be workable
>>>>> if there were appropriate ways to easily use them together even if they
>>>>> remain separate types of objects.  In graphics, for example, there are
>>>>> different objects such as Image, ImageData, and WebGL textures which have
>>>>> different relationships with each other.  I don't know what the right answer
>>>>> is, but there are probably various reasonable ways to approach the problem.
>>>>>
>>>>
>>>> There are reasons why we need to have different kinds of image objects.
>>>> For example, a WebGL texture has to live in VRAM, so it couldn't have its
>>>> pixel data manipulated by JS the way an ImageData object can. Are there
>>>> fundamental reasons why AudioNodes and Streams have to be different ... why
>>>> we couldn't express the functionality of AudioNodes using Streams?
>>>>
>>>
>>> I didn't say they *have* to be different.  I'm just saying that there
>>> might be reasonable ways to have AudioNodes and Streams work together. I
>>> could also turn the question around and ask whether we could express the
>>> functionality of Streams using AudioNodes.
>>>
>>
>> Indeed! One answer to that would be that Streams contain video, so
>> "AudioNode" isn't a great name for them :-).
>>
>> If they don't have to be different, then they should be unified into a
>> single abstraction. Otherwise APIs that work on media streams would have to
>> come in an AudioNode version and a Stream version, or authors would have to
>> create explicit bridges.
>>
>
> For connecting an audio source from an HTMLMediaElement into an audio
> processing graph using the Web Audio API, I've suggested adding an
> .audioSource attribute.  A code example with a diagram is in my proposal here:
>
> http://chromium.googlecode.com/svn/trunk/samples/audio/specification/specification.html#DynamicLifetime-section
>
> I'm fairly confident that this type of approach will work well for
> HTMLMediaElement.  Basically, it's a "has-a" design instead of "is-a".
>

Can you use that node as a sink, or just a source?
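
To make the question concrete, here's a rough sketch of what I mean. The
.audioSource attribute is from your proposal; the .audioSink attribute,
the createGainNode usage and the rest are just my assumptions for
illustration, not anything in your spec:

  var context = new AudioContext();
  var mediaElement = document.getElementById('player');

  // Source direction: pull the element's audio into the graph.
  var gain = context.createGainNode();
  mediaElement.audioSource.connect(gain);
  gain.connect(context.destination);

  // Sink direction: can processed audio be routed *into* the element,
  // say via some hypothetical .audioSink attribute?
  // gain.connect(mediaElement.audioSink);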

Aside: the "connect" method doesn't make it clear which way the data flows,
so it's hard to read your example without referring to the spec. Have you
considered flipping it around and calling it "addSource" instead, or
something like that?
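
To illustrate (a sketch only; "addSource" is just a straw-man name):

  // With connect(), the direction isn't obvious from the call site:
  // does data flow from a to b, or from b to a?
  a.connect(b);

  // Reading this, it's clear that b consumes data from a:
  b.addSource(a);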

> Similarly, for Streams I think the same type of approach could be
> considered.  I haven't looked very closely at the proposed media stream API
> yet, but would like to explore that in more detail.  If we adopt the "has-a"
> (instead of "is-a") design then the problem of AudioNode not being a good
> name for Stream disappears.
>

Yes, but the problem remains that we'd have two objects where one would do.

Rob
-- 
"Now the Bereans were of more noble character than the Thessalonians, for
they received the message with great eagerness and examined the Scriptures
every day to see if what Paul said was true." [Acts 17:11]
