Ways forward

I'm starting a new thread because the previous one has focused on exposing and debating various points, while I am looking for a conversation on process that might take us to some conclusions about those very points. I'm going to try to keep my opinions out of this for now. When I say something "may" be true, I simply mean that there are clearly differences of opinion in the group that need resolution.

To summarize, some of the questions are:

- Is this group trying to create the embryo of a video-processing or general-time-based-media API? This is one of the important drivers for the Streams-vs-Nodes issue, and Ian pointed out that Streams implicitly pull other media types into the realm of this API. This may expand the scope of the work done here, and may introduce additional concepts into the API.

- Identification of Streams and AudioNodes from an implementation's viewpoint. To some this seems obviously a good thing, to others not, and it feels fair to say that neither conclusion is a slam-dunk. It also feels fair to say that a lot of progress could be made without having to resolve this issue. Much of the meat of the API is independent of whether Stream is extended by (or identical to) AudioNode, wrapped by AudioNode, or disjoint from AudioNode.

- Is there an alternative approach to graph construction and hookup (setting aside the nodes-vs-streams question) that provides advantages over what we've been looking at to date? (See the first sketch after this list.)

- What are the costs and benefits of the different approaches to connecting processing graphs to DOM media elements? (See the second sketch after this list.)

- How does scheduling work for non-continuous media sources that are played at unique times or recur periodically? This is an important set of use cases for games, simulations and music applications, and one that seems rather distant from RTC concerns. As an example, a source could be a 10-millisecond burst of noise to be played every time some gameplay event occurs, the sound of one animated object bouncing against another, or a sample of a clarinet note to be started and then looped over a subrange for some arbitrary time interval. (See the third sketch after this list.)

- In games, simulations and music, media sources tend to be very numerous, quite short, and heavily overlapping. How are these qualities addressed by each API? What will the 'synthesizer.js' code mentioned in Rob's proposal look like? (See the fourth sketch after this list.)

- What are alternative thoughts on how common, essential audio transformations can be exposed by the API, including sample rate adjustment, amplitude envelopes, and sundry filters and mixers? How do they stack up against the AudioNode approach? (See the fifth sketch after this list.)
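
First sketch (graph construction and hookup). To make this concrete, here is roughly what hookup looks like under the AudioNode draft, followed by a stream-flavored equivalent. The second half is only my guess at the shape of Rob's MediaStreamAPI proposal; treat the ProcessedMediaStream/addInput names as illustrative, not authoritative.

    // AudioNode-style hookup, per the current draft
    // (prefixed as webkitAudioContext in today's builds):
    var ctx = new AudioContext();
    var source = ctx.createBufferSource();
    var filter = ctx.createBiquadFilter();
    source.connect(filter);                 // explicit node-to-node wiring
    filter.connect(ctx.destination);

    // Hypothetical stream-flavored hookup (names approximate):
    var filtered = new ProcessedMediaStream(new Worker("filter.js"));
    filtered.addInput(sourceStream);        // processing lives in a worker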
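
Second sketch (DOM elements). The AudioNode draft routes media elements in via a media-element source node; a stream-based design would presumably let the element hand you its stream directly. The element.stream attribute below is my invention for illustration, not something either proposal promises.

    // AudioNode draft: route an <audio>/<video> element through the graph.
    var element = document.getElementById('player');
    var elementSource = ctx.createMediaElementSource(element);
    elementSource.connect(ctx.destination);

    // Hypothetical stream-based equivalent (invented name):
    var elementStream = element.stream;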
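
Third sketch (scheduling). This is against the AudioNode draft as it stands (noteOn/noteOff), and I'm assuming whole-buffer looping, since subrange looping is exactly the kind of gap this question is probing.

    // A 10 ms noise burst fired on each gameplay event:
    function playBurst() {
      var burst = ctx.createBufferSource();
      burst.buffer = noiseBuffer;            // preloaded 10 ms AudioBuffer
      burst.connect(ctx.destination);
      burst.noteOn(0);                       // play immediately
    }

    // A clarinet note started at a scheduled time, looped, then stopped:
    var note = ctx.createBufferSource();
    note.buffer = clarinetBuffer;
    note.loop = true;                        // loops the whole buffer
    note.connect(ctx.destination);
    note.noteOn(ctx.currentTime + 0.5);      // half a second from now
    note.noteOff(ctx.currentTime + 3.0);     // arbitrary stop time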
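
Fourth sketch (numerous, short, overlapping sources). The AudioNode answer today is a disposable source node per voice, with overlap falling out of the graph's summing behavior; what the equivalent looks like inside synthesizer.js is precisely what I'd like to see spelled out.

    // One throwaway source node per voice; simultaneous voices just sum.
    function playVoice(buffer, when) {
      var voice = ctx.createBufferSource();
      voice.buffer = buffer;
      voice.connect(ctx.destination);
      voice.noteOn(when);
    }

    // Sixteen overlapping drum hits, 50 ms apart:
    for (var i = 0; i < 16; i++) {
      playVoice(drumHit, ctx.currentTime + i * 0.05);
    }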
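
Fifth sketch (common transformations). Here is how the AudioNode draft spells the examples I listed; a competing proposal ought to be able to show the same four things (rate adjustment, envelope, filter, mix) about this compactly.

    var src = ctx.createBufferSource();
    src.buffer = sample;
    src.playbackRate.value = 0.5;            // resampling / pitch adjustment

    var env = ctx.createGainNode();          // amplitude envelope via automation
    var t = ctx.currentTime;
    env.gain.setValueAtTime(0, t);
    env.gain.linearRampToValueAtTime(1, t + 0.02);   // 20 ms attack
    env.gain.linearRampToValueAtTime(0, t + 1.0);    // 1 s decay

    var filter = ctx.createBiquadFilter();   // lowpass by default
    filter.frequency.value = 2000;

    src.connect(env);
    env.connect(filter);
    // Mixing is implicit: several outputs connected to one input are summed.
    filter.connect(ctx.destination);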

My opinion is that code snippets of the type given in Rob's proposal are very useful as discussion points and drivers for choices, and for the structural issues we are dealing with they may well be more useful than completely built-out working examples. So my suggestion on a roadmap is as follows.

0. Decide whether this group will be taking on the larger domain of general temporal media graphs, or agree to defer the question.
1. Build out a concrete API definition for the MediaStreamAPI proposal, somewhat comparable to the AudioNode API.
2. Augment the use cases #1-14 from Rob's proposal with additional use cases already discussed in this group and addressed by the AudioNode API, and filter out use cases that are deemed out of scope by the chair.
3. Have each API proposal include code snippets supporting each use case, like those already in the MediaStreamAPI proposal.
4. Solicit opinions on the strengths and weaknesses of the proposals, using the use cases and snippets as artifacts.
5. Take the best of both approaches. Where the proposals are roughly equivalent, stay ahead of the game by retaining the approach that is already implemented.
6. Make a final decision on the Stream/AudioNode identification question if the conclusion hasn't already become obvious.


... .  .    .       Joe

Joe Berkovitz
President
Noteflight LLC
84 Hamilton St, Cambridge, MA 02139
phone: +1 978 314 6271
www.noteflight.com
