- From: Joseph Berkovitz <joe@noteflight.com>
- Date: Wed, 15 Dec 2010 20:32:05 -0500
- To: srikumarks@gmail.com
- Cc: Doug Schepers <schepers@w3.org>, "Cutler, Roger (RogerCutler)" <RogerCutler@chevron.com>, Michael Good <musicxml@gmail.com>, public-xg-audio@w3.org
- Message-Id: <DCB20E98-239E-4145-94BD-F420C44FED99@noteflight.com>
First, thanks to Doug for pointing out that there's much room for light as well as heat here. I think we all share many common interests, and there are multiple ways forward from the current state of affairs. I am sure we'll find places to discuss all of these paths over time. The discussion of semantic music representation is very interesting to me, but as I've said, I consider any new standard in this area a hugely ambitious project in its own right.

Now on to Kumar's likewise very practical post concerning MIDI+DLS support, which is a very relevant question for the group. My reason for being in this group is to make sure that the audio API provides exactly what notation applications need, and I've looked at this exact question. I am familiar with the innards of commercial notation tools that generate sound, having created one such product (Noteflight) and studied quite a few others.

On joining the group as an invited expert, I immediately looked at whether Chris's proposed API could support a minimal but feasible approach to music synthesis -- basically a simple wavetable synthesizer working off a set of pitch-shifted instrumental sample loops, with programmatically determined amplitude envelopes. The answer was basically yes, resulting in the playable music-notation example at http://slice.noteflight.com/audioXG (this requires a WebAudio build of Safari). The audio in this prototype is fully synthesized within the program from dynamically downloaded samples, without the aid of MIDI or DLS. The graphics are all rendered in SVG, without the need of a specialized music font.

My conclusion from this experiment (and borne out by other work on other platforms) is that MIDI+DLS support is definitely not necessary to get notation applications off the ground with the Audio API.
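The minimal wavetable approach described above rests on two small pieces of arithmetic: a playback-rate multiplier that pitch-shifts a recorded sample, and an amplitude envelope that shapes each note. The sketch below illustrates both in plain JavaScript. The helper names are invented for illustration, and the Web Audio node names mentioned in the comments follow Chris's proposal as of this writing and may change; this is a sketch of the technique, not Noteflight's actual code.

```javascript
// Playback-rate multiplier that shifts a recorded sample by `semitones`
// in equal temperament: one octave (12 semitones) doubles the rate.
// In the proposed API this value would be assigned to a buffer source
// node's playbackRate parameter.
function playbackRateForSemitones(semitones) {
  return Math.pow(2, semitones / 12);
}

// A simple linear attack/release amplitude envelope, sampled at `steps`
// points over `duration` seconds. In the proposed API these values
// would be scheduled onto a gain node's gain parameter over time.
function linearEnvelope(attack, release, duration, steps) {
  const env = [];
  for (let i = 0; i < steps; i++) {
    const t = (i / (steps - 1)) * duration;
    let gain = 1;
    if (t < attack) {
      gain = t / attack;               // ramp up during the attack
    } else if (t > duration - release) {
      gain = (duration - t) / release; // ramp down during the release
    }
    env.push(Math.max(0, gain));
  }
  return env;
}
```

Under the proposed modular API, a note would then be played by connecting a source node (with the computed playback rate) through a gain node (carrying the envelope) to the destination, with no per-sample buffer processing in application code.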
What is necessary is a high-level modular API of the type that Chris Rogers is proposing, since it makes it easy to create musical sound output without becoming entangled in low-level buffer-processing details. Moreover, it is not necessarily the case that notation tools use MIDI internally to represent music performance. MIDI is more of an output-stage format, invaluable for interfacing with external instruments, or with the many pluggable soft synths designed around MIDI (e.g. Garritan).

I hope no one thinks I'm saying that MIDI and DLS are not important, or that they aren't worth supporting directly in the Audio API *at some point*. Browser MIDI support in the long term is a must in my opinion, and DLS is certainly worth looking at (though the DLS standard is quite hellish to implement). What I am saying is this: MIDI and DLS are not required out of the gate to support notation apps in HTML5, provided that we have a high-level audio API. So I respectfully propose that we not stop the train at this time to take them on board. The same goes for web music fonts. Nice to have? Yes, possibly, depending on how they are defined. Necessary? Definitely not.

Best,
... .  .    .    Joe

Joe Berkovitz
President
Noteflight LLC
84 Hamilton St, Cambridge, MA 02139
phone: +1 978 314 6271
www.noteflight.com

On Dec 15, 2010, at 7:14 PM, Kumar wrote:

> Thanks. This does clear the air.
>
> The energies of the audio API group, to me, seem best spent in the
> area of making sure notation *with* audio rendering is possible. And
> for that, the audio API ought to aim to support what programs like
> Sibelius *require* of the audio sub-system, with other W3C standards
> supporting the visual part.
>
> For those interested in conventional staff notation: if it is adequate
> for you if, say, Google adds notation tools to Google Docs in the
> future but might not be able to render it as audio, then no
> fundamentally new additions to W3C client-side technology seem needed ...
> *save*, maybe, for the addition of a standard music font as a "web
> font". (Maybe that can be pursued independent of the audio WG.)
>
> So "do we need MIDI+DLS support in the audio API?" ought to be an
> in-scope question for this WG, given that notation programs are likely
> to rely on these technologies to translate notation into sound -- at
> least for pre-flighting if not for production. People familiar with
> the innards of commercial notation tools that also generate sound can
> weigh in here. (*)
>
> Regards,
> -Kumar
>
> (*) I'm familiar with the technology at the broad level and can make
> guesses like suggesting DLS+MIDI, but I wouldn't know whether a
> program like Sibelius prefers to go the whole nine yards with its own
> synthesizers or use DLS+MIDI underneath.
>
> On Thu, Dec 16, 2010 at 7:12 AM, Doug Schepers <schepers@w3.org> wrote:
>> Hi, folks-
>>
>> I think this thread has drifted away from productive technical
>> discussion into less harmonious exchanges. I think it would be
>> valuable for everyone to step back and look for common ground,
>> understanding each other's positions.
>>
>> Everyone agrees that a common music notation system, one which can
>> satisfy the richness of international music traditions, would be a
>> good thing.
>>
>> Roger, I think it's reasonable that Michael Good, who has been
>> working on MusicXML for several years and has had good success with
>> its deployment as an open format, would not take too kindly to
>> criticisms of it that he feels aren't justified. He is very familiar
>> with the current marketplace, and in fact makes his living from
>> revising and adapting MusicXML; I think we can trust his expertise
>> there, and recognize that he doesn't want anything to undermine the
>> success of MusicXML. Just because some authoring tools don't give
>> you the results you want doesn't mean that the underlying MusicXML
>> format can't support them.
>> And while there are many places where sheet music is prevalent, that
>> doesn't necessarily immediately equate to a large market for
>> web-based music notation; this isn't the sheet-music heyday of the
>> early 1900s; most music now is electronic audio, not notation... but
>> I'm not dismissing the usefulness, just tempering the claim.
>>
>> Michael, I think it's fair that people are approaching you with use
>> cases they've found where MusicXML doesn't meet their needs, even if
>> it's the fault of authoring tools or other infrastructure and not
>> MusicXML as a format; unifying the market is often done more
>> effectively by a large organization. And the market itself may be
>> changing; if music notation is supported natively in browsers (which
>> seems realistic, given the new emphasis on audio and multimedia, and
>> the increasing use of ABC-notation script libraries), you have a
>> whole new kind of user agent, which may well alter the market. I
>> wouldn't assume that things are going to continue as they have been,
>> with niche uses of music notation, and the only browser rendering
>> done through script libraries and plug-ins. I understand that you
>> want to keep making money from your stewardship of MusicXML, and
>> that you genuinely believe it is the most functional technical
>> approach to music notation available. But you shouldn't be surprised
>> when people push back on what and where they can discuss music
>> notation formats. That said, I'm glad you've provided what seems
>> like a good and open format, and are committed to evolving it; I
>> think you deserve to be paid for what you're doing.
>>
>> So... I hope that clears the air rather than muddying the waters.
>> I am personally interested in seeing more technical discussion on
>> this list of use cases and requirements for music notation formats,
>> and comparisons and critiques of existing formats, in the spirit of
>> advancing the state of the art, regardless of whether W3C is where
>> that happens. (Though, as Michael knows, my preference is that it
>> would ultimately happen at W3C.)
>>
>> Regards-
>> -Doug Schepers
>> W3C Team Contact, SVG, WebApps, and Web Events WGs
Received on Thursday, 16 December 2010 01:32:44 UTC