- From: Joe Berkovitz <joe@noteflight.com>
- Date: Fri, 24 Mar 2017 12:29:06 -0400
- To: James Ingram <j.ingram@netcologne.de>, public-music-notation@w3.org
- Message-ID: <CA+ojG-YacPgT5AP2soL0GqA997-ZNyo+m6JhzybMvV9KkuioDg@mail.gmail.com>
James,

It probably feels redundant, but could you please post this to public-music-notation-contrib@w3.org? That is the place to post all contributions of IP -- in effect, by posting there, you are indicating that this IP is covered by the W3C Patent Policy that we all agreed to as CG members. The public-music-notation list (this one) is for informational notices and questions about the group, not for spec development.

Best,
. . . . . ...Joe

Joe Berkovitz
Founder
Noteflight LLC
49R Day Street
Somerville MA 02144
USA
"Bring music to life"
www.noteflight.com

On Fri, Mar 24, 2017 at 5:58 AM, James Ingram <j.ingram@netcologne.de> wrote:

> Hi all,
>
> I very much enjoyed the last face-to-face meeting, and will be there again on April 7th.
>
> I've been making steady conceptual and practical progress with my SVG+MIDI project during the past year, and have just uploaded a new version of my experimental *Assistant Performer* to
> http://james-ingram-act-two.de/open-source/assistantPerformer/assistantPerformer.html
>
> This version uses a new, more generalized MIDI definition for the event symbols, an enhanced speed control, and a new "conducting" mode -- in which mechanical time (JavaScript's performance.now()) is replaced by the live conductor's now(). The relation between what the conductor does and how it affects playback is programmable, pointing the way to the use of touch-screen gestures and other methods for performing symbolically notated scores.
>
> Below is a proposal that I'd like to put on the agenda somehow, but first I'd like to thank Daniel Spreadbury, Andrew Hankinson and Craig Sapp:
>
> 1. Daniel, for his patience during an email exchange last October, in which he convinced me that I should separate the container (SVG) and MIDI levels more clearly in my mind, and rethink the way I code MIDI data inside event symbols. My applications now use a format that can store/send *any* MIDI data, not just a small subset. And I now realise that the MIDI definition could be used in *any* XML that has event symbols, not just in SVG -- for example, in MNX or MEI.
>
> 2. Andrew, for pointing me at the Verovio team (I now watch both the MEI and the Verovio GitHub repositories), and Craig Sapp, for an exchange that reminded me that the *Assistant Performer* could quite easily change speed at runtime, not just globally. While implementing this capability, I realized that I could replace performance.now() completely, thus putting the time-control back where it belongs: in the live performer's hand and mind.
>
> *A proposal*: My current definition of the temporal meaning of an event symbol is quite small and straightforward, and I would like to review its details with other programmers interested in this area -- especially those whose software exports SVG, but not only those: I'd also welcome input from anyone with expert knowledge of MIDI and/or XML. It would be really great if we could arrive at an agreed, optimised, standard definition.
> I've put the details in a GitHub issue here:
> https://github.com/notator/Moritz/issues/3
>
> Looking forward to seeing you all again in Frankfurt!
>
> Best wishes,
> James
>
> --
> http://james-ingram-act-two.de
> https://github.com/notator
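[Editor's note: the quoted message describes replacing the mechanical clock (performance.now()) with a programmable conductor-driven now(). A minimal sketch of that idea is a playback scheduler parameterized over its clock, so either time source can be plugged in. All names below (machineClock, makeConductorClock, dueEvents) are illustrative assumptions, not the Assistant Performer's actual API.]

```javascript
// A "clock" is just a function returning a time position in milliseconds.

// Mechanical time: performance.now() where available, Date.now() otherwise.
function machineClock() {
  return (typeof performance !== "undefined" ? performance : Date).now();
}

// Conductor time: the position advances only when the conductor's gesture
// (a "beat") sets a new score position, so the performer controls the flow.
function makeConductorClock() {
  let scoreTimeMs = 0;
  return {
    now: () => scoreTimeMs,
    beat: (newScoreTimeMs) => { scoreTimeMs = newScoreTimeMs; }
  };
}

// Event symbols whose onset time has been reached by the given now()
// are due for playback, regardless of which clock supplies now().
function dueEvents(events, now) {
  return events.filter((e) => e.onsetMs <= now());
}
```

Because the scheduler only ever calls now(), the same event-symbol definitions play back under mechanical time or under any programmable mapping from gestures to time, which is the generalization the message points to.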
Received on Friday, 24 March 2017 16:29:48 UTC