Re: Graphical Scores and SVG - And MIDI

I've been idly watching the flurry of messages recently, looking for 
mention of MIDI.  While various graphic strategies and time-based 
container issues seem to be getting good attention, I hope that back in 
the pitch world NOTHING is based on MIDI.  Remember that the 128 values 
of a single 7-bit byte were a dinosaur-age construct, limiting the world 
to just that many notes in just those octaves.

For some music it is critical that a C# is not the same as a Db, and for 
much music it is important that a C# from one sounding set be different 
from a C# from a different sounding set.
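
For example (MusicXML is used here purely as an illustration of a 
spelling-aware encoding; with middle C as note 60, MIDI collapses both 
spellings below into the single note number 61):

    <pitch><step>C</step><alter>1</alter><octave>4</octave></pitch>   <!-- C#4 -->
    <pitch><step>D</step><alter>-1</alter><octave>4</octave></pitch>  <!-- Db4 -->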

So, not sure if this needs to be raised -- it may have been addressed 
long ago -- but NOTHING should be based on a series of 128 note 
numbers.  Probably nothing should be related to MIDI at all.  Let the 
programs translate from their internal structures into MIDI as needed, 
but no more.

Will Copper

On 4/2/2017 3:44 PM, Pavel Studeny wrote:
> Hi,
>
> As background: I'm about to cancel my music notation Wikipedia 
> project, because I couldn't find partners for it. I can therefore 
> share some of my findings from it. Most importantly, HTML seems to be 
> a much better basis for interactive music notation than SVG. 
> Generally, SVG audio features should map to existing HTML features 
> when possible. It's unfortunate that HTML5 neglects MIDI and that the 
> W3C Audio Working Group distinguishes between MIDI and sample-based 
> formats like MP3.
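>
> As a small illustration of that gap (just a sketch; the file names are 
> placeholders):
>
>     <!-- Sample-based audio plays with the built-in element. -->
>     <audio src="ThisPiece.mp3" controls></audio>
>
>     <!-- There is no MIDI equivalent: browsers don't ship a synthesizer 
>          for it, and the Web MIDI API only exposes device I/O. -->
>     <audio src="ThisPiece.midi" controls></audio>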
>
>
> When it comes to connecting SVG with audio, there are a few use cases, 
> and it's possible that not all of them will be solved by the same 
> approach:
>
> 1. Music notation editing: play a tone when a new note appears - this 
> is probably out of scope for this question.
>
> 2. Synchronization driven by the visual data: most likely a MIDI file 
> paired with the SVG, during playback of the whole SVG or a part of it 
> (by time, instrument part, etc.).
>
> 3. Synchronization driven by the audio data: most likely an MP3 or 
> similar format paired with the SVG, much like video subtitles 
> (sketched below).
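>
> (Purely as a sketch of the subtitle analogy, not a proposal: a 
> WebVTT-style cue could map a time range in the audio to an SVG element 
> id, assuming the cue payload were allowed to carry such an id.)
>
>     WEBVTT
>
>     00:01:22.000 --> 00:01:24.000
>     ThisPiece.svg#m21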
>
>
> If use case 1 were included in this scope, it would be best solved by 
> A (interspersed data), which, on the other hand, would be unsuitable 
> for use case 3, where C (a separate mapping) would be the most 
> appropriate.
> B (interspersed references) looks like the option that would more or 
> less suffice for most of the use cases. I see it more as "synchronize 
> with something" rather than "play audio from here to there", where 
> "something" could also be a video or other media for which time 
> references make sense.
>
>
> Pavel
>
>
>
> On 2017-03-29 17:58, Joe Berkovitz wrote:
>> Hi all,
>>
>> I think this is a good moment to talk in more detail about the potential
>> uses of SVG in the work of this group since we've had a fair bit of
>> activity on that topic.
>>
>> To that end, I've just had a very useful exchange with W3C's Chris
>> Lilley, who is copied on this email. Chris is a computer scientist who
>> originally chaired the W3C SVG Working Group when it began in 1998, and
>> saw SVG through all the way from an abstract idea to its modern
>> realization. So it's fair to say he's been observing and thinking about
>> its uses for a very long time, with an expert's perspective. Chris is
>> also a member of the W3C Web Audio Working Group which is responsible
>> for both audio and MIDI standards on the Web.
>>
>> What I'll present here is my current thinking, informed by some
>> thoughtful points made by Chris -- who I'm encouraging to jump in with
>> his own words.
>>
>> Let me say first that I see at least two potential uses for SVG in our
>> community group, and they seem to harmonize perfectly:
>>
>> 1. SVG (with some relationship to sound) can represent musical scores
>> with arbitrary visual and sonic content. Thanks to James Ingram for
>> highlighting this particular case to the group.
>>
>> 2. SVG (with some relationship to sound) can serve as an intermediate
>> format that represents *a specific rendering* of semantic music notation
>> markup such as MNX, MusicXML, MEI, or anything else.
>>
>> So far a lot of discussion has revolved around #1, but #2 is at least as
>> significant. Consider that it permits semantic markup to be converted
>> into a format that can be easily displayed and played back by an
>> application that is much simpler than one that processes semantic
>> markup. Of course, that format is necessarily quite limited in other
>> ways (try transposing it to another key, or reformatting it for a
>> different display size!)  But, as a final-delivery mechanism, #2 seems
>> to have a lot of merit. It could provide a standardized way to package
>> the output of notation renderers, regardless of what markup language
>> they use. In fact, MathML (a semantic markup language for math formulas)
>> is routinely converted to SVG by various standard libraries for exactly
>> this purpose.
>>
>> Now: I believe we don't need to get into a big debate about which use is
>> more important. They both are. Also, neither use eliminates our need for
>> a semantic markup language within the confines of some known musical
>> idiom, so there's no need to stop that train from leaving the station.
>> MNX explicitly makes room for graphical encodings to be placed within it.
>>
>> Relative to SVG, then, the key question is: What's the best way to
>> augment an SVG document with information on sonic performance? There are
>> multiple ways to do it. Chris and I discussed several very high-level
>> approaches:
>>
>> A. Intersperse performance data (e.g. individual MIDI or note events)
>> throughout an SVG file. James's proposed approach follows this pattern:
>> MIDI codes are sprinkled directly into the file and attached to
>> graphical elements. One could also specify events by some other means,
>> such as the representation that MNX proposes.
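>>
>> As a purely hypothetical sketch (the attribute names and ids here are
>> invented for illustration -- this is neither James's actual markup nor
>> MNX syntax), an interspersed encoding might attach note data directly
>> to the graphical element that draws the note:
>>     <g class="note" data-pitch="61" data-onset-ms="1500" data-duration-ms="750">
>>       <use xlink:href="#notehead" x="212" y="440"/>
>>     </g>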
>>
>> B. Intersperse *references* to a separate performance file (e.g. a
>> Standard MIDI file, MP3 audio file) throughout an SVG file. In this
>> approach, SVG elements are tagged with simpler data that "points to"
>> some time or time range within a separate document. MEI uses this
>> technique in places. Example:
>>     <measure sound-ref="ThisPiece.midi" sound-start="1:22" sound-end="1:24">...
>>
>> C. Create a completely separate mapping file that identifies a
>> correspondence between SVG and a performance file. Such a file might
>> contain entries like this:
>>     <measure-mapping graphics-ref="ThisPiece.svg#m21"
>>                      sound-ref="ThisPiece.midi"
>>                      sound-start="1:22" sound-end="1:24"/>
>>
>> I do not think there is a clear winner among these, and I don't think we
>> should immediately get into the weeds. The next step in this discussion
>> -- when we have it -- is to look at the pros and cons of these various
>> approaches for uniting graphical and sonic information. Each has
>> advantages and disadvantages, and they need to be brought to the surface
>> in what will be a lengthy discussion of its own. All of the above
>> techniques have been tried in different contexts and there are definite
>> lessons to be learned.
>>
>> As a corollary: let's stop debating the importance of pure graphical
>> music encoding. There is no need for a debate: we agree that it *is*
>> important. However, its role and its structure do not need to be settled
>> in advance of our work to move ahead on CWMN semantic music encoding. We
>> will need time to tease out the answers to the questions raised above.
>>
>> Finally, a word on Frankfurt: the co-chairs plan to devote a limited
>> period of time to discussing this topic, but it will certainly be
>> smaller than many would like (myself included). We are limited by the
>> other big things on the agenda. But, in truth, most of the good stuff in
>> standards groups happens on email and GitHub over time, not in large
>> in-person meetings!
>>
>> Best,
>> .            .       .    .  . ...Joe
>>
>> Joe Berkovitz
>> Founder
>> Noteflight LLC
>>
>> 49R Day Street
>> Somerville MA 02144
>> USA
>>
>> "Bring music to life"
>> www.noteflight.com
>
>

Received on Monday, 3 April 2017 00:55:06 UTC