Re: MusicXML and MIDI

The technology we adopted to create MEI customizations allows projects
to specify the subset of MEI they're using, as well as to tighten
certain aspects of the format, as Johannes said. These customizations can
also be thought of as application profiles, i.e. descriptions of what MEI a
program will be able to consume or output.
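
To give a rough idea of what such a customization looks like, here is a
minimal ODD sketch that pulls in the CMN module and simply leaves out the
mensural and neume ones. Treat the module identifiers and the overall shape
as illustrative rather than copied from a real customization file:

<!-- illustrative sketch of an MEI customization, not a real ODD file -->
<schemaSpec xmlns="http://www.tei-c.org/ns/1.0" ident="myProject-cmn" start="mei">
  <!-- core modules -->
  <moduleRef key="MEI"/>
  <moduleRef key="MEI.shared"/>
  <moduleRef key="MEI.header"/>
  <!-- restrict the repertoire to Common Music Notation -->
  <moduleRef key="MEI.cmn"/>
  <!-- mensural and neume modules are simply not referenced, so their
       elements are invalid against the schema generated from this spec -->
</schemaSpec>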

In case it's helpful, here is a white paper I wrote for the MEI Advisory
Board back in 2010 detailing the customization process:
https://docs.google.com/document/d/1dZCTRnNU2yO6g_bMtwctnNUqvfgUL7ZOWrc8Oqefb7o/edit?pli=1

I think Zoltan will be talking a bit more about application profiling at
his poster at MEC.

Best,
Raff

On Mon, Apr 11, 2016 at 5:33 PM, Joe Berkovitz <joe@noteflight.com> wrote:

> Hi Johannes,
>
> Thanks for the detailed answer, which makes sense. We obviously believe
> that profiles, whatever they are called, are essential to any framework or
> language allowing a full spectrum of expressivity and consistency.
>
> At the W3C meeting in Frankfurt we discussed that encoding profiles may
> amount to more than simply support for a repertoire or subset of schema
> elements -- they may also need to embody constraints that are not easily
> expressible in a schema language, such as consistency between a measure's
> time signature and the contents of its voices or layers (to take a very
> CMN-specific example).
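>
> Purely as a sketch of what such an extra-schema check might look like (my
> own illustration, not an agreed rule; it naively assumes a fixed 4/4 meter
> and plain, undotted note values), a Schematron pattern along these lines
> could catch the simplest mismatches:
>
> <schema xmlns="http://purl.oclc.org/dsdl/schematron" queryBinding="xslt2">
>   <ns prefix="mei" uri="http://www.music-encoding.org/ns/mei"/>
>   <pattern>
>     <rule context="mei:layer">
>       <!-- naive check: in 4/4, the reciprocals of @dur should sum to at
>            most 1; a real rule would resolve the governing scoreDef and
>            handle dots, tuplets, grace notes, and so on -->
>       <assert test="sum(for $d in mei:note/@dur return 1 div number($d)) le 1">
>         The note durations in this layer add up to more than the notated measure.
>       </assert>
>     </rule>
>   </pattern>
> </schema>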
>
> I look forward to attending MEC and hope that I'll get a chance to
> participate in some of these discussions.
>
> Best,
>
> .            .       .    .  . ...Joe
>
> Joe Berkovitz
> President
> Noteflight LLC
>
> +1 978 314 6271
>
> 49R Day Street
> Somerville MA 02144
> USA
>
> "Bring music to life"
> www.noteflight.com
>
> On Fri, Apr 8, 2016 at 1:31 AM, Johannes Kepper <kepper@edirom.de> wrote:
>
>> Hi Joe,
>>
>>> Zoltan,
>>>
>>> Is this notion of profiles literally a per-project matter?
>>
>> Not necessarily. There are generic profiles (we call them customizations)
>> which restrict MEI to support only specific repertoires like CMN (which,
>> for instance, requires the use of measures, something that wouldn't be
>> allowed for mensural notation). In essence, they identify a subset to be used.
>> Individual projects, however, may use the very same mechanism to describe
>> their specific use of MEI, that is, the subset of the generic subset
>> they're actually using. This way, they're documenting their use of the
>> format (which is more of a framework and really expects such customized
>> use) and helping to manage expectations for other consumers of their data.
>> Likewise, applications may use this mechanism to define the subset of MEI
>> they're able to understand, and by doing so can provide a schema that can
>> be used as an 'input filter', so users of the application can see
>> beforehand which parts of their data will cause problems with this
>> particular app.
>>
>> That said, our generic profiles are not very 'narrow' yet, and there
>> have been discussions amongst the MEI developers about specifying some
>> 'tighter' profiles, which may support interchange both within MEI and with
>> other formats. While this is still pretty much in the open, I expect the
>> topic of more restricted profiles to be discussed at the Music Encoding
>> Conference in Montréal in May, for instance.
>>
>> Hope this helps,
>> Johannes
>>
>> On Wed, Apr 6, 2016 at 1:34 PM, Zoltan Komives <
>> zoltan.komives@tido-music.com> wrote:
>>
>>> Great summary by Andrew.
>>>
>>> Joe also asks: "*which timestamps can be relied on to be present in
>>> various situations?*"
>>>
>>> By default, most attributes in MEI are optional. Hence the recommended
>>> practice that every project use a customization, in other words a
>>> restricted schema that specifies which attributes/elements are required.
>>> We sometimes call such a customization an *application profile*, because
>>> it describes the input interface of the application.
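>>>
>>> To illustrate the idea (only a sketch: the class and module identifiers
>>> below are placeholders, not necessarily the names used in the actual MEI
>>> sources), an ODD customization can flip an attribute from optional to
>>> required roughly like this:
>>>
>>> <!-- sketch: make @dur.ges mandatory wherever its attribute class applies -->
>>> <classSpec xmlns="http://www.tei-c.org/ns/1.0"
>>>            ident="att.duration.performed" module="MEI.shared" mode="change">
>>>   <attList>
>>>     <attDef ident="dur.ges" mode="change" usage="req"/>
>>>   </attList>
>>> </classSpec>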
>>>
>>> Zoltan
>>>
>>>
>>> On Wed, Apr 6, 2016 at 5:54 PM, Andrew Hankinson <
>>> andrew.hankinson@gmail.com> wrote:
>>>
>>>> ...snip...
>>>>
>>>> > I note that MEI also addresses this question by supplying a variety of
>>>> > different timestamps for musical events, including both metrical (notated)
>>>> > time and other assorted time bases, some tick-based (as with MIDI) and some
>>>> > millisecond-based. However, it is not clear to me how these time bases are
>>>> > reconciled or what conventions exist for knowing which timestamps can be
>>>> > relied on to be present in various situations. Perhaps someone from the MEI
>>>> > community can speak to this.
>>>>
>>>> MEI treats durations and timestamps differently, and makes the
>>>> distinction between logical and gestural durations.
>>>>
>>>> <note dur="4" dur.ges="256p" />
>>>>
>>>> This would indicate a quarter note with a gestural (performed) duration of
>>>> 256 MIDI pulses, given a resolution of 256 pulses per quarter note. A
>>>> 'swung' quarter might then be:
>>>>
>>>> <note dur="4" dur.ges="264p" />
>>>>
>>>> Gestural durations can also be expressed in beats or seconds, so
>>>> dur.ges="3s" or dur.ges="1.5b" would also be valid.
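>>>>
>>>> For example (an illustrative snippet rather than a quotation from a real
>>>> encoding), a quarter note held for roughly three seconds, say under a
>>>> fermata, could be written as:
>>>>
>>>> <note dur="4" dur.ges="3s" />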
>>>>
>>>> MEI relies on validation to ensure that attributes have proper values, and
>>>> these values are constrained by regexes or built-in XML data types. So, for
>>>> example, the "dur.ges" attribute is defined as taking the data type
>>>> "data.DURATION.gestural":
>>>>
>>>>
>>>> https://github.com/music-encoding/music-encoding/blob/develop/source/specs/mei-source.xml#L4587
>>>>
>>>> and this duration is composed of sub-types, validated by regex:
>>>>
>>>>
>>>> https://github.com/music-encoding/music-encoding/blob/develop/source/specs/mei-source.xml#L875
>>>> (and following).
>>>>
>>>> If a user or piece of software writes dur.ges="4", this will fail validation,
>>>> because the regex specifies that gestural durations must be a number with an
>>>> associated unit: PPQ (p), beats (b), seconds (s), or Humdrum beat
>>>> proportions (r). This is all built into the schema, so one need only run a
>>>> given encoding against the RelaxNG schema with any XML validator (e.g.,
>>>> xmllint) to determine whether or not a given duration is valid.
>>>>
>>>> For timestamps, we express the passage of musical time in beats and
>>>> measures, so an object with "@tstamp=1m+2" would mean that that particular
>>>> object begins at an offset of one measure and two beats. Often you will see
>>>> "@tstamp" and "@tstamp2" together, which allows an object (e.g., a hairpin)
>>>> to span a duration.
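>>>>
>>>> A crescendo hairpin starting on the first beat of one measure and ending
>>>> on the second beat of the next might look roughly like this (a sketch
>>>> with plausible attribute values, not a quotation from a real encoding):
>>>>
>>>> <hairpin form="cres" staff="1" tstamp="1" tstamp2="1m+2" />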
>>>>
>>>> -Andrew
>>>>
>>>>
>>>>
>>>
>>> www.tido-music.com
>>>
>>>
>>
>>
>

Received on Monday, 11 April 2016 21:59:30 UTC