NAMM 2018 Meeting Minutes [via Music Notation Community Group]

The W3C Music Notation Community Group met in the TEC Tracks Meetup space in the Hilton Anaheim (Level 3, Room 7) during the 2018 NAMM trade show, on Friday, January 26, 2018 between 10:30 am and 12:00 noon.

The meeting was chaired by CG co-chairs Joe Berkovitz, Michael Good, and Daniel Spreadbury, and was attended by 20 members of the CG and interested guests. The handouts from the meeting can be found at

W3C MNCG NAMM 2018 Meeting Handout

Philip Rothman from the Scoring Notes blog recorded the meeting and has posted the video on YouTube. The video starting times for each part of the meeting are included in the headings below.

https://www.youtube.com/watch?v=GToEhCIBqhA&feature=youtu.be
Introduction to the W3C MNCG (Starts at 0:41)
Michael Good introduced the W3C Music Notation Community Group. This meeting was part of NAMM's TEC Tracks Meetup sessions, so several people attending were not members of the group.

Michael discussed the history of the group, its progress in 2017 in releasing MusicXML 3.1 as a Community Group Final Report, and its plans for 2018. The 2018 plans include work on the next-generation MNX project, as well as releasing a SMuFL update as a Community Group Final Report.
Group Introductions (Starts at 5:52)
We went around the room and each of the 20 attendees introduced themselves and their interest in the Music Notation Community Group. The attendees in order of their introduction on the video are:

  Daniel Spreadbury, Steinberg (co-chair)
  Jeff Kellem, Slanted Hall
  Kevin Weed, self
  Tom Nauman, Musicnotes
  Jon Higgins, Musicnotes
  Adrian Holovaty, Soundslice
  Derek Lee, Groove Freedom
  Philip Rothman, NYC Music Services
  Jeremy Sawruk, J.W. Pepper
  Bruce Nelson, Alfred
  Mark Adler, MakeMusic
  Steve Morell, NiceChart
  Jon Brantingham, Art of Composing Academy
  Evan Balster, imitone
  Fabrizio Ferrari, Virtual Sheet Music
  Simon Barkow-Oesterreicher, Forte Notation / Uberchord
  Chris Koszuta, Hal Leonard
  Doug LeBow, self
  Joe Berkovitz, Risible (co-chair)
  Michael Good, MakeMusic (co-chair)

These attendees covered a wide range of the music notation community. In addition to software developers, there were composers, performers, music preparers and engravers, publishers, and publication and production directors.
MNX (Starts at 21:00)
Joe Berkovitz led a discussion of the current status and future directions for the next-generation MNX project. Given the variety of attendees, Joe tried to balance the discussion between the perspectives of both developers and users of music notation standards.

Currently there are three parts of MNX:

  CWMNX is the most familiar part for conventional Western music notation. We can think of this as the next generation of MusicXML, and hope that it will take the place of what would have been MusicXML 4.0.
  GMNX, a general music notation format. This emerged from the group's discussions of how we could encode arbitrary music, not necessarily part of the Western music literature. There is a role for a literal format that encodes a linkage between arbitrary vector graphics and sound. Many applications for Western music notation could use it as well.
  The MNX Container covers the need to package an ensemble of files together in a way that reflects the needs of a compound document. This part is in the most primitive state right now and needs to be built out further.

Why Start Again and Work on MNX vs MusicXML? (Starts at 29:50)
MusicXML predates the Internet delivery of music; it was designed when print was still king. The MusicXML format includes several print-based assumptions, such as page breaks and credits (page-attached text), that cause problems for more flexible, mobile, and web-based ways of delivering music.

The success of MusicXML and the web has also created more music notation use cases that people want to address. A key one is for the model of the standard to be closer to the model that you would use for building an interactive notation program. Michael elaborated on why this was an explicit non-goal for MusicXML back in 2000, when MusicXML was trying to create a standard exchange format in the wake of unsuccessful prior efforts such as NIFF and SMDL.

Times have changed since then. We now have a product developer community that has seen the benefits of music notation exchange standards. We also have many more links to the music publisher community than what MusicXML had in 2000.
Where Are We Now? (Starts at 36:40)
We do not have very much yet for MNX. There is a draft specification, but it only covers perhaps 1/4 to 1/3 of what MusicXML does. There are no reference applications, there are not many examples, and there are lots of open issues.

The hope is to have a complete draft of the specification by the end of 2018, though that may be optimistic. At that point the vendor community will not be rushing to build MNX support, but we do expect to see experimental implementations. This is fine - if you don't have implementations, you don't learn.
Container Format (Starts at 41:17)
The MNX container format tries to do a better job of representing document hierarchies than MusicXML's opus document type, which nobody appears to be using. Another goal is to provide a more solid approach to metadata compared to what we have today in MusicXML. Different score types can be included in the container, including CWMNX, GMNX, and other score types such as neumes that might be developed in the future.

Michael asked about using a zip file as an alternative or supplement to the XML format container. Joe replied that zip is just one of many ways we could package an archive, and Michael will file an issue on this.
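Zip packaging has precedent in this group's own work: MusicXML's compressed .mxl format is a zip archive whose META-INF/container.xml manifest points at the root score file. A minimal Python sketch of that pattern follows; the file names and placeholder score content are illustrative, not MNX syntax.

```python
import zipfile
import xml.etree.ElementTree as ET

# A minimal manifest in the style of MusicXML's .mxl format: a
# META-INF/container.xml file points at the root file in the archive.
MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
<container>
  <rootfiles>
    <rootfile full-path="score.xml"/>
  </rootfiles>
</container>
"""

def write_container(path, score_xml):
    """Write a zip-based container holding a manifest and one score file."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("META-INF/container.xml", MANIFEST)
        z.writestr("score.xml", score_xml)

def read_root_path(path):
    """Return the full-path of the first rootfile named in the manifest."""
    with zipfile.ZipFile(path) as z:
        manifest = ET.fromstring(z.read("META-INF/container.xml"))
        return manifest.find("./rootfiles/rootfile").attrib["full-path"]
```

A compound MNX document could add further entries to the same archive, such as audio recordings, SVG pages, or per-part score files, with the manifest describing how they relate.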

Michael raised a second question about including digital rights management in the container format. Jeremy Sawruk replied that we should look at the HTML5 video debacle and not specify DRM ourselves. We should not preclude vendors adding DRM, but that should be done at the vendor level.

Doug LeBow raised an issue about being able to identify a creation and usage history for music within the metadata. In his experience with Disney, music gets repurposed and reused all the time, and people need to know where different parts came from. Joe suggested that Doug enter issues so that we can capture his knowledge of these use cases. Joe also mentioned that MNX intends for metadata to be present at any level in the document, not just at the score or collection level.
CWMNX Highlights (Starts at 50:35)
Sequences and directions are at the core of the new organization of musical material in CWMNX. In MusicXML you can hop back and forth between voices and times at will. CWMNX takes MusicXML's cursor approach to ordering music and makes it much more constrained.

In CWMNX, music from a single voice is arranged into a sequence of events, including rests, notes, and chords. Directions are elements that are not events. Unlike events, they can have their own offsets into the container they belong to. Dividing things into sequences and directions can make it easier to both encode and decode music notation. It provides a more natural mapping to data structures such as voices that are common among music notation applications.
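As a rough illustration of that organization (element names here echo the direction of the MNX draft but should be read as hypothetical, not final syntax):

```xml
<sequence>
  <directions>
    <!-- a direction: carries its own offset into the containing measure -->
    <dynamics offset="0">p</dynamics>
  </directions>
  <!-- events follow one another in time; no hopping back and forth -->
  <event value="/4"><note pitch="C4"/></event>
  <event value="/4"><rest/></event>
  <event value="/2"><note pitch="E4"/><note pitch="G4"/></event> <!-- a chord -->
</sequence>
```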

MNX tries to make a clear distinction between semantic markup, such as "a C4 quarter note," and presentation information. Presentation information could be left out and the application could still create readable music, though not necessarily looking as good as you might like. Examples of presentation information include fonts, changes from standard positioning, size, and color. Presentation information in CWMNX is referred to as styles, a clear reference to HTML styles and CSS.
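A hypothetical fragment may make that separation concrete; the style attribute syntax below is purely illustrative, echoing the CSS analogy:

```xml
<!-- Semantics alone: a C4 quarter note, renderable with default styling -->
<event value="/4"><note pitch="C4"/></event>

<!-- Same semantics plus optional presentation hints in a CSS-like style -->
<event value="/4"><note pitch="C4" style="color: #993333; stem-direction: up"/></event>
```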

A third category of data in CWMNX is interpretation. This is more general than MusicXML's sound element. Interpretation can specify that irrespective of what the semantics indicate, here is how some music should be played, using a MIDI-like description.

Michael added that MusicXML handles some of MNX's interpretation data not only with the sound element, but with pairs of elements that indicate what is played versus how the music looks. One example is using the tie element for playback and the tied element for appearance. These paired elements are a common source of confusion among MusicXML developers. MNX can offer a more systematic approach to addressing the same underlying problem.
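In MusicXML terms, the pairing looks like this for a quarter note that starts a tie: the tie element is a direct child of note and affects sound, while the tied element lives inside notations and affects appearance.

```xml
<note>
  <pitch><step>C</step><octave>4</octave></pitch>
  <duration>4</duration>
  <tie type="start"/>          <!-- playback: the sound continues -->
  <type>quarter</type>
  <notations>
    <tied type="start"/>       <!-- appearance: the tie curve is drawn -->
  </notations>
</note>
```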

CWMNX includes the concept of profiles. A "standard" profile would cover the great majority, but not everything, of what is in conventional Western music notation. Multi-metric music is one of the biggest examples of something that would be in CWMNX but might not be in the standard profile.

We want to support the concept of house styles in CWMNX. This includes font, distance, and other layout information that applies across an entire score. We want to easily substitute one style for another depending on context, enabling responsive styling for music notation.
CWMNX Discussion (Starts at 1:03:00)
Joe asked the group how far CWMNX should go in describing a normative style of positioning for conventional Western music notation. Should it try to do this at all, and if so, how far should it go? What would the benefits and drawbacks be?

Daniel Spreadbury said that if we go in this direction, then we have to specify norms, and specify them quite thoroughly. That will be difficult to do.

Kevin Weed asked what happens if we don't have these standards in MNX. What's the alternative? The alternative is what happens now, where each application decides for itself how to interpret the formatting.

Doug LeBow referred to orchestrator use cases where people split up between Finale and Sibelius to write a single cue under high time pressure, with different people writing for different instruments. Without standards for appearance between applications you would lose control over quality and stylistic consistency in the final music product.

Chris Koszuta said that Hal Leonard has been trying to get their digital files to the pristineness of the printed score. They have worked very hard to get to that point with MusicXML over the past several years, but are not quite there yet. To get the same control of the nuances in digital as you have in print, you need some agreed-upon standards. If not, when things fail and you have to go back to do additional work at the publisher, that's tens of thousands of files with all the time and money associated with that.

Hal Leonard has been converting into MusicXML over the past four years but still runs into customer problems because a digital service doesn't do something quite right yet. Customers really do notice these details. Chris hopes we can get to some level of agreement and control where it's fluid and things are fun, instead of being a lot of extra work to create the next step of interactive music notation. If we don't lock things down now, we will be fiddling with these details for years and years ahead.

Tom Nauman said that a lot of Musicnotes' use of MusicXML is inbound. Everything they import has to be tweaked to be satisfactory to the customer. Chris followed up that when Hal Leonard does content deals with partners, they don't want to provide messy files where the partner has to do extra work.

Daniel said that if we do encode positioning information, we have to lock it down and agree. It will take a long time, but if we don't do it and things aren't absolutely black and white, applications won't be predictable. In other aspects of MNX we are trying to have just one way to encode things, as with sequences. Positioning would be the same way.

Steve Morell raised the point that most developers focus on their MusicXML import, but MusicXML export has less attention paid to it. Is there any way to incentivize export as well as import quality? Doug agreed - there is so much back-and-forth exchange in today's workflows for musicians that both directions need to work equally well. Joe replied that when we have widely adopted, free, probably open source MNX viewers in browsers, that would provide an incentive to improve export.
GMNX (Starts at 1:16:42)
GMNX is a "graphics plus media" type of format. The notation is an SVG file. Musical sound or performance is either an audio file or a MIDI-like list of timed events. The time relationships can then be linked between the graphics and sound, and applications don't really need to know what the notation is. Many practice and performance applications don't need more than this.
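The core linkage can be modeled very simply: regions of the SVG score are associated with spans of time in the audio or event list, and a player highlights whichever region contains the current playback position. The Python sketch below illustrates the idea; the data shape and field names are hypothetical, not GMNX draft syntax.

```python
# Illustrative model of the GMNX idea: link regions of an SVG score to
# spans of time in a recording, with no knowledge of notation semantics.
from bisect import bisect_right

# (SVG element id, start seconds, end seconds), sorted by start time.
# These ids and times are made up for the example.
REGIONS = [
    ("measure-1", 0.0, 2.1),
    ("measure-2", 2.1, 4.3),
    ("measure-3", 4.3, 6.6),
]

def region_at(t):
    """Return the SVG id to highlight at playback time t, or None."""
    starts = [start for _, start, _ in REGIONS]
    i = bisect_right(starts, t) - 1          # last region starting at or before t
    if i >= 0 and t < REGIONS[i][2]:         # still inside that region's span?
        return REGIONS[i][0]
    return None
```

A practice application can drive score highlighting this way without ever parsing the notation itself, which is why GMNX can serve repertoire, such as graphic or aleatoric scores, that CWMNX cannot describe.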

Joe has made GMNX demos available online for examples from Fauré, Hebrew cantillation, and aleatoric music from Lutosławski. GMNX might even be applied sooner than CWMNX since it is much simpler.

Adrian Holovaty asked how we could get performance synchronization points from GMNX into CWMNX. The synchronization feature in GMNX would be useful for applications that do know the semantics of music notation. Joe asked Adrian to file an issue so we can address this.

Evan Balster asked a question about longer-term intent and if MNX was something that could be embedded within HTML browser documents in the future, like math and SVG. Joe replied that there will be a namespace immediately, and it could be viewable in a browser once there is a decent JavaScript library that supports it.
Conclusion (Starts at 1:22:30)
At this point we concluded the meeting. We had productive discussions and look forward to these conversations continuing. We hope to figure out a way to have these conversations more often than our once or twice a year meetings at NAMM and Musikmesse.



----------

This post sent on Music Notation Community Group



'NAMM 2018 Meeting Minutes'

https://www.w3.org/community/music-notation/2018/01/30/namm-2018-meeting-minutes/



Learn more about the Music Notation Community Group: 

https://www.w3.org/community/music-notation

Received on Tuesday, 30 January 2018 22:49:15 UTC