Musikmesse 2016 Meeting Minutes [via Music Notation Community Group]

W3C Music Notation Community Group meeting
The W3C Music Notation CG met in Genius/Logos (Hall 9.1) at Messe Frankfurt
during the 2016 Musikmesse trade show, on Friday 8 April 2016 between 2.30pm and
4.30pm.

The meeting was chaired by CG co-chairs Joe Berkovitz, Michael Good, and Daniel
Spreadbury, and was attended by about 40 members of the CG. A complete list of
the attendees can be found at the end of this report, and the slides presented
can be found here.
SMuFL 1.2 update
Daniel Spreadbury (Steinberg, CG co-chair) presented a brief summary of the
state of the SMuFL 1.2 development effort, with 30 issues currently open and an
expected delivery date of no later than the end of Q3 2016.

There were no substantive questions or discussion raised by this update.
MusicXML 3.1 update
Michael Good (MakeMusic, CG co-chair) presented a brief summary of the state of
the MusicXML 3.1 development effort, with 37 issues currently open and an
expected delivery date of no later than the end of Q3 2016. Michael also
explained the basic procedure of how issues will be resolved using the GitHub
issue/discussion/pull request workflow, and offered help on behalf of the
co-chairs to any member of the CG who is daunted by or has questions about this
workflow.

James Sutton (Dolphin Computing) expressed concern about the noisiness of the
emails generated by the GitHub issue/pull request workflow. He suggested that
the ideal solution would be to provide a series of opt-ins/opt-outs for
different kinds of automatic emails, if possible.

ACTION: The co-chairs agreed to investigate what possibilities might exist with
their contacts at the W3C.
User stories
Werner Wolff (Notengrafik Berlin) prefaced Joe’s presentation on how user
stories should inform the capabilities of the new notation representation by
asking how we as a CG should engage the wider music-writing community, and how
we can get to the core of what music notation really means.

James Ingram suggested that the requirements identified by the MPEG-sponsored
effort to define a new representation for music notation should be included in
our user stories.

ACTION: James Ingram to produce a link to the MPEG user stories.
What should the scope of the effort be?
Discussion followed on what kinds of musical works should be considered in
scope for the capabilities of a new representation format. Joe cited a couple
of examples, including George Crumb’s Makrokosmos (with its circular staves)
and Frédéric Chopin’s Prelude no. 15, the Raindrop Prelude (with note values
that appear to exceed the time signature), as works that might be sufficiently
complex that some aspects could be considered out of scope.

James Ingram and Werner Wolff were both of the opinion that scores of all
kinds should be representable in the standard. Zoltan Komives (Tido) argued
that if Chopin is considered out of scope, then the scope is certainly too
narrow. Christof Schardt (PriMus Software) argued that the current version of
MusicXML can represent the visual appearance of Chopin’s Raindrop Prelude
quite adequately, by reproducing the techniques that engraving programs like
Sibelius and Finale use to produce the desired appearance.

Werner Wolff raised the question of where graphical notation, as distinct from
conventional Western music notation (CWMN), can be considered to start. Does,
say, a heart-shaped notehead constitute graphical notation?

James Ingram suggested that it would be possible to use a combination of a
purely visual representation (e.g. SVG) and a purely aural/temporal one (e.g.
MIDI) to make it possible to capture these different semantic dimensions.

Ron Regev (Tonara) observed that there is a tension between making the
standard all-encompassing on the one hand and easy to work with on the other,
mirroring a point made in Joe’s slides: in general, the tighter the semantic
restrictions, the easier the format is to work with.
Encoding profiles
Joe presented the idea of encoding profiles for documents in the new notation
representation, as a means of expressing the intent behind the encoding and
informing a consuming application (and end user) what capabilities an
application must have to be able to work with that particular document.

Thomas Weber suggested that the “menu” approach taken by the various kinds
of Creative Commons license, presenting content creators with a set of checkbox
options for what kinds of uses are permitted and prohibited, might be an
approach to how an encoding profile could be made. Joe suggested that in fact
each individual profile might be more like one of the checkboxes in the CC
licensing set-up process.

Jan Rosseel (Scora) pointed out that if these profiles are going to work, they
will need to be enforced in the editing applications used to author the content
as well as in the documents themselves.

Zoltan Komives explained that profiles are a core part of the MEI framework,
where they are known as customisations, and summarised their role as a contract
between the producer of the data and the consumer of that data.
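
To make the idea concrete, a profile declaration might look something like the
fragment below. This is purely a hypothetical sketch: every element and
attribute name here is invented for illustration, and none of it is a proposal
from the CG.

```xml
<!-- Hypothetical sketch: a document declares the encoding profile it
     conforms to, so that a consuming application (and end user) can check
     up front whether it has the capabilities needed to work with it. -->
<score profile="cwmn-basic">
  <profile-requirements>
    <feature name="percussion-notation" required="no"/>
    <feature name="microtonal-accidentals" required="yes"/>
  </profile-requirements>
  <!-- ... score content ... -->
</score>
```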
Architectural suggestions
Joe went on to present some suggestions about how the new notation
representation could be architected, including the idea of cleanly separating
semantic, visual styling, and performance (or playback) data in a manner similar
to how the semantics and visual dimensions of web pages are separated into HTML
(semantic) and CSS (visual). He also proposed that following the DOM approach
makes creating interactive experiences driven from symbolic music
representations easy. Joe demonstrated this with a toy application that uses a
combination of MusicXML data, jQuery, HTML, CSS, and Noteflight’s embeddable
MusicXML renderer to produce a simple music theory quiz in a few dozen lines of
code.

A video of Joe’s demo is accessible at
https://www.youtube.com/watch?v=R6Bdm0H1VtA
with example source code at
https://gist.github.com/joeberkovitz/c3b37e3d818d7f4df26f11c53e8c8328.
Note that there is not yet any public, online version of the software shown
here.

Adrian Holovaty (Soundslice) asked whether the proposal for CSS-like description
of visual aspects of notation would actually use CSS or a new language. Joe
responded that it would make sense to borrow some of the CSS entities directly
(e.g. color) but that there would be a lot of work to do in defining entities
that make sense for music notation (e.g. dimensions might want to be expressed
in stave units rather than in, say, pixels or points).
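
As a minimal sketch of the kind of unit resolution such a stylesheet language
would need, the function below converts a dimension expressed in staff spaces
(stave units) into points once a staff size is chosen. The function name and
the example values are invented for illustration.

```python
# Hypothetical sketch: dimensions in a notation stylesheet are expressed in
# staff spaces and resolved to absolute units only once a staff size is known.

def staff_space_to_points(value_in_spaces: float, staff_height_pt: float) -> float:
    """Convert a length in staff spaces to points.

    A five-line staff spans four staff spaces, so one staff space is a
    quarter of the overall staff height.
    """
    return value_in_spaces * (staff_height_pt / 4.0)

# Example: a 0.12-space stem thickness on a staff roughly 19.84 pt (7 mm) tall.
stem_pt = staff_space_to_points(0.12, 19.84)
```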

Zoltan Komives commented that music notation is a means of describing art with
art, adding that Tido has found CSS to be insufficient to describe the visual
aspects of music notation, and has already made some progress in defining a new
language that attempts to do this. Joe asked Zoltan if he could share any
observations about the unsuitability of CSS.

ACTION: Zoltan to prepare some comments for the CG about Tido’s experiences
with using CSS to style music visually.

James Ingram presented the idea that the visual dimension can be thought of
purely in terms of space, and the performance or aural dimension can be thought
of purely in terms of time: in his view, everything is either time or space.
This seemed to be a controversial view among the attendees of the meeting, with
Alexander Plötz asking whether information about the forces required to perform
a work (e.g. labeling one of the staves as being played by a flute) would be
considered “space” or “time” in James’s division of responsibilities,
to which James replied that it would be “space.”

Thomas Weber commented that he felt it would be necessary to extend the DOM in
the same way that SVG has done in order to make possible the kinds of
high-level interactive experiences outlined by our user stories. In
particular, Thomas expressed concern about how to handle the complex
relationships between different entities: for example, editing the duration of
one note in a bar may well have consequences for other notes in the same bar,
and indeed in other bars. Music is not as cleanly hierarchical as other
standards or types of content. Michael suggested that XPath or other similar
technologies might be useful to help link separate entities together and move
towards solving this problem.
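
As an illustration of how XPath-style addressing could link related entities,
the sketch below uses Python’s ElementTree (which supports a subset of XPath)
to resolve a tie that crosses a barline. The element names and id scheme are
invented for the sake of the example; they are not part of any proposed format.

```python
# Hypothetical sketch: using XPath-style queries to link separate entities
# (here, the two halves of a tie in different measures) via id references.
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<score>
  <measure number="1">
    <note id="n1" pitch="C4" duration="4"/>
    <note id="n2" pitch="E4" duration="4" tie-to="n3"/>
  </measure>
  <measure number="2">
    <note id="n3" pitch="E4" duration="2"/>
  </measure>
</score>
""")

# All notes in measure 1:
first_bar = doc.findall("./measure[@number='1']/note")

# Resolve a cross-measure link: follow the tie from n2 to its target note.
tie_source = doc.find(".//note[@id='n2']")
tie_target = doc.find(".//note[@id='%s']" % tie_source.get("tie-to"))
```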

Adrian Holovaty expressed concern about using DOM programming to achieve these
interactive user stories because this approach implies that the music notation
representation is transmitted in full to the client’s computer, which may have
implications both for performance and security (e.g. rights management). Adrian
explained that although Soundslice uses MusicXML for the representation of the
music, it is transmitted to the end user’s browser by way of an intermediate
format. He expressed concern that developers and rights holders alike might
encounter obstacles with, and raise objections to, this approach.
Thoughts on scope and feasibility
As the meeting drew towards its close, the attendees returned to the
discussion of what scope the CG can realistically hope to achieve.

Werner Wolff appealed for keeping the scope as broad as possible, while
recognising that the music industry is small in comparison with other
industries, and resources (time and money) are comparatively scarce. However, he
did not want the CG’s work to immediately head to the lowest common
denominator and leave many niches of musical expression on the outside.

Reinhold Hoffmann (Notation Software) countered that the CG’s work must be
market-driven, based on what is feasible from a time and effort perspective,
and geared towards the needs of consumers; in other words, a pragmatic
approach.

Christof Schardt argued that the new representation format must break
compatibility with MusicXML in order to solve the big problems. Michael agreed
that breaking changes would be necessary, but cautioned against making them
purely on the grounds of preferring the elegance of a new solution: to
minimise the effort required of applications and technologies that already
support MusicXML, if a use case is already adequately met by an existing
capability of MusicXML, the CG should not be in a hurry to throw that
capability away simply because a more elegant solution has been found.

The co-chairs thanked the attendees for their participation in the meeting,
which closed with a drinks reception generously sponsored by Newzik.
Attendee list

  Manfred Knauff, Apple
  Dominique Vandenneucker, Arpege / MakeMusic
  Jan Augermüller, self
  Ainhoa Esténoz, Blackbinder
  Sergio Peñalver, Blackbinder
  Gorka Urzaiz, Blackbinder
  Brenda Cameron, Cambrian Software
  Dominik Hörnel, capella software
  Bernd Jungmann, capella software
  Wincent Balin, Columbus Soft
  Christof Schardt, Columbus Soft
  James Sutton, Dolphin Computing
  Hans Jakobsen, Earmaster
  James Ingram, self
  Michael Good, MakeMusic
  Thomas Bonte, MuseScore
  Mogens Lundholm, MusicXML-Player
  Bob Hamblok, neoScores
  Aurélia Azoulay, Newzik
  Pierre Madron, Newzik
  Raphaël Schumann, Newzik
  Reinhold Hoffmann, Notation Software
  Martin Marris, Notecraft Services
  Joe Berkovitz, Noteflight
  Werner Eickhoff, Notengrafik Berlin
  Werner J Wolff, Notengrafik Berlin
  Thomas Weber, Notengrafik Berlin
  Francesca Galofré, Notes in Cloud
  Tomàs Genís, Notes in Cloud
  Leonid Peleshev, self
  Alexander Plötz, self
  Fivos Kefallonitis, PrimaVista
  Jan Rosseel, Scora
  Adrian Holovaty, Soundslice
  Daniel Spreadbury, Steinberg
  Zoltán Kőmíves, Tido
  Ron Regev, Tonara
  Lauri Toivio, Uusinta Publishing
  Robin Kidd, Yamaha



----------

This post was sent on the Music Notation Community Group:
https://www.w3.org/community/music-notation/2016/04/14/musikmesse-2016-meeting-minutes/

Learn more about the Music Notation Community Group:
https://www.w3.org/community/music-notation

Received on Thursday, 14 April 2016 01:51:11 UTC