Re: Getting Off The Ground

Hi All,

By way of background, I am a full-stack web developer and have been working
a lot with MusicXML over the past year or so. I am also finishing my PhD
dissertation, and part of the project has involved building a web
application that is a search engine for music scores.

As a web developer, I find there are some great and some not-so-great
things about working with MusicXML. It is certainly fantastic to have a
comprehensive specification that provides a machine-readable data set.

Having said that, whenever I work with MusicXML with a view to making a
web application, I will always strip out whatever is relevant from the
MusicXML and convert it to JSON.

I guess this approach raises a couple of questions. First, what is the
point of manipulating an already well-formed XML data set and converting it
to a different format, and second, why favour JSON over XML?

To answer the first question, I find that I always need to do some
initial clean-up of the MusicXML data. Consider the following use case:
I might start with a printed score and use something like
Sibelius PhotoScore to read it into a computer. If it's a complex score
(an orchestral score, for instance), I will need to do a lot of clean-up
in PhotoScore. Then I need to export it to a program like Sibelius and
do some more clean-up to ensure all the details are correct. Then I
export it as MusicXML that I can work with. This process leaves me with a
file that I know will render nicely in programs like Sibelius or MuseScore.
But I can't really rely on it as a data store in a web service, because
additional formatting will usually creep in during the whole process. So
some kind of clean-up is needed. My usual approach is to parse the MusicXML
with Python, get rid of any inconsistencies, and then export it as JSON
ready for the web.
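To make that pipeline concrete, here is a minimal sketch of the Python
clean-up step. The MusicXML fragment and the JSON shape are purely
illustrative (a real export carries far more detail, and which fields you
keep depends entirely on your application):

```python
import json
import xml.etree.ElementTree as ET

# A tiny MusicXML fragment standing in for a PhotoScore/Sibelius export.
MUSICXML = """<score-partwise version="3.0">
  <part id="P1">
    <measure number="1">
      <note>
        <pitch><step>C</step><octave>4</octave></pitch>
        <duration>4</duration>
        <type>quarter</type>
      </note>
      <note>
        <pitch><step>E</step><octave>4</octave></pitch>
        <duration>4</duration>
        <type>quarter</type>
      </note>
    </measure>
  </part>
</score-partwise>"""

def musicxml_to_json(xml_string):
    """Keep only the musical content (part, measure, pitch, duration),
    dropping layout and formatting detail that crept in upstream."""
    root = ET.fromstring(xml_string)
    records = []
    for part in root.findall("part"):
        for measure in part.findall("measure"):
            for note in measure.findall("note"):
                pitch = note.find("pitch")
                if pitch is None:  # skip rests in this sketch
                    continue
                records.append({
                    "part": part.get("id"),
                    "measure": int(measure.get("number")),
                    "step": pitch.findtext("step"),
                    "octave": int(pitch.findtext("octave")),
                    "duration": int(note.findtext("duration")),
                })
    return json.dumps(records)

print(musicxml_to_json(MUSICXML))
```

The resulting flat list of note records is then trivial to load into a
document store or send to the browser.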

This brings me to the second question: why JSON? Well, if I am working on
a web application, JSON is great to work with. I will often house the JSON
data in a MongoDB database, which plays really nicely with web frameworks
such as AngularJS. Having the data as JSON in MongoDB also means I can
confidently and efficiently perform aggregations on it. And this raises an
important point: while it's important to have a data specification that
allows us to create beautifully rendered music notation on the page, we
might also want to perform some kind of analytics on the data. As an
example, suppose I have a data set of all the Bach chorales. I wouldn't
want to use this data only as a means of rendering the notation. I might
want to use the same data set to explore (or provide a way for the user to
explore) the kinds of things Bach was doing when writing these
compositions.
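To give a flavour of what I mean by aggregation, here is a pure-Python
sketch; in MongoDB the same question would be expressed as a single $group
stage in an aggregation pipeline. The note records here are hypothetical
stand-ins for the JSON I export:

```python
from collections import Counter

# Hypothetical note records of the shape exported from MusicXML.
# In MongoDB the equivalent query would be roughly:
#   db.notes.aggregate([{"$group": {"_id": "$step",
#                                   "count": {"$sum": 1}}}])
notes = [
    {"chorale": "BWV 269", "step": "G", "octave": 4},
    {"chorale": "BWV 269", "step": "E", "octave": 4},
    {"chorale": "BWV 269", "step": "G", "octave": 4},
    {"chorale": "BWV 270", "step": "B", "octave": 3},
]

def step_frequencies(records):
    """Count how often each pitch step occurs across the data set."""
    return Counter(r["step"] for r in records)

# Most frequent pitch step in this toy data set.
print(step_frequencies(notes).most_common(1))
```

The point is that the exact same records drive both the rendering layer
and this kind of analytical query.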

I think this second point raises an interesting issue: MusicXML on the web
will not just be used to render music notation. It also needs to function
as a robust data set that can be interrogated in order to gain insight into
music and compositional practice (check out
http://biodigitaljazz.org/2015/05/04/hello-world/ for a primitive example
of this kind of thing).

The other thing I think is important to keep in mind is that there are a
multitude of ways we might want to visualise this kind of data, not just as
standard music notation. Traditionally, the principal role of music
notation seems to be providing musicians with a linear set of instructions
about what and when to play. But in the web application environment, we
might want to look at this data in different ways. Perhaps I want to use
the data to let users explore which combinations of instruments Mahler used
in certain situations. All of this is in the MusicXML data, and it's
probably not something I would want to view in a score-type view, but I
still want to see it in some kind of visualisation.
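A sketch of the kind of question I mean, using hypothetical per-measure
orchestration records pulled from the same JSON (the instrument names and
record shape are invented for illustration):

```python
from collections import Counter

# Hypothetical per-measure data: which instruments actually sound
# in each measure of a movement.
measures = [
    {"measure": 1, "playing": ["flute", "oboe", "harp"]},
    {"measure": 2, "playing": ["flute", "oboe", "harp"]},
    {"measure": 3, "playing": ["horn", "celli"]},
]

def combination_counts(data):
    """Count how often each exact instrument combination occurs,
    treating a combination as an unordered set of instruments."""
    return Counter(frozenset(m["playing"]) for m in data)

for combo, n in combination_counts(measures).most_common():
    print(sorted(combo), n)
```

Output like this is exactly what I would then hand to a custom
visualisation rather than a score renderer.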

The last thing I will say (sorry for such a long email!) is that when I am
working with this kind of data I often use the data visualisation library
D3.js. So I take the MusicXML data that has been converted to JSON and
build whatever visualisation I need as custom SVG in D3.js. I have found
this to be a really neat approach, as I can create SVG elements that are as
custom as I want them to be, while still using the same underlying data set
for aggregations and searching. So if a different way of doing music
notation on the web emerged through this group, I would need it to offer
greater control and flexibility than my current approach before I could get
on board with it.

Jamie Gabriel
http://biodigitaljazz.org/stelupa/


On Fri, Sep 18, 2015 at 12:28 AM, Joe Berkovitz <joe@noteflight.com> wrote:

> Hi Group Members,
>
> Thanks for your patience during the Northern-summer vacation season as
> things slowed down and many were out of town and offline. Even so, many new
> members and organizations continued to join the Music Notation CG, bringing
> the total count to an impressive 191.
>
> Now, we’re finally at the point of getting started and actually doing
> something. With this post, the co-chairs hope to set a rough agenda for
> lots of good work to come. It’s also an opportunity to ask the membership
> to supply thoughts on a number of points where your input will be very
> helpful.
>
> SO... NOW WHAT?
>
> Let’s begin by reviewing some of the main areas in which we hope this CG
> can make progress. Of course we can’t do everything at once, although we
> can pursue some limited set of goals in parallel. We’ve broken these up
> into short-term projects that we think can be completed in the coming 6 to
> 12 months, and longer-term projects that can begin to be addressed in
> parallel with the short-term work, perhaps by separate subgroups within the
> CG.
>
> SHORT TERM PROJECTS
>
> Build an initial MusicXML specification. The aim of this initial document
> is tactical in nature: it needs to resolve the most significant ambiguities
> and gaps faced by developers working with the current version of MusicXML.
> It should also provide a framework for later, more complete specifications,
> and can serve as a version-controlled container for new MusicXML features
> going forward. This initial spec will be incomplete by design, though, and
> will still coexist/overlap with the current XSD documentation.
>
> Add support for use of SMuFL glyphs within MusicXML. MusicXML needs to
> include some new constructs and documentation that allow SMuFL glyphs to be
> employed usefully. The symbolic vocabulary of MusicXML must grow to support
> some new SMuFL notations. MusicXML must also be able to specify the use of
> SMuFL glyphs in already-supported notations (e.g. “use this SMuFL notehead
> for this note”). More fundamentally, MusicXML must define the manner in
> which SMuFL glyphs are joined to each other and registered with respect to
> relative or default X/Y locations.
>
> Identify and fix any remaining gaps or adoption barriers in SMuFL. We are
> at a point in this venture at which any serious problems or barriers to
> adoption need to be identified and fixed in SMuFL. It will be hard or
> impossible to fix such problems later.
>
> Document music notation use cases. We need to begin to develop a separate
> document that covers and prioritizes the use cases that the CG’s work will
> support, to aid in evaluating the many alternative proposals and solutions
> that will come up.
>
> LONGER TERM PROJECTS
>
> Improving formatting support in MusicXML. MusicXML 3.0 formatting cannot
> easily be shared between documents. Nor can it distinguish formatting that
> clarifies semantics, such as for collision avoidance, from formatting that
> is more a matter of house style, such as font choices and spacing
> preferences. Could CSS stylesheets help solve these issues and provide more
> powerful formatting support for a wider variety of use cases?
>
> Build a complete MusicXML specification document. A long-standing MusicXML
> community request has been to build a complete specification. This would
> replace the XML Schema as a specification and address holistic or
> cross-cutting matters that do not belong to any single schema component.
>
> Adding Document Object Model (DOM) manipulation and interactivity to
> MusicXML. What would it take to be able to create interactive music
> applications on top of any standard MusicXML rendering engine? MusicXML was
> not designed with DOM interactivity in mind. Is the current document
> structure sufficient, perhaps with some minor adjustments? Or does the use
> of a time cursor that can move forward and backward, combined with the
> current structure, inhibit DOM interactivity? Would this require a more
> structural solution such as revisiting the MusicXML element hierarchy?
>
> WHAT’S NEXT?
>
> This is where the co-chairs can use your help. We’d like to ask you to
> answer the following questions:
>
> - Are these the right major goals? What’s missing? What should go?
>
> - Are we picking the correct short-term projects to start with?
>
> - Have we defined the short-term projects properly?
>
> - What would you most like to see done with MusicXML right away?
>
> - What would you most like to see done with SMuFL right away?
>
> At this point we are looking for input from the membership. It’s tempting
> to indulge in a wide-ranging debate, but at this stage it’s going to be
> difficult to reach a conclusion through a large email discussion. So we
> want to begin by hearing people’s thoughts. Please send your thoughts to
> this list at public-music-notation-contrib@w3.org.
>
> Thank you again for your interest in the Music Notation Community Group.
> We are looking forward to hearing your thoughts as to how you would like
> the group to proceed.
>
> Best,
>
> Joe Berkovitz
> Michael Good
> Daniel Spreadbury
>
> W3C Music Notation Group Co-Chairs
>
>

Received on Wednesday, 23 September 2015 13:30:41 UTC