A copyist's perspective

Hi all,

I was very busy with other things yesterday, and the list has exploded 
since my last post, so it's a bit difficult to answer everyone's
objections directly.

The best thing I can do, this close to Frankfurt, is to provide a 
summary of my present position.

Taking it from the top, we currently have:
MusicXML --> MusicNotationEditor --> SVG (--> (timeless) paper printout)

There is also
MusicXML --> MusicNotationEditor --> StandardMidiFile (--> (spaceless) 
SMF performance)
but I think it's a mistake to think that <event> symbols only have a
single meaning, so the StandardMidiFile currently exported by MNEditors 
can only be thought of as one among many interpretations of the score.

I think we need both:
(timeless) MNX --> MNEditor --> SVG (--> (timeless) paper printout) as 
at present, and
MNX+synchronised temporal info --> MNEditor --> SVG+synchronised 
temporal info

As I outlined in
http://lists.w3.org/Archives/Public/public-music-notation-contrib/2017Apr/0009.html,
spatial perception of a score begins by regarding it as a set of pages. 
But pages are instantiated by the MNEditor, so they don't form part of 
MNX's container hierarchy.

The main object of current CWMN MNEditors is to help composers write 
their scores legibly, and with minimum effort. To do this, they provide 
default layouts based on engraving rules that have evolved over 
centuries. These rules are purely spatial. The reason that CMN is so 
effective is that it contains advanced visual cues that aid the reader 
in chunking the information. Other, less complicated, notations can be 
regarded as using a subset of the CMN constructs. Scores are meant to be 
read by humans (who read in chunks, not strictly parallel to the arrow 
of time).

Issue #1 Graphical scores: I don't really understand what Joe means in 
his §3.1.2 by a "graphical score". Joe: Do you mean a score that has no 
"arrow of time"? Like many Bussotti or Haubenstock-Ramati scores? Such 
scores are easy to synchronize with playback. All you have to do is 
display the score while the recording is playing, and leave it to the 
audience to perceive any connections. In this case, the score is just 
one big <event> symbol.

Issue #2 Mobile Scores: Many 20th century scores are written as 
"mobiles". They have containers that can be played in more than one 
order. The arrow of time is determined by (text) instructions to the 
players. All this means is that a performing application could read an 
MNX file containing a default arrow of time (order of performance of the 
containers) but play the containers back in some other order. (Example: 
Stockhausen's /Momente/, /Refrain/, Pousseur's /Caractères/, etc.) This 
form of score has not really caught on. Stockhausen realized fixed 
versions of both /Momente/ and /Refrain/ because he felt that some 
arrangements of the containers were, after all, better than others, and 
that a real performance always has a single arrow of time.

The arrow of time in (CWMN) MNX:
In order for MNX to be readable in one pass, I think its containers 
should strictly reflect the score's graphic structure. Elements that are 
graphically inside a particular container should be inside that 
container in the MNX.
I also think that the arrow of time should flow consistently from top to 
bottom in the MNX file. The only exception is for polyphonic music, in 
which parallel <staff>s or <voice>s represent parallel temporal threads. 
But the order of the <system>s in the file, and of the <event>s in a 
<voice>, should be the order in which they are (by default) performed.

The Container Hierarchy and Profiles:
If one of the containers is going to be <measure>, then the standard 
container hierarchy for CMN could be:
<system> - <measure> - <staff> - <voice> - <event>.
These containers contain attributes and graphic elements that are not 
themselves containers, e.g. <clef>, <annotation>, etc.
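To make the nesting concrete, here is a minimal sketch (purely 
illustrative; nothing here is settled MNX syntax):

    <system>
      <measure>
        <staff>
          <clef/>              <!-- a non-container graphic element -->
          <voice>
            <event/>           <!-- events appear in performance order -->
            <event/>
          </voice>
        </staff>
      </measure>
    </system>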
In CMN, there are standard directions in which the graphics are read. In 
other words, the arrow of time flows in particular directions in CMN.
As I understand it, Profiling is going to be used to determine, for 
example, that the <event> container contains the graphic definition of a 
StandardChordSymbol (noteheads, stem, accidentals, accents, ornament 
signs, dynamic, beams, etc.). This tells the parser which information to 
expect in the file. The StandardChordSymbol definition will be abstract 
in MNX, but instantiated by the MNEditor.
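As a sketch (the child element names are my invention, standing in for 
whatever the Profile actually defines), such an <event> might look like:

    <event>
      <notehead/>      <!-- hypothetical graphic children -->
      <stem/>
      <accidental/>
      <beam/>
      <dynamic/>
    </event>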

Issue #3: Could Profiles also be used to
a) say that a particular MNX file uses a different container hierarchy?
For example, an instrumental part or a score for a single player could use:
<system> - <measure> - <event> or even
<system> - <event>
b) tell the client application the optimal relation between the graphics 
and the arrow of time?
The arrow of time will be top to bottom in all MNX files, but if, for 
example, the Profile says that the graphic definition of the <event> 
symbol is going to be some Japanese tablature symbol, it might also say 
that the preferred instantiation would be in a score in which the 
<system>s are read from right to left, and the <measure>s from top to 
bottom. The client application could, of course, write a score with a 
"Western" orientation anyway, but it would still then need to know how 
it should orient the tablature symbol.
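Purely as a sketch of the idea (all element and attribute names are 
hypothetical), such a Profile might say:

    <profile name="japaneseTablature">
      <!-- reduced container hierarchy -->
      <containers>system, measure, event</containers>
      <!-- preferred reading directions when instantiating a score -->
      <reading systems="right-to-left" measures="top-to-bottom"/>
    </profile>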

----

Detailed Temporal Information (relevant to Performance Practice):
Thus far, the only temporal information I've mentioned relates to the 
order in which <system>s or <event>s are played. That's just a simple 
before-after relationship.
As Joe said, there are three ways to synchronize the graphics with more 
detailed temporal data:

1) Embed an element (or list of elements) containing purely temporal 
information inside each <event>.
Since the MNEditor deals only with graphics, it can ignore any purely 
temporal element contained by the <event> containers in the imported 
MNX, and simply copy the data unchanged to its SVG output.
The MNEditor could, of course, itself be a performing application, quite 
apart from any files it might export.
The obvious choice of data format for the embedded element is MIDI, 
since that gives the client app maximum flexibility (speed changes 
during performance, pitch transposition relative to the notation, 
conducting options, etc.)
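In sketch form (the <midi> element and its content are my invention, 
not proposed syntax), the embedding might look like this:

    <event>
      <!-- graphic definition of the symbol goes here -->
      <midi>
        <!-- purely temporal data; the MNEditor ignores it and
             copies it unchanged to its SVG output -->
        <noteOn pitch="60" velocity="100"/>   <!-- middle C -->
        <noteOff pitch="60" after="500"/>     <!-- milliseconds -->
      </midi>
    </event>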
Issue #4 Embedding small audio files: Could a similar functionality be 
achieved using small audio files? That would involve some very fancy 
data manipulation, but maybe it's not impossible.

2) Include a timestamp in each <event> pointing at an external recording 
(as done by MEI/Verovio)
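In sketch form (attribute names invented for illustration), each 
<event> would simply carry something like:

    <event start="12.34" end="12.92" recording="performance.mp3"/>
    <!-- start/end in seconds, measured in the external recording -->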
Unfortunately, following the temporary demise of my main computer, I 
currently have no access to the conversations I had with the Verovio 
team last December. I have, however, found the following reference in 
Verovio Issue #379. On Dec 16, 2016, I said:

> [...] following @craigsapp <https://github.com/craigsapp>'s comments, 
> I'm no longer sure that Verovio's midiplayer and vrvToolkit have 
> really solved the underlying problem, so that approach may not be 
> viable after all.
As I remember, Craig was referring to the problem of synchronizing two 
independent system threads. Maybe someone could ask Craig what the 
problem was. Or maybe Joe could throw some light on this.
I imagine that there will, after all, be a solution in this direction, 
but it won't involve MNX. The only thing MNX can do is provide the 
timestamp. Synchronizing timestamps is not MNX's problem. Note that even 
if this approach can eventually be made to work, it will never provide 
the degree of control available in solution 1) using MIDI.

3) Have a separate document that synchronises both the <event>s and the 
external file.
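Such a document might look roughly like this (hypothetical syntax, 
assuming the <event>s carry IDs that can be referenced):

    <synchronization score="score.mnx" recording="performance.mp3">
      <sync event="ev1" time="0.00"/>   <!-- time in seconds -->
      <sync event="ev2" time="0.48"/>
    </synchronization>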
This solution suffers from the same drawbacks as 2).

I think that's about it. Hope it helps. Have Fun! :-)

All the best,
James

-- 
https://github.com/notator
http://james-ingram-act-two.de
