
Re: MIDI files and streams

From: James Ingram <j.ingram@netcologne.de>
Date: Mon, 05 Mar 2012 12:50:40 +0100
Message-ID: <4F54A890.7000105@netcologne.de>
To: public-audio@w3.org
Sorry, I was a bit quick off the mark there.

I said:
> I think the web should support MIDI at two levels:
> 1. it should support the playing of SMFs in the <audio> tag
> 2. it should support the Javascript interface I'm rooting for.
> I think that Phil is talking about an SMF authoring tool, and that Joe
> is right to ask that that tool should use seconds to describe the
> durations of events.
I then said:
>   The tool would have to *write* SMFs in terms of
> MIDI "tempo" and "ticks", because that's how SMFs do things, but the
> author ought to think simply in terms of seconds.

I should rephrase that: The tool would have to write SMFs in some way 
that could be understood by an SMF interpreter, but the author ought to 
think simply in terms of seconds.

A highly interesting part of Phil's original post, which I hope I now 
understand better, is the following:

Phil said:
> Note that using timestamps requires that one be able to query the
> current time. JavaSound supports timestamps on output but the time query
> did not work (at least in early versions), so a programmer had no idea of
> what value to use for the timestamps.

It looks to me as if Phil wants to create SMFs by recording the input 
from a MIDI input device, and that what he is saying here is that 
timestamps need to be supported not only by output devices, but also by 
input devices. Is that right?

That's something I'd also like to see!



Received on Monday, 5 March 2012 11:51:15 UTC
