Re: Changes to "HTML5 game" use case

From: Samuel Goldszmidt <samuel.goldszmidt@ircam.fr>
Date: Wed, 08 Feb 2012 15:32:34 +0100
Message-ID: <4F328782.8000003@ircam.fr>
To: public-audio@w3.org
Hello James,

For UC5, you are right; I just wanted to insist on the fact that we 
could be more precise in this UC description.
We could propose how interaction could be specified between the Web 
Audio API and other specs (and the DOM).
There are two possible ways to synchronise a music score with audio:
1. during performance, the score, performed by an instrumentalist, can 
modify the playback rate of an audio signal associated with the score 
(for example, the electronic part of an electroacoustic work with live 
electronics);
2. during playback, it is the opposite: the audio can be the master 
which synchronises the score.
These two examples require accurate scheduling, and multiple ways to 
interact with the musical representation.
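As a rough sketch of case 2 (audio as master), assume a score represented 
as a list of timed events. In a browser, the clock driving this would be 
AudioContext.currentTime; here a plain function stands in for it, and all 
names are my own illustration, not part of any spec.

```javascript
// Hypothetical score: events with start times in seconds.
const score = [
  { time: 0.0, note: "C4" },
  { time: 0.5, note: "E4" },
  { time: 1.0, note: "G4" },
];

// Return the index of the score event that should currently be
// highlighted for a given audio clock time (-1 before the first event).
function scoreCursor(score, audioTime) {
  let cursor = -1;
  for (let i = 0; i < score.length; i++) {
    if (score[i].time <= audioTime) cursor = i;
    else break;
  }
  return cursor;
}
```

On each animation frame, the display code would call scoreCursor with the 
current audio time and move the visual cursor accordingly; the audio stays 
the master, and the representation follows.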

For MIDI: it is a de facto music-industry standard, but one which works 
well. MIDI could be seen as one possible way to control audio 
transformations (and it is very often used for that), but we could have 
other controls too (using, for instance, touch events etc.)
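To illustrate the point that MIDI is just one controller among others, 
here is a small sketch (names are my own invention) where a MIDI 
control-change and a touch gesture both funnel into the same 
parameter-setting function:

```javascript
// MIDI control-change values run 0..127; map them linearly onto a
// 0..1 gain, clamping out-of-range input.
function ccToGain(ccValue) {
  return Math.min(Math.max(ccValue, 0), 127) / 127;
}

// Any controller, MIDI or not, drives the same setter.
function makeGainController(setGain) {
  return {
    onMidiCC: (value) => setGain(ccToGain(value)),
    // Touch as an alternative: vertical position mapped to gain.
    onTouchMove: (y, height) => setGain(1 - y / height),
  };
}
```

The audio transformation itself does not need to know which kind of 
controller produced the value.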

On 08/02/12 14:54, James Ingram wrote:
> Hi Samuel,
> You said:
>> For UC5, could we add support for custom music-notation elements
>> based on font files? I think this is the way Sibelius and others
>> achieve this, no? (Also, browsers are already able to load custom
>> fonts in CSS3 with @font-face rules.)
> Forgive me if I haven't understood you correctly, but I think that 
> graphics should be kept very strictly out of the audio specifications. 
> SVG uses @font-face to load custom fonts, so the audio group should 
> not have to worry about that. Sibelius will have to write SVG if they 
> want to create files which are both displayable and interactive in 
> browsers. Maybe they can already -- I don't know.
> You also said
>> We could also have pitch-shift and time-stretch effects, which are very
>> useful when putting samples together at a specific tempo using a DAW.
> Reading this, it occurred to me that it would be possible to include 
> non-MIDI instructions in SVG files too. I've been embedding MIDI, but 
> information specific to a particular JavaScript synthesizer could 
> easily be used instead. There's no real reason why virtual 
> synthesizers have to support MIDI.
> best,
> James
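As a hedged sketch of what James describes above (the attribute names are 
my own illustration, not an existing convention), synthesizer-specific, 
non-MIDI instructions could be carried in an SVG score via data-* 
attributes on the graphical elements:

```xml
<!-- Hypothetical: a note head annotated with instructions for a
     particular JavaScript synthesizer; attribute names are invented. -->
<svg xmlns="http://www.w3.org/2000/svg" width="200" height="50">
  <g class="note" data-synth-pitch="440" data-synth-dur="0.5">
    <ellipse cx="20" cy="25" rx="5" ry="4"/>
  </g>
</svg>
```

A script could read those attributes when an element is clicked, or while 
a cursor traverses the score, and feed them directly to the synthesizer.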
Received on Wednesday, 8 February 2012 14:36:40 UTC
