Re: wavetable synthesis in games

Hello Werther,

Good point on the relevance of wavetable synthesis for games music - I 
have updated the document accordingly, and will be sending an updated 
revision of the Use Cases & Requirements table shortly.

Olivier

On 17/02/2012 16:42, Werther Azevedo wrote:
>> game with audio and music -- emphasis mine
> As a video game composer, I'd like to support Joe's opinion. With
> wavetable synthesis, it would be possible for developers to come up with
> libraries that enable authors to associate MIDI files with sample banks.
>
> I never understood why so few game engines provide the option to play a
> MIDI file that triggers a wavetable bank. It's essentially the same
> approach that tracker formats use, but in a manner that's approachable
> for piano-roll composers. In my opinion, this approach would provide a
> great level of procedural control over the soundtrack (at least compared
> to plain Ogg/MP3 playback), while keeping the file sizes low, which
> would certainly suit the bandwidth concerns we have on the web.
>
> The XMF format (http://www.midi.org/techspecs/xmf/) had a good take on
> this, but sadly it was never well supported.
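For illustration, a rough sketch of what such a library could look like on
top of the Web Audio API. The bank layout, the event fields and the function
name are illustrative assumptions, not any existing format or standard:

    // Hypothetical sketch: trigger decoded samples from a bank in response
    // to parsed MIDI note events. Assumes `bank` maps MIDI program numbers
    // to { buffer, rootNote } entries and `events` is an array of
    // { time, program, note, velocity } objects produced by a MIDI parser.
    function scheduleMidiEvents(ctx, bank, events) {
      const startTime = ctx.currentTime + 0.1;   // small scheduling headroom
      for (const ev of events) {
        const inst = bank[ev.program];
        if (!inst) continue;                     // no sample for this program
        const src = ctx.createBufferSource();
        src.buffer = inst.buffer;
        // Repitch the single sample to the requested note via simple
        // resampling (pitch and speed change together).
        src.playbackRate.value = Math.pow(2, (ev.note - inst.rootNote) / 12);
        const gain = ctx.createGain();
        gain.gain.value = ev.velocity / 127;
        src.connect(gain);
        gain.connect(ctx.destination);
        src.start(startTime + ev.time);          // ev.time in seconds
      }
    }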
>
> Best,
> Werther Azevedo
> @midipixel
>
> ------------------------------------------------------------------------
> *From:* Joseph Berkovitz <joe@noteflight.com>
> *To:* Olivier Thereaux <olivier.thereaux@bbc.co.uk>
> *Cc:* public-audio@w3.org
> *Sent:* Friday, 17 February 2012 12:09
> *Subject:* Re: Use Cases and Requirements priorities: UC 6 and "Playback
> Rate"
>
> Thank you, Olivier.
>
> I appreciate your taking up my request for a high UC 6 priority, subject
> to the WG process.
>
> With respect to the "simple" playback rate association, I think we are
> in full agreement.
>
> I would argue that the "complex" pitch-compensated playback rate change
> does deserve its own requirement, since it is necessary to support the
> family of language-learning UCs, but I may have missed something from
> which this can be inferred.
>
> Best,
>
> ...J
>
>
> On Feb 17, 2012, at 9:01 AM, Olivier Thereaux wrote:
>
>> Hi Joe,
>>
>> Thank you for your feedback! A short reply for now - more soon.
>>
>> 1) UC6 was put at "middle" priority during our call because there
>> weren't many voices making a case for it (only James thought all except
>> 3, 8, 11 and 12 are high priority, which would make 6 HIGH), but there
>> was no objection to it either. If nobody objects to your message, I see
>> no problem making it so.
>>
>> 2) I tried to keep "playback rate adjustment" to mean the simple
>> requirement (without pitch compensation), assuming that extra processing
>> and filtering would be necessary for the use cases requiring a change of
>> speed without a change of pitch. I may, however, have made a mistake in
>> how I associated this requirement with the 13 use cases. After a quick
>> review, I believe the requirement should also be associated with use
>> cases 2 (game), 3 (audio workstation) and 6 (virtual music instrument),
>> leaving essentially UC 1 (video chat), 7 (visualisation), 8 (UI sounds,
>> where the requirement could possibly apply) and 13 (audio recording) as
>> the use cases without this requirement.
>>
>> If that fits your understanding, I will change the document and table.
>>
>> We also need to decide whether the "complex" requirement (with pitch
>> compensation) needs its own line in the table, or can simply be
>> inferred from the others.
>>
>>
>> Cheers,
>> Olivier
>>
>>
>> -----Original Message-----
>> From: Joseph Berkovitz [mailto:joe@noteflight.com]
>> Sent: Fri 2/17/2012 1:25 PM
>> To: Olivier Thereaux
>> Cc: public-audio@w3.org
>> Subject: Use Cases and Requirements priorities: UC 6 and "Playback Rate"
>>
>> Hello everyone,
>>
>> Sorry to have been unable to participate for most of this week -- and
>> thank you Olivier for the prodigious work of mapping and summarizing.
>>
>> I think this is tremendous progress and have a few points in response.
>>
>> 1. As you can imagine I would like to see UC 6 (Wavetable synthesis)
>> elevated to a "high" priority, not only because I think it enables a
>> substantial landscape of music applications but because I believe it
>> implicitly supports the musical aspects of UC 2 (game with audio and
>> music -- emphasis mine).
>>
>> 2. I think a significant mistake has occurred with respect to the
>> requirement "Playback rate adjustment". This phrase has unfortunately
>> carried two different meanings in our discussion and I think we've
>> become confused:
>>
>> A. Upsampling/downsampling with an arbitrary time factor, resulting in
>> proportionate changes to *both pitch and speed* -- commonly thought of
>> as simply "playing a sample faster or slower". This is an extremely
>> basic and important function in audio processing, and is already
>> implemented in Chris Rogers' API as the "playbackRate" property of an
>> AudioBufferSourceNode. It requires little more than interpolation
>> between nearby samples and is very easy to standardize. It does not
>> introduce artifacts.
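For illustration, a minimal sketch of (A) using that property, assuming an
existing AudioContext named ctx and a previously decoded AudioBuffer named
buffer (start() is the later name for what the early drafts called noteOn()):

    // Play a decoded sample at 1.5x speed: playbackRate raises the pitch
    // and shortens the duration together, as described above.
    const src = ctx.createBufferSource();
    src.buffer = buffer;
    src.playbackRate.value = 1.5;     // 1.0 = original rate
    src.connect(ctx.destination);
    src.start();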
>>
>> B. Adjustment to *only* the playback speed of audio media, with *no
>> effect on pitch*. This is a much fancier and more complex type of
>> transformation since interpolation does not do the trick: one must add
>> or delete audio material in a way that manages to preserve the
>> subjective impression of the original sound. There are multiple
>> competing algorithms that tend to introduce different kinds of
>> artifact. I demonstrated this function at the F2F.
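This is not the algorithm demonstrated at the F2F; purely for illustration, a
minimal overlap-add (granular) sketch of the idea, built from the same
buffer-source primitives, with made-up parameter values:

    // Hypothetical sketch of (B): play `buffer` stretched to `stretch`
    // times its original duration without changing pitch, by scheduling
    // short overlapping grains whose source read position advances more
    // slowly (stretch > 1) or faster (stretch < 1) than output time.
    function playTimeStretched(ctx, buffer, stretch) {
      const grainSize = 0.08;                  // grain length in seconds
      const hop = grainSize / 2;               // 50% overlap between grains
      const t0 = ctx.currentTime + 0.1;
      const outDuration = buffer.duration * stretch;
      for (let when = 0; when < outDuration; when += hop) {
        const offset = Math.max(0,
            Math.min(when / stretch, buffer.duration - grainSize));
        const src = ctx.createBufferSource();
        src.buffer = buffer;
        const env = ctx.createGain();          // fade each grain in and out
        env.gain.setValueAtTime(0, t0 + when);
        env.gain.linearRampToValueAtTime(1, t0 + when + grainSize / 2);
        env.gain.linearRampToValueAtTime(0, t0 + when + grainSize);
        src.connect(env);
        env.connect(ctx.destination);
        src.start(t0 + when, offset, grainSize);
      }
    }

The grain boundaries and crossfades are exactly where the competing
algorithms differ, and where the different kinds of artifact come from.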
>>
>> Unfortunately we've conflated the two requirements. The "playback rate
>> adjustment" required by UC 6 is the very simple case (A), while the
>> one referred to in all the other cases is the complex case (B). Let's
>> try to disambiguate by renaming (B) as "speed shift", and adjusting
>> the requirement name/definition in all the cases except UC 6. (There
>> is also a "pitch shift" that turns out to be very like (B) in terms of
>> complexity.)
>>
>> 3. When it comes down to the actual requirements (not the UCs), the
>> only missing requirement for UC 6 is (2A), the simple playbackRate
>> adjustment already implemented in Chris's API. Likewise, UC 2 (games)
>> will need to make frequent use of (2A) to vary the pitch of sound
>> effects in a simple way (e.g. collision sounds between objects of
>> varying sizes).
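For example, a game could derive the rate from object size along these lines
(a purely hypothetical mapping, not part of any API):

    // Hypothetical: reuse one collision sample for objects of varying sizes
    // by playing it faster (higher pitch) for small objects and slower
    // (lower pitch) for large ones.
    function playCollision(ctx, collisionBuffer, objectSize, referenceSize) {
      const src = ctx.createBufferSource();
      src.buffer = collisionBuffer;
      // Rate inversely proportional to size, clamped to a sensible range.
      src.playbackRate.value =
          Math.min(2, Math.max(0.5, referenceSize / objectSize));
      src.connect(ctx.destination);
      src.start();
    }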
>>
>> In summary, I have a simple request that I will probably keep on
>> repeating: let's keep the "simple" playback rate adjustment (2A) as a
>> high priority requirement. It is going to be a basic function of UC 2,
>> and it is the only thing standing in the way of UC 6. I think (2B) is
>> safe to put on the back burner.
>>
>> Thanks!
>>
>> ... . . . Joe
>>
>> Joe Berkovitz
>> President
>>
>> Noteflight LLC
>> 84 Hamilton St, Cambridge, MA 02139
>> phone: +1 978 314 6271
>> www.noteflight.com
>>
>>
>> On Feb 16, 2012, at 9:41 AM, Olivier Thereaux wrote:
>>
>> > Hi all,
>> >
>> > At our meeting this week, we discussed which use cases were deemed to
>> be "highest priority", and I took the action to look at whether the
>> "high priority" UCs were covering a large number of our requirements.
>> >
>> > TL;DR: 23 of 28 requirements are covered by the "high priority" use
>> cases. And there are other ways to split them which may help us with
>> setting the "level 1" goalposts.
>> >
>> >
>> > Now for the detail:
>> >
>> > 1) I updated the wiki page for use cases and requirements with a
>> mention of priority. At the moment, use cases 1 (Video Chat), 2 (HTML5
>> game with audio, effects, music) and 7 (Audio/Music visualization) are
>> HIGH priority, and UC6 (Wavetable synthesis of a virtual music
>> instrument) is MEDIUM priority. All the others (including the newly
>> added use case 13 for audio recording+saving) are LOW priority.
>> >
>> > These priorities have gone unchallenged since the meeting on Monday,
>> and since my pointer to the minutes two days ago, so I assume we have
>> consensus?
>> >
>> >
>> > 2) I made a detailed mapping of the 13 uses cases and 28
>> requirements. See e.g:
>> > http://www.w3.org/2011/audio/wiki/Use_Cases_and_Requirements#UC1_.E2.80.94_Related_Requirements
>> >
>> > This is the part I have done in good faith, but I am sure there are
>> mistakes. For many use cases it wasn't entirely clear whether
>> "Modularity of Transformation" or "Dynamic Compression" was required
>> (the latter because I am not enough of an expert to be certain).
>> Likewise, some of our requirements (such as "Audio Quality") are very
>> vague and hardly usable, and others ("Support for basic polyphony" and
>> "Mixing Sources") are near-equivalent.
>> >
>> > That said, I think the map is a flawed but reasonable approximation
>> of the territory. I have re-mapped it to the table I had sent a couple
>> of weeks ago. I am attaching the latest version to this mail. The
>> visual representation makes it easy to know which requirements are
>> associated with more or fewer use cases.
>> >
>> >
>> > Interestingly, if I count only whether a requirement is associated
>> with a high-priority use case, I find that 23 of our 28 requirements are.
>> >
>> > The exceptions:
>> > * Playback rate adjustment
>> > * Dynamic range compression (possibly my mistake)
>> > * Generation of common signals for synthesis and parameter modulation
>> purposes
>> > * The ability to read in standard definitions of wavetable instruments
>> > * Acceptable performance of synthesis
>> >
>> > Alternatively, we could split requirements thus:
>> >
>> > * 9 Requirements are shared by more than half of the UCs
>> > Support for primary audio file formats
>> > Playing / Looping sources of audio
>> > Support for basic polyphony
>> > Audio quality
>> > Modularity of transformations
>> > Transformation parameter automation
>> > Gain adjustment
>> > Filtering
>> > Mixing Sources
>> >
>> > * 14 Requirements shared by less than half of the Use Cases, but
>> required by HIGH priority UCs
>> > One source, many sounds
>> > Capture of audio from microphone, line in, other inputs
>> > Sample-accurate scheduling of playback
>> > Buffering
>> > Rapid scheduling of many independent sources
>> > Triggering of audio sources
>> > Spatialization
>> > Noise gating
>> > The simulation of acoustic spaces
>> > The simulation of occlusions and obstructions
>> > Ducking
>> > Echo cancellation
>> > Level detection
>> > Frequency domain analysis
>> >
>> > * 5 Requirements shared by less than half of the UCs and not required
>> by HIGH priority UCs
>> > Dynamic range compression
>> > Playback rate adjustment
>> > Generation of common signals for synthesis and parameter modulation
>> purposes
>> > The ability to read in standard definitions of wavetable instruments
>> > Acceptable performance of synthesis
>> >
>> >
>> >
>> > Thoughts? Opinion on whether this is helpful? Glaring mistakes in the
>> process? Other ways you'd go at it?
>> >
>> >
>> > Thanks,
>> > --
>> > Olivier
>> > <Audio-UseCases-Requirements-Map.html>
>>
>
> ... . . .. Joe
>
> *Joe Berkovitz*
> President
>
> *Noteflight LLC*
> 84 Hamilton St, Cambridge, MA 02139
> phone: +1 978 314 6271
> www.noteflight.com
>
>
>

-- 
Olivier Thereaux
BBC Internet Research & Future Services

Received on Monday, 20 February 2012 15:59:41 UTC