Re:

Hi Olivier,

Good that I could be of help. Your proposed changeset is great in my opinion.

Best,

Werther Azevedo
---------------------------------------------------------------
Creative Director > Nano Studio
Composer > Escape Into
Audio teacher > PUC-RIO
@midipixel


________________________________
 From: olivier Thereaux <olivier.thereaux@bbc.co.uk>
To: Werther Azevedo <midipixel@yahoo.com> 
Cc: Audio Working Group <public-audio@w3.org> 
Sent: Thursday, 6 September 2012 12:28
Subject: Re: 
 
Hi Werther,

Great suggestions for the game/music scenario. 
I've attempted to capture them in the following changeset:
http://dvcs.w3.org/hg/audio/rev/c5ecdca5be3f

Latest draft here:
https://dvcs.w3.org/hg/audio/raw-file/tip/reqs/Overview.html

Thanks,
Olivier


On 5 Sep 2012, at 04:20, Werther Azevedo wrote:

> Hi Joe and group.
> 
> Nice to see that the use cases document has evolved. I can only dream of the day when these use cases can be implemented across different browsers and devices. 
> 
> For now, though, my feedback concerns the two use cases in my areas of expertise: games and music making.
> 
> 2.2 3D game with music and convincing sound effects
> Although there is a nice description of the ways in which sound effects can be manipulated to match player actions, I would like to see more detail on the musical side of the soundscape. The only time music is mentioned is in the context of crossfading between different soundscapes. I would like a more specific account of how musical structure could change in response to interactive events. Musical properties (such as tempo, time signature, note properties and envelopes, to name a few) could react to:
> 
> - Player states, such as: low health, imminent defeat, proximity to a certain goal, etc.
> - Game/environmental changes, such as: an ending timer, weather/season variation, moving between places and worlds
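> The sort of mapping I have in mind could be sketched as follows (the state fields, parameter names, and numbers are purely hypothetical, for illustration only):

```javascript
// Hypothetical mapping from player/game state to musical parameters.
// Every name and number here is invented for illustration.
function musicParamsFor(state) {
  const lowHealth = state.health < 0.25;
  return {
    tempoBpm: lowHealth ? 150 : 110,           // faster, more urgent pulse
    minorMode: lowHealth || state.timerEnding, // darker harmony under pressure
    layerCount: 1 + Math.round(state.goalProximity * 3), // thicken as goal nears
  };
}

console.log(musicParamsFor({ health: 0.1, timerEnding: false, goalProximity: 1 }));
// → { tempoBpm: 150, minorMode: true, layerCount: 4 }
```

> Each parameter would then drive note scheduling or layer mixing through the Web Audio API.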
> 
> I know that this kind of functionality is covered in use cases 2.3 and 2.5, but I think that it should also be mentioned in the 3D game use case. I understand that the use case tries to cover the common ways in which modern games manipulate audio. But I'm concerned exactly with the way they don't! The game industry lost a great deal of control over music when the technology shifted from MIDI/MOD (which allowed access to note properties) to PCM/Redbook Audio. The Web Audio API is a great chance to make music really interactive once again, in an open way that's accessible to any developer, regardless of middleware. That's why I'll always make a case for this aspect to be apparent in a working draft or any official documentation.
> 
> Another kind of musical variation, not necessarily tied to note data, is the multitrack aspect of the soundscape, where layers of sound can be added or removed by dynamic control of simultaneously playing channels. This is pretty common in modern games and could easily be included in the use case. (e.g.: "As the soundscape changes, bringing a more somber, scary atmosphere to the scene, the once-full orchestral underscore is slowly reduced, instrument by instrument, to a lonely, echoing cello.")
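> A minimal sketch of this per-layer control, assuming a browser AudioContext with one GainNode per instrument stem (the fade helper is plain arithmetic; all names are illustrative):

```javascript
// Sketch of per-layer mixing for an adaptive score.
// The gain curve itself is plain arithmetic; the Web Audio wiring is
// shown in comments because it needs a browser AudioContext.

// Linear gain for a fade from `from` to `to` over `duration` seconds,
// evaluated at elapsed time `t` (clamped to the fade's endpoints).
function fadeGain(from, to, t, duration) {
  const p = Math.min(Math.max(t / duration, 0), 1);
  return from + (to - from) * p;
}

// In a browser, each instrument stem would get its own GainNode:
//   const gain = ctx.createGain();
//   stemSource.connect(gain).connect(ctx.destination);
//   // fade the strings out over 4 s, leaving the cello layer:
//   gain.gain.setValueAtTime(1, ctx.currentTime);
//   gain.gain.linearRampToValueAtTime(0, ctx.currentTime + 4);

console.log(fadeGain(1, 0, 2, 4)); // → 0.5 (halfway through the fade-out)
```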
> 
> In short, it would be important for the interactive musical possibilities to be highlighted in the use case as well.
> 
> 2.5 Music Creation Environment with Sampled Instruments
> This scenario seems to describe not only notation software (which I assume was the intention), but also the piano-roll functionality I'd expect to find in the DAW use case (2.3). Perhaps they could be merged, or 2.3 could reference the functionality of 2.5? Even though notation software doesn't need audio clips and audio processing, the converse isn't true: I can't imagine a professional DAW without a capable note editor.
> 
> Another suggestion would be to include the tracker interface in this part of the description: "including conventional Western notation, a tracker, and a piano-roll-style display". Trackers are still widely used, and some scholars say they're more efficient than piano rolls. 
> 
> ----
> 
> That's it for now. I know most of my comments come from a MIDI-centric approach, and the document is titled "Audio Processing use cases". However, I think the overlap is too big to be missed, especially in the 2.5 use case.
> 
> Sorry for the long message, and I hope it helps.
> 
> 
> Best,
> 
> Werther Azevedo
> ---------------------------------------------------------------
> Creative Director > Nano Studio
> Composer > Escape Into
> Audio teacher > PUC-RIO
> @midipixel
> 
> From: Joseph Berkovitz <joe@noteflight.com>
> To: Audio Working Group <public-audio@w3.org> 
> Sent: Tuesday, 4 September 2012 12:31
> Subject: Use Cases and Requirements review
> 
> Dear Working Group Members,
> 
> Olivier Thereaux and I have recently been engaged in redrafting the Use Cases and Requirements document.
> 
> The document had its origins in the January 2012 face-to-face meeting. At the time, its chief purpose was to capture and rank a set of rough use cases and feature areas to help guide and compare the candidate specs.  At this time, though, it feels to us that the group's work is better served by a different document, one which complements the current specification.
> 
> Our main goals in reworking the document have been as follows:
> 
> - Paint a broad picture of the audio-enabled web, illustrating its value through plain-language scenarios readable by a non-technical audience
> - Frame some of the valuable comments and observations made by WG and list members in terms of user stories
> - Build a portfolio of scenarios which, taken together, generate a comprehensive set of requirements and goals
> - Develop these goals with reference to user needs, rather than particular APIs and architectures
> - Connect requirements to real features in the Web Audio API, where they exist
> - Highlight gaps in the API where requirements rely on features that are not supported
> 
> At this time we feel that the document is ready for a broader review and welcome your input. If we are able to process feedback in the next several weeks, we could publish this document as a Working Draft. That seems desirable for keeping our momentum going.
> 
> Please find the latest version of the document here:
> 
>    https://dvcs.w3.org/hg/audio/raw-file/tip/reqs/Overview.html
> 
> As a side note, the set of scenarios is not intended to be universal and all-inclusive. Rather, our goal has been to collect a manageable number of scenarios that overlap to cover as wide an area as possible: many application domains and many API requirements. In so doing, we have often chosen to combine a number of smaller use cases into a single larger one. We also sometimes eliminated use cases that, despite their interest, did not add a novel element to the overall picture.
> 
> Thanks to everyone whose input has made this document possible.
> 
> Best regards,
> 
> ... .  .    .       Joe
> 
> Joe Berkovitz
> President
> 
> Noteflight LLC
> Boston, Mass.
> phone: +1 978 314 6271
> www.noteflight.com
> 
> 
> 

Received on Thursday, 6 September 2012 22:00:24 UTC