
Re: UC 7 - Audio / Music Visualization

From: Steve Sims <fontfx7@gmail.com>
Date: Tue, 17 Jan 2012 09:50:46 +0000
Cc: Olivier Thereaux <olivier.thereaux@bbc.co.uk>, public-audio@w3.org
Message-Id: <A54B961F-18E3-4C79-A582-F8F796009F83@gmail.com>
To: Alistair MacDonald <al@signedon.com>
This area is my main interest in the Audio API. My company (VertPixels Ltd.) produces iTunes LPs, many of which feature visualizers, and these quite frequently include an audio-responsive element.  The iTunes LP environment requires code to be written in HTML, CSS and JavaScript only - it's basically a sandboxed browser environment - and some very basic audio data is made available through the iTunes object.  (The LPs included with Deadmau5's 4x4=12 and the deluxe versions of Pink Floyd's Dark Side of The Moon and Wish You Were Here all contain examples of our visualizers.)  It's been a big "todo" item of ours to port some of our visualizers to modern browsers - hopefully we'll get the time to do that soon.

I've got just two comments on this use-case description.

Firstly, I wouldn't expect a webpage to provide the user with the ability to control buffer size, since that would make for a rather poor user experience.  If the buffer size needs adjusting to cope with slower machines, then ideally (IMHO) the underlying Audio API should do that completely transparently.  Failing that, the Audio API should provide mechanisms that allow the developer to adjust the buffer size in response to potential playback problems.
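To illustrate the second option, here's a minimal sketch of what developer-side adjustment could look like, assuming the API exposes a choice of power-of-two buffer sizes (as createScriptProcessor does in current Web Audio drafts).  The function name and the glitch-detection trigger are purely illustrative, not part of any spec:

```javascript
// Buffer sizes assumed to be the usual power-of-two options.
const BUFFER_SIZES = [256, 512, 1024, 2048, 4096, 8192, 16384];

// Hypothetical helper: on detecting clicks/pops, the developer steps up
// to the next larger buffer size, capping at the largest supported one.
function nextBufferSize(current) {
  const i = BUFFER_SIZES.indexOf(current);
  if (i === -1 || i === BUFFER_SIZES.length - 1) {
    // Unknown or already-largest size: fall back to the safest option.
    return BUFFER_SIZES[BUFFER_SIZES.length - 1];
  }
  return BUFFER_SIZES[i + 1];
}
```

The point being that this kind of escalation logic is trivial for the page author to write, so the API only needs to expose the knob, not a user-facing control.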

Secondly, and not to denigrate the fantastic work of Mr. Doob, Ro.me involved many more people than him - he was "just" the technical director on the project, and the complete credits list includes over 100 different people.  It's arguably not even correct to describe it as a Google project, since several other companies participated, although Google did lead the development.

Kind regards,

Steve


On 16 Jan 2012, at 18:06, Alistair MacDonald wrote:

> Great point. Updated to include some examples:
> 
> === UC 7: Audio / Music Visualization ===
> 
> A user is playing back audio or video media from the webpage of their favorite artist or a popular online music streaming service. The visualization responds to the audio in real-time and can be enjoyed by the user(s) in a leisurely setting such as: at home, a bar/restaurant/lobby, or traveling with an HTML5 capable mobile device. The visualization layers can be written using complementary web technologies such as the WebGL Canvas, where 3D objects are synchronized with the audio and mixed with Video and other web content using JavaScript.
> 
> The webpage can present graphic visualization layers such as:
> 
> * Wave-form view of the audio data - such as on SoundCloud: http://soundcloud.com/skrillex
> * Spectrum analysis or level-meter view - like in iTunes: http://apptree.net/ledsa.htm
> * Abstract music visualizer - example, R4 for Winamp: http://www.youtube.com/watch?v=en3g-BiTZT0
> * An HTML5 Music Video - such as Mr Doob's Ro.me: http://www.ro.me/
> 
> The user can control elements of the visualization using an interface provided by the webpage developer. The user can change the colors, shapes and tweak other visualization settings to their taste. The user may switch to a new visualization mode: changing from a spectrum-analysis view, to an abstract 2D or 3D visual view, a video overlay, or a mash-up of web-content that could include all of the above.
> 
> The webpage provides the user with the ability to control the buffer size of the underlying Audio API: this allows users with slower machines to pick a larger buffer setting that does not cause clicks and pops in the audio stream.
> 
> 
> 
> 
> On Mon, Jan 16, 2012 at 12:20 PM, Olivier Thereaux <olivier.thereaux@bbc.co.uk> wrote:
> Looks good!
> 
> Would it make sense to talk about the kind of visualisation you can see on services like soundcloud, where the whole audio stream/file is visualised in a single 2D graph: e.g. http://soundcloud.com/snowpatrol ?
> 
> Link to UC7 for those who want to see it in context:
> http://www.w3.org/2011/audio/wiki/Use_Cases_and_Requirements#UC_7:_Audio_.2F_Music_Visualization
> 
> Olivier
> 
> 
> 
> 
> On 16/01/2012 16:56, Alistair MacDonald wrote:
> 
>      I took a quick pass at use-case 7 and wondered if anyone had
>      thoughts/comments?
> 
> 
> Thanks,
> 
> -- Al
> 
> 
>      UC 7: Audio / Music Visualization
> 
> A user is playing back audio or video media from the webpage of their
> favorite artist or a popular online music streaming service. The webpage
> presents a graphic visualization layer that responds to the music in
> real-time that the user may enjoy in a leisurely setting such as: at
> home, a bar/restaurant/lobby, or traveling with an HTML5 capable mobile
> device. The visualization layer is written using complementary web
> technologies such as the WebGL Canvas, where 3D objects are synchronized
> with the audio and mixed with Video and other web content using JavaScript.
> 
> The user can control elements of the visualization using an interface
> provided by the webpage developer. The user can change the colors,
> shapes and tweak other visualization settings to their taste. The user
> may switch to a new visualization mode: changing from a
> spectrum-analysis view, to an abstract 2D or 3D visual view, a video
> overlay, or a mash-up of web-content that could include all of the above.
> 
> The webpage provides the user with the ability to control the buffer
> size of the underlying Audio API: this allows users with slower machines
> to pick a larger buffer setting that does not cause clicks and pops in
> the audio stream.
> 
> 
> 
> 
> -- 
> Alistair MacDonald
> SignedOn, Inc - W3C Audio WG
> Boston, MA, (707) 701-3730
> al@signedon.com - http://signedon.com
> 
Received on Wednesday, 18 January 2012 15:49:43 GMT
