- From: Maciej Stachowiak <mjs@apple.com>
- Date: Thu, 22 Mar 2007 01:51:46 -0700
On Mar 22, 2007, at 1:20 AM, Martin Atkins wrote:

> Maciej Stachowiak wrote:
>> I think <audio> can use almost the exact same APIs for most things
>> as <video>. This has the nice side benefit that new Audio() can
>> just make an <audio> element and provide all the relevant useful API.
>
> To me, the distinction between the <audio> element and the Audio
> object is that the former has a "place" in the document where that
> audio content logically belongs, while the latter is more of a
> global trigger for web application sound effects.
>
> <audio> could, for example, be rendered in-line with surrounding
> text in an aural browser. A visual browser would presumably provide
> some kind of representation in the document of the audio which the
> user can interact with.
>
> In other words, <audio> should be like <img> for sound.

I generally agree, but note that new Image() makes an <img> element, so new Audio() could work analogously.

I think <audio> is useful for foreground/semantic audio, as opposed to purely presentational sound effects, because non-browser tools analyzing a document would have a harder time finding audio referenced only from script. (Imagine a "most-linked MP3s on the web" feature in a search engine.)

> Of course, what the visual representation of <audio> should be is
> not an easy decision. It's even harder than <video>, because
> there's no inherent visual content to overlay a UI on top of.

I think it would be no visual representation by default with no controller, and just controls otherwise.

Regards,
Maciej
Received on Thursday, 22 March 2007 01:51:46 UTC