
Realistic web app expectations for mobile Audio?

From: Christopher Cook <christopher.cook@webprofusion.com>
Date: Fri, 12 Oct 2012 21:26:53 +0100
Message-ID: <50787D0D.9030508@webprofusion.com>
To: public-coremob@w3.org
Hi Folks,

Regarding the audio part of the mobile web experience, I see we're aiming 
for low-latency multi-channel playback - I'm assuming that's for audio 
which has been fully loaded and is ready to play. The 10 ms latency 
target sounds great, but it may be quite tricky to test for.

Are there any ambitions for recording support to be part of the core 
features? My specific example is a guitar tuner feature within a guitar 
related app, but there are plenty of other scenarios. In the best case I 
would like musicians to be able to collaborate with each other via their 
phones 'live' (network permitting). I imagine it's only realistic to 
hope for two mono channels or one stereo input at best.
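For illustration, the tuner case doesn't need much from the platform: once 
you can get mic samples into script (e.g. via a getUserMedia-fed audio 
graph), the pitch detection itself is plain JS. A rough sketch (function 
names and thresholds are mine, not from any spec) using naive 
autocorrelation on a mono sample buffer:

```javascript
// Sketch of guitar-tuner pitch detection by naive autocorrelation.
// In a browser the Float32Array would come from the microphone (e.g.
// getUserMedia feeding an audio processing callback); here it is just
// an array of mono samples. All names here are hypothetical.
function detectPitch(samples, sampleRate) {
  const minFreq = 60;    // below low E (~82 Hz) with some margin
  const maxFreq = 1200;  // comfortably above the high frets
  const minLag = Math.floor(sampleRate / maxFreq);
  const maxLag = Math.floor(sampleRate / minFreq);
  let bestLag = -1;
  let bestCorr = 0;
  // find the lag at which the signal best correlates with itself
  for (let lag = minLag; lag <= maxLag; lag++) {
    let corr = 0;
    for (let i = 0; i + lag < samples.length; i++) {
      corr += samples[i] * samples[i + lag];
    }
    if (corr > bestCorr) {
      bestCorr = corr;
      bestLag = lag;
    }
  }
  return bestLag > 0 ? sampleRate / bestLag : 0;
}
```

A real tuner would want a smarter estimator (e.g. peak interpolation), but 
even this crude version shows the core feature request is just "give me 
the mic samples".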

I realise the Web Audio API is broadly referenced here, but regarding 
minimum supported features, latency in both playback and especially 
recording is likely to be unpredictable due to OS and hardware 
variation. I'm not sure whether the API allows these latencies to be 
detected and compensated for. I'm aware that current desktop support 
for the Web Audio API has yet to reach maturity, and on mobile I'm 
lucky if anything works at all. Perhaps the API does not yet have the 
test suite required to determine correct functionality on any platform?
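Even without the API reporting its own latency, an app could measure it 
itself - play a click, record it, and time the gap - and then compensate. 
A sketch of the compensation step (function names hypothetical; the 
measurement itself is assumed done elsewhere):

```javascript
// Sketch of latency compensation for an overdub/recording scenario:
// given a measured round-trip latency (play a click through the
// speakers, record it via the mic, measure the delay), compute how
// many samples to trim from the start of the recorded take so it
// lines up with the backing track. Names are hypothetical.
function samplesToTrim(roundTripSeconds, sampleRate) {
  return Math.round(roundTripSeconds * sampleRate);
}

function alignTake(recorded, roundTripSeconds, sampleRate) {
  // drop the leading latency-delayed samples from the Float32Array
  return recorded.subarray(samplesToTrim(roundTripSeconds, sampleRate));
}
```

If the platform could report playback and capture latency directly, the 
click-measurement step could be skipped entirely - which is really what 
I'm asking whether the API allows for.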

Regarding the minimal audio latency figures, I found these guidelines 
for Adobe Audition quite succinct:
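As a rough cross-check on figures like those, the arithmetic is simple: a 
processing block of N samples at sample rate R adds N / R seconds of 
latency, and record-while-playing pays it on both the input and output 
side. A small sketch (helper names are mine):

```javascript
// Hypothetical helpers for the buffer-size arithmetic behind audio
// latency figures: one block of bufferSize samples at sampleRate adds
// bufferSize / sampleRate seconds of delay.
function bufferLatencyMs(bufferSize, sampleRate) {
  return (bufferSize / sampleRate) * 1000;
}

function roundTripMs(inputBufferSize, outputBufferSize, sampleRate) {
  // one buffer of capture delay plus one buffer of playback delay
  return bufferLatencyMs(inputBufferSize, sampleRate) +
         bufferLatencyMs(outputBufferSize, sampleRate);
}
```

By this arithmetic, hitting a 10 ms target at 44.1 kHz means buffers of 
441 samples or fewer end to end - which is why it sounds tricky on mobile 
hardware.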


Best regards,

Christopher Cook
Received on Monday, 15 October 2012 07:56:30 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 20:05:48 UTC