- From: Christopher Cook <christopher.cook@webprofusion.com>
- Date: Fri, 12 Oct 2012 21:26:53 +0100
- To: public-coremob@w3.org
Hi Folks,

Regarding the Audio part of the mobile web experience, I see we're trying to get low-latency multi-channel playback - I'm assuming that's for audio which has been fully loaded and is ready to play. The 10 ms latency target sounds great, but it may be quite tricky to test for.

Are there any ambitions for recording support to be part of the core features? My specific example is a guitar tuner feature within a guitar-related app, but there are plenty of other scenarios. In the best case I would like musicians to be able to collaborate with each other via their phones 'live' (network permitting). I imagine it's only realistic to hope for two mono channels, or one stereo input at best.

I realise the Web Audio API is broadly referenced here, but regarding minimum supported features, latency in both playback and especially recording is likely to be unpredictable due to OS and hardware variation. I'm not aware whether the API allows these latencies to be detected and compensated for.

I am also aware that current desktop support for the Web Audio API has yet to reach maturity, and on mobile I'm lucky if anything works at all. Perhaps this API does not yet have the test suite required to determine correct functionality on any platform?

Regarding the minimal audio latency figures, I found these guidelines for Adobe Audition quite succinct: http://helpx.adobe.com/audition/kb/troubleshoot-recording-playback-monitoring-audition.html#main_General_guidelines_that_apply_to_latency_times

Best regards,
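On the point about compensating for recording latency: even without API support for querying it, a latency figure measured externally (e.g. via a loopback test) could be applied after the fact. The sketch below is purely illustrative - the function name and the loopback-measurement assumption are my own, not part of any spec:

```typescript
// Sketch only: assumes recording latency has been measured separately,
// e.g. by playing a click through the speaker and timing its arrival at
// the microphone. Given that figure, a recorded buffer can be realigned
// by discarding the leading samples that predate the true input.
function compensateRecordingLatency(
  recorded: Float32Array,
  sampleRate: number,
  measuredLatencySeconds: number
): Float32Array {
  // Convert the measured latency to a whole number of samples,
  // clamped to the buffer length.
  const offsetSamples = Math.min(
    Math.round(measuredLatencySeconds * sampleRate),
    recorded.length
  );
  // Drop the stale leading samples so the buffer starts at the real onset.
  return recorded.subarray(offsetSamples);
}

// Example: 20 ms of measured latency at 44.1 kHz trims 882 samples.
const buffer = new Float32Array(44100); // one second of mono audio
const aligned = compensateRecordingLatency(buffer, 44100, 0.02);
console.log(aligned.length); // 44100 - 882 = 43218
```

Obviously this only helps with alignment after recording; it does nothing for the live-collaboration case, where the latency itself is the problem.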
Received on Monday, 15 October 2012 07:56:30 UTC