
Re: [sensors] Javascript 120Hz devicemotion events for high end inertial applications

From: Paul Adenot via GitHub <sysbot+gh@w3.org>
Date: Thu, 16 Feb 2017 10:54:54 +0000
To: public-device-apis-log@w3.org
Message-ID: <issue_comment.created-280299244-1487242493-sysbot+gh@w3.org>
> latency (from time the data point is collected to time you get to 
use it),

This is _very_ dependent on your application. It is generally 
considered that a normal human (i.e. not a professional drummer, for 
example) perceives sound as happening "immediately" if the delay 
between the action that makes your sensor react (be it a gyroscope, a 
fader, a knob, a keyboard press, etc.) and the resulting sound is less 
than 20ms.

Now, you should experiment. Depending on the interaction, the type of 
sound, the parameter that is being modulated, etc., it is possible 
that you'll find that 100ms is acceptable. Keep in mind that the 
output latency of a normal computer using the Web Audio API is between
 10ms and 40ms. It's possible to bring this down with special hardware
 and browser builds.
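To make the numbers concrete, here is a minimal sketch of a latency-budget check. The 20ms budget and the 5ms sensor latency are assumptions taken from the figures above; `AudioContext.baseLatency` and `AudioContext.outputLatency` are spec properties, but `outputLatency` support varies across browsers, hence the fallback:

```javascript
// Rough latency-budget check: does sensor latency plus audio output
// latency stay at or under a perceptual threshold (20 ms here)?
function withinLatencyBudget(sensorLatencyMs, audioOutputLatencyMs, budgetMs = 20) {
  return sensorLatencyMs + audioOutputLatencyMs <= budgetMs;
}

// In a browser, the audio side can be read off the AudioContext
// (outputLatency support varies; fall back to 0 where it is missing):
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const audioMs = (ctx.baseLatency + (ctx.outputLatency || 0)) * 1000;
  console.log(withinLatencyBudget(5, audioMs));
}
```

With a 10ms audio path, a 5ms sensor path fits the budget; with a 40ms audio path, it does not — which is exactly why you should experiment per interaction.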

> frequency (how many samples per seconds you need), and

For controlling parameters, it depends on the use case. Any rate is 
manageable with the Web Audio API, since you're simply making JS 
calls. You can schedule a bunch of changes in advance, or react just 
in time, etc.
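As a sketch of the two styles, here is sensor-driven parameter control with a `GainNode` — the sensor-to-parameter mapping (a clamped ±180 deg/s range) and the 20ms smoothing constant are assumptions for illustration, not anything from the spec:

```javascript
// Map a sensor reading (e.g. a gyroscope rotation rate in deg/s) onto a
// clamped [0, 1] parameter value. The ±180 range is an assumed scale.
function sensorToParam(value, min = -180, max = 180) {
  return Math.min(1, Math.max(0, (value - min) / (max - min)));
}

if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const gain = ctx.createGain();
  gain.connect(ctx.destination);

  // Just in time: smooth toward the latest reading as each event arrives.
  window.addEventListener("devicemotion", (e) => {
    const v = sensorToParam(e.rotationRate.alpha || 0);
    gain.gain.setTargetAtTime(v, ctx.currentTime, 0.02);
  });

  // Scheduling in advance: set values at explicit future times instead,
  // e.g. gain.gain.setValueAtTime(v, ctx.currentTime + 0.1);
}
```

`setTargetAtTime` also avoids the clicks you'd get from stepping the parameter directly at 120Hz.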

> whether it's OK to batch the samples together (and if so, how often 
you need these batches sent).

This is OK, iff (if and only if) you can take the latency hit caused 
by the packetizing of the data.
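The size of that hit is simple arithmetic — the first sample in a batch waits until the whole batch is delivered. A small sketch (the 120Hz rate is taken from the thread's subject; the batch size of 6 is an arbitrary example):

```javascript
// Worst-case added latency from batching: a batch of n samples at a
// rate of r Hz holds its oldest sample for n / r seconds before
// delivery, on top of any per-sample sensor latency.
function batchLatencyMs(batchSize, sampleRateHz) {
  return (batchSize / sampleRateHz) * 1000;
}

console.log(batchLatencyMs(6, 120)); // 6 samples at 120 Hz → 50 ms
```

So batching six 120Hz samples already spends 50ms — more than the 20ms "immediate" budget discussed above — before the audio output latency is even counted.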

-- 
GitHub Notification of comment by padenot
Please view or discuss this issue at 
https://github.com/w3c/sensors/issues/98#issuecomment-280299244 using 
your GitHub account
Received on Thursday, 16 February 2017 10:55:00 UTC
