- From: Tobie Langel via GitHub <sysbot+gh@w3.org>
- Date: Fri, 03 Mar 2017 20:54:37 +0000
- To: public-device-apis-log@w3.org
> I agree that it's difficult to decide on "optimal" frequencies. First you have security/privacy and leak risks. Then you have usability. Even certain data at 0.1 Hz can still provide sensitive information. Of course, the higher the frequency, the more sophisticated the models that can be devised. I don't think it's possible to find a "safe zone". Plus, much like with crypto, I imagine such a safe zone would continuously shift.

> It all depends on the risk/attack type and what type of "information" is to be inferred/extracted. So some attacks will work better at 120, 60, 20 Hz, etc. In some cases, even if you have only a few readout values available, you can "connect the dots" and infer useful information. Obviously, the more dots, the more reliable the attacks. Each sensor has its own issues, and it's unlikely we can find a catch-all solution.

Fair point.

> I think we've already mentioned the possibility of asking for permission for high frequencies in another thread. Not sure how technically feasible it would be to ask a user for permission to use a particular frequency?

It's technically bordering on the trivial. From a usability point of view, however, it's disastrous. How would you expect a user, who has probably never heard of a gyroscope and might not know what hertz are, to decide at what frequency a sensor should be polled?

--
GitHub Notification of comment by tobie
Please view or discuss this issue at https://github.com/w3c/sensors/issues/98#issuecomment-284067479 using your GitHub account
Received on Friday, 3 March 2017 20:54:44 UTC
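To make the "technically trivial" part concrete, here is a minimal sketch of what a frequency-gated gyroscope request could look like in script, using the Generic Sensor API's `frequency` constructor option and the Permissions API's "gyroscope" permission name. The `readGyroscope` helper, the 60 Hz cut-off, and the gating logic are purely illustrative assumptions: no such threshold is defined anywhere in the spec, and how (or whether) a user agent would actually prompt is left open.

```ts
// Minimal ambient declarations matching the shape of the Generic Sensor /
// Gyroscope interfaces (supporting browsers expose these natively).
declare class Sensor extends EventTarget {
  start(): void;
  stop(): void;
}
declare class Gyroscope extends Sensor {
  constructor(options?: { frequency?: number });
  readonly x: number | null;
  readonly y: number | null;
  readonly z: number | null;
}

// Hypothetical cut-off above which a UA (or page) might treat access as
// "high frequency" -- the value is illustrative, not taken from the spec.
const HIGH_FREQUENCY_THRESHOLD_HZ = 60;

async function readGyroscope(requestedFrequencyHz: number): Promise<void> {
  // Current permission state for the gyroscope, where the Permissions API
  // recognizes the "gyroscope" name. A frequency-aware prompt would hang
  // off a check like this one.
  const status = await navigator.permissions.query({
    name: "gyroscope" as PermissionName,
  });

  if (requestedFrequencyHz > HIGH_FREQUENCY_THRESHOLD_HZ && status.state !== "granted") {
    // This is the prompt the comment argues users cannot meaningfully answer:
    // "allow this site to read the gyroscope at 120 Hz?"
    console.warn("High-frequency access would need an explicit prompt here.");
    return;
  }

  // The requested polling rate is just a constructor option; the UA remains
  // free to clamp it, which is where any frequency-based mitigation would live.
  const sensor = new Gyroscope({ frequency: requestedFrequencyHz });
  sensor.addEventListener("reading", () => {
    console.log(`x=${sensor.x} y=${sensor.y} z=${sensor.z}`);
  });
  sensor.addEventListener("error", (event) => console.error("Sensor error:", event));
  sensor.start();
}

// Example: a 120 Hz request would hit the hypothetical gate above.
void readGyroscope(120);
```

The point of the sketch is only that the plumbing is easy; the hard part is the prompt itself, since the number the user would be asked to approve carries no meaning for them.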