Re: sensor calibration fingerprinting

Hi Pete,

My short answer is: PRs welcome.

Please make sure there's a matching GH issue that clearly describes the problem the PR fixes. I think some of the relevant context might currently live in the public-privacy mailing list archives, such as in this thread.

Thanks,

-Anssi (DAS WG co-chair)

> On 12. Feb 2020, at 21.59, Pete Snyder <psnyder@brave.com> wrote:
> 
> Wonderful and terrific! Double thanks again to Paul and Nick.
> 
> Anssi, it sounds like there is now a concise, robust, easy-to-implement solution to the privacy harm.  What's the next step to get this added as part of the algorithms in the spec?  I (and I'm sure others on this thread) would be happy to work with the WG to put a PR together, if that's helpful.
> 
> Pete
> 
>> On Feb 12, 2020, at 11:55 AM, Paul Jensen <pauljensen@google.com> wrote:
>> 
>> It's great that you came up with the same rounding thresholds!
>> 
>> Applying a similar mitigation to accelerometer data seems prudent.  I propose rounding accelerometer values to the nearest 0.1 m/s^2.
>> Similar to the orientation and rotation mitigation, I chose 0.1 m/s^2 because:
>> 1. I think this is sufficient accuracy for typical accelerometer use cases.  Use cases I considered:
>>  a. For applications involving hand movement, my testing indicated that accelerometer variation at this level of precision was noise when holding a phone in hand.
>>  b. For pedometer applications, 0.1 m/s^2 accuracy should be sufficient, as the minimum signal needed for walking step detection is an order of magnitude larger. [0]
>>  c. For things like automobile 0-60mph measurement, 0.1 m/s^2 accuracy should be sufficient, as even the slowest car's acceleration is nearly an order of magnitude larger. [1]
>> 2. I think 0.1 m/s^2 should be larger than most sensor quantization step sizes, and near enough to the quantization step sizes of low-resolution sensors that the calibration contributions from the other axes' sensors are undetectable. [2]
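
(As an illustration only, the rounding being proposed here could be as simple as the following TypeScript sketch; the constant and function names are invented and are not from any spec or implementation.)

    // Round a raw accelerometer reading (in m/s^2) to the nearest
    // 0.1 m/s^2, per the proposal above. Names are illustrative only.
    const ACCEL_STEP = 0.1; // m/s^2

    function roundAcceleration(valueMs2: number): number {
      return Math.round(valueMs2 / ACCEL_STEP) * ACCEL_STEP;
    }

    // e.g. roundAcceleration(9.8123) -> 9.8 (up to the floating-point
    // representation of multiples of 0.1)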
>> 
>> [0]
>> https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4030415/
>> "walking (including stair climbing and fidgeting of the feet while standing) when the SMA was between 0.135 g and 0.8 g"
>> 0.135g = 1.32 m/s^2
>> [1]
>> 60mph = 26.8 m/s.
>> Slowest car (https://www.zeroto60times.com/slowest-cars-0-60-mph-times/) takes 30s, which is 0.89 m/s^2.
>> [2]
>> Common discrete accelerometers:
>> https://www.st.com/en/mems-and-sensors/accelerometers.html
>> https://www.analog.com/en/parametricsearch/11175
>> Someone looked at accelerometer ranges on some common phones:
>> https://stackoverflow.com/questions/12739143/android-accelerometer-max-values/19981219#19981219
>> These accelerometer ranges seem reasonable compared to those of "common discrete accelerometers" linked above.
>> So the biggest range, -4g to +4g, spans 8g, which is 78.5 m/s^2.
>> Common accelerometers have resolutions from 8 bits to 16 bits.
>> So a +/-4g 10-bit sensor would have a quantization step size of 0.077 m/s^2.
>> A +/-4g 8-bit sensor would have a quantization step size of 0.307 m/s^2.
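
(For reference, those step sizes are just the full-scale range divided by the number of quantization codes; a small illustrative TypeScript calculation, using the +/-4g figures above.)

    // Quantization step = full-scale range / number of codes.
    const G = 9.80665; // m/s^2 per g

    function quantizationStep(fullScaleRangeG: number, bits: number): number {
      return (fullScaleRangeG * G) / Math.pow(2, bits);
    }

    // quantizationStep(8, 10) ~= 0.077 m/s^2  (+/-4g, 10-bit)
    // quantizationStep(8, 8)  ~= 0.307 m/s^2  (+/-4g, 8-bit)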
>> 
>> P.S. I'll miss next week's call as I'm on vacation.
>> 
>> On Thu, Jan 30, 2020 at 6:47 PM Nick Doty <npdoty@ischool.berkeley.edu> wrote:
>> It looks like we separately came to the same conclusion on a recommended mitigation, down to suggesting the same threshold number of 0.1 degrees and 0.1 degrees per second. That is most encouraging!
>> 
>> (Paul, I pasted in the current text of your github proposal below, just for the ease of those reading through these email archives.)
>> 
>> It sounds like the Chromium proposal also included some evaluation showing that these numbers are in fact typical, and that the mitigation would work for the typical sensor, not just the sensors evaluated in the paper. If that’s right, 1) thank you! and 2) that should give us more confidence in the general success of this mitigation, and that a browser implementation can be effective even when deployed on a variety of hardware.
>> 
>> Finally, in double-checking the paper and the sensors covered, it also sounds like there is a threat of calibration fingerprinting on *accelerometer* data (although that particular piece of research didn’t evaluate the exact precision of that fingerprinting for Pixel 2 and Pixel 3 phones, just that it appeared to be possible with those devices), which is separate from orientation and rotation rate, but is included in the DeviceMotion event. The Accelerometer API (which extends the Generic Sensor API) also provides access to that data; its security & privacy section notes the possibility of decreasing frequency for mitigating some privacy threats, but I’m not sure that would be effective for calibration fingerprinting. 
>> 
>> 1. should we apply the same class of mitigation to reported accelerometer data?
>> 2. can we determine the typical sensor gain in order to come up with a threshold?
>> 
>> Thanks,
>> Nick
>> 
>>> On Jan 15, 2020, at 1:24 PM, Paul Jensen <pauljensen@google.com> wrote:
>>> 
>>> Here's our proposal for what to do in Chromium: https://github.com/JensenPaul/sensor-fingerprint-mitigation
>>> 
>>> Device Orientation and Rotation Sensor Calibration Fingerprint Mitigation
>>> 
>>> Background
>>> 
>>> Most mobile phones contain device orientation sensors that attempt to measure the phone’s orientation and rotation around the three spatial axes.  These sensors undergo a factory calibration to ensure their accuracy.  At present, on many devices, each individual device’s chosen calibration values can be inferred from a sequence of sensor readings.  These calibration values are fairly identifying.  This paper found the device orientation sensor calibration values for the iPhone 6S to offer roughly 42 bits of entropy, more than enough to uniquely identify every device.  This represents a privacy concern for mobile phone users browsing the web, as the device orientation sensor readings are available to all web sites without a permission prompt, and the resulting identifier cannot be cleared, reset, or modified, so the user has no control over which sites track them or for how long.
>>> 
>>> Proposed Mitigation
>>> 
>>> Fingerprinting the sensor calibration relies on being able to infer values in the gain matrix that were set during factory calibration.  This requires analyzing the sensor readings with a level of precision significantly beyond the sensor quantization step size, so that the gain matrix’s contributions to the sensor readings can be inferred.  The DeviceOrientation Event Specification returns the device orientation on each axis as a 64-bit double floating point value representing degrees.  To make this attack impractical with readings that can be collected in a reasonable amount of time, we’re proposing rounding the sensor readings to the nearest tenth of a degree.
>>> 
>>> A tenth of a degree was chosen for two reasons.  First, it’s generally larger than the typical device orientation sensor quantization step size, so it should mitigate inference of the gain matrix.  Second, it’s small enough that Web sites using the device orientation API should not be impacted by the rounding: experimentation indicated that for human hand and head motion, a tenth of a degree of precision is mostly noise, and measuring head and hand motion should cover the known use cases these sensors support, including WebVR and game input.
>>> 
>>> Use of a fixed value (i.e. 0.1 degrees) has the benefit that it’s not sensor dependent, so it can be applied universally; this paper recommended rounding to the nearest multiple of the sensor’s nominal gain, which requires knowledge of a device’s specific sensor.  The mitigation proposed here can be applied in the Web browser app, which can be updated more easily and more quickly than the mobile operating system.  This isn’t to say this fingerprinting method shouldn’t also be mitigated at the operating system level.
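
(For illustration only, the rounding described above amounts to something like the following TypeScript sketch; the helper name is invented and this is not spec text.)

    // Round a reported value to the nearest multiple of a fixed step,
    // e.g. a DeviceOrientation angle in degrees with step = 0.1.
    function roundToStep(value: number, step: number): number {
      return Math.round(value / step) * step;
    }

    // roundToStep(alphaInDegrees, 0.1) -> alpha to the nearest 0.1 degree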
>>> 
>>> For APIs that return device orientation in other units (e.g. OrientationSensor and Gyroscope), we’re proposing rounding them to equivalent amounts (e.g. to the nearest multiple of 0.00174533 radians, which is roughly equivalent to a tenth of a degree).
>>> 
>>> Similar to the device orientation outputs, we’re proposing rounding the device rotation rate outputs to the nearest tenth of a degree per second.  Similar experimentation indicated that this level of precision is mostly noise for human hand and head motion, so it shouldn’t degrade Web site experiences, and it is generally larger than the typical sensor quantization step size, so it should mitigate inference of the gain matrix.
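
(The same illustrative roundToStep helper sketched above covers the other units mentioned here; the constants below are only a worked conversion of the 0.1-degree value stated in the proposal.)

    // Equivalent rounding steps in other units (illustrative):
    const DEGREE_STEP = 0.1;                          // degrees, and deg/s for rotation rate
    const RADIAN_STEP = DEGREE_STEP * Math.PI / 180;  // ~= 0.00174533 radians

    // Applied with the same rounding, e.g. roundToStep(gyroReadingRad, RADIAN_STEP)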
>>> 
>>> On Mon, Jan 13, 2020 at 5:39 PM Nick Doty <npdoty@ischool.berkeley.edu> wrote:
>>> Regarding DeviceOrientation and the research showing persistent fingerprintability based on factory calibration of orientation sensors in mobile devices: I’ll add some notes to the GitHub issue here [0], but since it might be worth longer discussion in PING, especially for how it applies to other sensors, I’m writing out these comments for the mailing list.
>>> 
>>> From my reading of the paper [1], the authors suggest two mitigations that they seem to identify as resolving the issue altogether:
>>> 
>>> 1. adding random noise between -0.5 and 0.5 to each value and then rounding to the resolution of the sensor (16 bits, in the case of the iOS devices in question)
>>> 2. rounding the output to the nearest multiple of the nominal gain of the sensor (61 or 70 millidegrees per second, in the case of the iOS devices in question)
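
(Roughly, and only as an illustrative sketch of one reading of the paper, those two mitigations look like this; "step" is the sensor's quantization step size and "nominalGain" its nominal gain, both per-sensor values an implementation would need to know, and the function names are invented.)

    // 1. Add uniform random noise in [-0.5, 0.5] quantization steps,
    //    then re-quantize to the sensor's resolution.
    //    (A real implementation would want a better RNG than Math.random.)
    function mitigateWithNoise(value: number, step: number): number {
      const noise = (Math.random() - 0.5) * step;
      return Math.round((value + noise) / step) * step;
    }

    // 2. Round to the nearest multiple of the sensor's nominal gain.
    function mitigateWithGainRounding(value: number, nominalGain: number): number {
      return Math.round(value / nominalGain) * nominalGain;
    }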
>>> 
>>> Those mitigations do seem to be hardware specific, so we might not be able to include hard numbers in the spec, but could still provide a normative indication that sensors should be rounded using the provided algorithm, based on their hardware details. If nominal sensor gain and sensor resolution are known for each device (*and* accessible to the browser from the underlying operating system), then those mitigation algorithms could be implemented by all browser implementers.
>>> 
>>> I’m not entirely clear on this yet, but it also seems possible that we could do a survey of the accelerometers and gyroscopes currently in use by most mobile devices and then require no more precision than slightly below the typical or lowest precision sensor in the survey. Would that be a huge detriment to functionality? I’m not sure, but if I’m reading it correctly that the iOS devices measure a range of 4000 degrees per second with a resolution (and nominal gain) of 0.061, then we’d be talking about an extra error of 0.1 / 4000 = 0.0025%. That certainly sounds small, though I’m sure it depends on the particular use cases.
>>> 
>>> We might also chat with Apple folks about the mitigations that they implemented in response to the paper; it sounds like they added noise before rounding and deployed that for native iOS devices. Did that cause functionality problems for iOS apps? Was the implementation different for every hardware version?
>>> 
>>> This is all based on my non-expert reading of the “SensorID" paper, so I’d welcome more ideas from implementers or those more familiar with gyroscope/accelerometer sensors. I believe there will be some discussion of that at this week’s PING call (with regrets, I won’t be able to join, but will read the minutes).
>>> 
>>> Because the authors demonstrated an in-the-wild risk of permanent (across factory reset of the device even) identifiers, there are good reasons for implementers and spec maintainers to try to implement short-term fixes, and some have over the past year in response to previous research (like permission gating). But the paper also suggests — and this is something I would really value input on from others — that other device sensors are likely to have similar risks of persistent cross-origin identification, based on being able to detect factory calibrations from a relatively small sample of readings. If we can understand what causes that risk, how to identify it and how to mitigate it, we can provide more specific guidance (in the fingerprinting guidance doc, or perhaps the threat model) and apply it to all new sensor specs that come up for review.
>>> 
>>> Cheers,
>>> Nick
>>> 
>>> [0] https://github.com/w3c/deviceorientation/issues/85
>>> There’s also some related discussion in https://github.com/w3c/deviceorientation/issues/57 although that thread goes a little off-topic, so it’s not the easiest to follow.
>>> 
>>> [1] https://www.ieee-security.org/TC/SP2019/papers/405.pdf
>>> Zhang, Jiexin, Alastair R. Beresford, and Ian Sheret. “SensorID: Sensor Calibration Fingerprinting for Smartphones.” In 2019 IEEE Symposium on Security and Privacy (SP), 638–55. San Francisco, CA, USA: IEEE, 2019. https://doi.org/10.1109/SP.2019.00072.
>>> 
>>> 
>>> 
>>> 
>>> Previously discussed on public-privacy here:
>>> 
>>>> On Oct 24, 2019, at 4:38 PM, Pete Snyder <psnyder@brave.com> wrote:
>>>> 
>>>> 3. I double-checked with the authors of the paper I linked to, and they say that the functionality they use to fingerprint devices (quote: “the DeviceMotionEvent APIs, particularly DeviceMotionEvent.acceleration and DeviceMotionEvent.rotationRate”) doesn’t require permissions to access.  The standard needs to be updated so that users cannot be passively fingerprinted.
>>>> 
>>>> Also, re the findings: devices with high-quality accelerometers are vulnerable.  The Pixel and iPhone devices they tested were all vulnerable; the other Android devices they tested just had lower-quality sensors.  As the average quality of phone sensors increases, more and more devices will become vulnerable, so it’s still an active harm / concern.
>>>> 
>>>> I’ll open up issues for 1-3 in the repo now.
>>>> 
>>>> Thanks,
>>>> Pete
>>>> 
>>>>> On Oct 21, 2019, at 2:09 PM, Kostiainen, Anssi <anssi.kostiainen@intel.com> wrote:
>>>>> 
>>>>> 
>>>>>> On 21. Oct 2019, at 22.40, Pete Snyder <psnyder@brave.com> wrote:
>>>>>> 
>>>>>> In the meantime, though, I was wondering if your group was familiar with this work [1] on using the sensor APIs for permissionless fingerprinting, whether the standard has been updated to fix / prevent these attacks, and if not, how the standard should be adapted to fix them.  (TL;DR: you can derive ~67-bit identifiers for devices that have accelerometers installed, using the sensor APIs.)
>>>>> 
>>>>> Thanks for the pointer.
>>>>> 
>>>>> This paper, published after the specs reached CR, has not been discussed in the group. However, other similar fingerprinting vectors brought to the group’s attention have been considered:
>>>>> 
>>>>> https://w3c.github.io/sensors/#device-fingerprinting
>>>>> 
>>>>> We could add this paper to the list of references for completeness.
>>>>> 
>>>>> Mitigations are discussed in:
>>>>> 
>>>>> https://w3c.github.io/sensors/#mitigation-strategies
>>>>> 
>>>>> We could consider adding the other proposed mitigation (add uniformly distributed random noise to ADC outputs before calibration is applied) to the mitigations section. Permissions is already covered and the specs define hooks for prompting.
>>>>> 
>>>>> As a general comment, it *seems* this particular attack was fixed in more recent iOS versions and on most or all(?) Android devices tested there was less entropy, so no global uniqueness.
>>>>> 
>>>>> Suggestions (and PRs) from PING welcome.
>>>>> 
>>>>> Thanks,
>>>>> 
>>>>> -Anssi
>>>>> 
>>>>> 
>>>>>> Refs:
>>>>>> 1: https://www.repository.cam.ac.uk/bitstream/handle/1810/294227/405.pdf?sequence=3
>>> 
>> 
> 

Received on Wednesday, 12 February 2020 20:58:05 UTC