- From: James Darpinian <jdarpinian@google.com>
- Date: Wed, 7 Aug 2019 14:30:09 -0700
- To: Kevin Rogovin <kevinrogovin@invisionapp.com>
- Cc: Dean Jackson <dino@apple.com>, Doug Moen <doug@moens.org>, public-gpu <public-gpu@w3.org>
- Message-ID: <CAORar-zHbC3CkfEyevgD2VhM_nOMLg9FF0HSV9k3CHfPz9sj1g@mail.gmail.com>
Kevin makes an interesting point that I hadn't considered. Explicitly
providing GPU info by default can give the browser and user more control.
If we don't provide GPU info, sites will be forced to implement GPU
fingerprinting methods that are infeasible for us to block. Such methods
will become widespread and included in popular libraries. If we do provide
GPU info by default, sites will likely rely on it instead of implementing
unblockable GPU fingerprinting. Then we will have the option to modify the
GPU info we provide in some situations, such as a private browsing mode. In
practice, the privacy of private browsing mode could be higher in the
latter scenario than in the former.

On Wed, Aug 7, 2019 at 1:30 PM Kevin Rogovin <kevinrogovin@invisionapp.com> wrote:

> Hi,
>
> I am sorry, but this is very counter-productive to the success of WebGPU,
> whose goal is to close the performance gap in hardware-accelerated
> graphics between native and the web.
>
> The previous posts prove that it is quite possible and feasible to
> identify (roughly) the GPU, at the cost of slower start-up performance
> and battery life on mobile. GPU-intensive apps will NEED to do this
> (because the performance profile of different GPUs is literally all over
> the map across architectures for various specific workloads), thus it
> will be done. Rather than making the developer's life more difficult and
> the user's experience worse, there is a pretty clear way forward: provide
> the GPU info directly. If hiding this is important to a user, then, just
> as many browsers allow changing the browser ID, a browser could also
> provide an option to NOT provide the GPU info (i.e. a WebGPU
> implementation would report "Generic GPU"). This would give users control
> over whether the hardware is identified; for those who take that option
> and use a highly GPU-intensive app, the downside is that the app will run
> perf-test probing along with other probing on the device. In all honesty,
> compared to all the other shenanigans going on with browser tracking,
> this is by far the smallest potato.
>
> If we were in a world where GPU architectures were quite similar, this
> would not be needed, but that is not the world we live in (thankfully, in
> all honesty, since variety in hardware is a good thing in my eyes).
>
> My 2 cents.
>
> -Kevin
>
> On Wed, Aug 7, 2019 at 11:04 PM Dean Jackson <dino@apple.com> wrote:
>
>> > On 7 Aug 2019, at 14:05, Doug Moen <doug@moens.org> wrote:
>> >
>> > On Tue, Aug 6, 2019, at 10:35 PM, Myles C. Maxfield wrote:
>> >> We’ve heard this defeatist argument before and entirely disagree
>> >> with it.
>> >
>> > My point is that security designs by non-experts have a poor track
>> > record. I think fingerprinting security for WebGPU should be designed
>> > by a web browser fingerprinting security expert
>>
>> We agree. This *is* advice from our Web Browser fingerprinting security
>> experts.
>>
>> It's not a secret that there are low-level techniques to identify
>> hardware, and thus users. Their existence does not mean it is acceptable
>> to provide high-level ways to identify hardware/users.
>>
>> To give an analogy (that will probably cause more distraction than
>> help, but whatever)... just because someone can throw a brick through
>> your window doesn't mean you should leave your front door unlocked.
>> Maybe some day you'll get brick-proof windows.
>>
>> Dean
>>
>> > , and I also think that you are trying to fix the problem at the
>> > wrong level of abstraction. The security should be provided at a
>> > level above the WebGPU API.
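To make the trade-off discussed above concrete, here is a minimal sketch
of what a GPU-intensive site might do. The explicit path uses WebGL's
existing WEBGL_debug_renderer_info extension (the closest current analogue
to WebGPU exposing adapter info directly); the fallback is the kind of
crude timing probe Kevin describes. The "Generic GPU" string, the probe
size, and the timing buckets are illustrative assumptions, not anything
specified in this thread.

```typescript
// Sketch: classify the GPU, preferring explicit info over probing.
function identifyGPU(): string {
  const canvas = document.createElement("canvas");
  const gl = canvas.getContext("webgl");
  if (!gl) {
    return "no-webgl";
  }

  // Explicit path: WebGL's real WEBGL_debug_renderer_info extension.
  // A browser honoring a privacy option could withhold the extension or
  // report a generic string (Kevin's hypothetical "Generic GPU"),
  // pushing the site into the fallback below.
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  if (ext) {
    const renderer = gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);
    if (renderer && renderer !== "Generic GPU") {
      return String(renderer);
    }
  }

  // Fallback: crude performance probing. readPixels blocks until the
  // queued GPU work completes, so the elapsed time reflects the device.
  // The iteration count and bucket threshold are invented placeholders;
  // a real library would run many such probes, at exactly the start-up
  // and battery cost discussed in this thread.
  const pixel = new Uint8Array(4);
  const t0 = performance.now();
  for (let i = 0; i < 200; i++) {
    gl.clearColor(i / 200, 0, 0, 1);
    gl.clear(gl.COLOR_BUFFER_BIT);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
  }
  const elapsed = performance.now() - t0;
  return elapsed < 25 ? "probe-bucket-fast" : "probe-bucket-slow";
}
```

In the scenario James outlines, a private browsing mode could answer the
explicit path with modified or generic info, and a site that relies on
that answer never reaches the probing fallback at all.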
Received on Wednesday, 7 August 2019 21:30:45 UTC