- From: <bugzilla@jessica.w3.org>
- Date: Thu, 31 Jul 2014 11:35:27 +0000
- To: public-webcrypto@w3.org

https://www.w3.org/Bugs/Public/show_bug.cgi?id=25985

--- Comment #41 from Henri Sivonen <hsivonen@hsivonen.fi> ---

(In reply to Ryan Sleevi from comment #40)
> I think it entirely disregards the point that a "web" device ranges from
> your smartwatch to your laptop, and that there are a variety of concerns
> and requirements in between.

This is the sort of argument that was used for Mobile Profiles. Mobile
Profiles weren't the solution for making the Web work on mobile. Porting
full browser engines to mobile devices was.

> > I also think it's a mistake to make browser API specs vaguer in order
> > to make devices with low-quality browsers or browser ports comply. See:
> > Various Mobile Profiles from the feature phone era.
>
> Low-quality is an entirely unnecessary pejorative here, unless you believe
> that "Web" is meant to encompass "Desktop machines with high-power,
> general-purpose CPUs". Which neither the W3C nor many of the UAs consider
> to be the end-all, be-all.

You can get a full browser engine ported to a set-top box. The hardware
required to run a full browser engine fits on an HDMI+USB dongle these days
and doesn't even require a *box*. B2G is available for porting. It looks
like Google launched an Android variant for this space, presumably capable
of running Chromium. There are various consultancies that'll port WebKit to
a set-top box. In that case, it's a matter of how well the layer below
WebKit is done--it's not a matter of CPU power. (I mean the layer of
functionality that B2G and Chromium include but that the cross-platform
WebKit does not include and that needs to be supplied on a per-port basis.)
If a WebKit port for a TV-oriented device cuts some corners, how should it
be described?

> Chromium's ability to launch was predicated on its ability to use the
> existing platform crypto libraries - including on Windows. Only as it
> grew, and the "experiment" was worthwhile, was it even possible to switch
> to NSS, and only recently has it become "possible" to switch to BoringSSL
> - and not without significant costs and engineering investment over
> *years*. That's not something to so glibly dismiss with a one-off, nor
> does it do anyone in the WG any favours, when the heart of the question
> remains - which is, "what is conformant"

For launching a new browser, "what's conformant" is less relevant than
"what works". However, it's a spec failure if you can't get "what works" by
implementing "what's conformant" as described in the specs. Specs are
supposed to make it possible to launch new browsers without having to
reverse engineer everything that came before.

Suppose Servo later gets to the point where adding Web Crypto becomes
relevant in an effort to make Servo useful for using the Web in a future
where various Web sites/apps use Web Crypto. Servo then needs to implement
a set of algorithms that makes Web Crypto-using sites work in Servo. If the
spec doesn't say what that set is, the spec is failing to serve a function
specs are supposed to serve.

> > > - Is a distinction made between the library being an older version,
> > > which may not support RSA-OAEP, and the library being newer, but
> > > having RSA-OAEP explicitly disabled by the user (through means
> > > outside of Chromium's control?)
> >
> > I'd say the question of whether Chromium is conforming in that case
> > wouldn't be of practical importance. From the practical perspective, it
> > matters if stuff works. If stuff doesn't work, the system+configuration
> > as a whole would be non-conforming if RSA-OAEP was in the set of
> > algorithms that the WG has deemed to be part of the Web Platform.
>
> This is of the utmost practical importance, because one defines the
> other. You can't just wave this away, as you did earlier. What does
> conformance mean on such a system.

It'll mean whether some sites work or don't work.

> Does the lack of RSA-OAEP mean you lack WebCrypto entirely?

Probably not.

> > > - What about for algorithms for which there are no PKCS#11 mechanism
> > > points assigned yet, like Curve25519 or the NUMS curves?
> >
> > Seems reasonable to implement Curve25519 without any PKCS#11 layer in
> > between in that case.
>
> It may seem reasonable to you, but only because you seem to be choosing
> to ignore the very concerns being presented to you. The use of the
> PKCS#11 layer is what allows a UA to avoid, entirely, any of the legal or
> regulatory frameworks surrounding it.

I understand the merit of separation of concerns in the abstract sense. I
fail to understand how that's relevant in practice when, in all the common
cases, the same entity (Google, Mozilla, Microsoft, Apple, Opera,
$LINUX_DISTRO, $EMBEDDED_DEVICE_VENDOR) ships both the browser engine and
the crypto library. If you use Chrome, you get both Blink and NSS from
Google, right? If you use pre-built Chromium, you probably get both
Chromium and NSS from the same Linux distribution.

> So by introducing such arbitrary value judgements, and even if they're
> based on an objective set of criteria (the absolute MINIMUM requirement
> for MTI), the criteria themselves will be somewhat arbitrary (or
> "consensus driven", which is indistinguishable from arbitrariness), it
> just serves to drive endless debate in a WG not well-suited for that.

I can see that there are political reasons that will make the Web Crypto
spec fail to serve a part of the function that Web specs are supposed to
serve for the purpose of the Servo example above. I guess the algorithms
being discrete units makes the outcome less bad in practice than a spec
stipulating that any random parts of the spec are optional. *Someone* needs
to decide which algorithms get shipped by the entities that ship both
browsers and crypto libs, and the browsers with enough market share that
Web authors bother testing with them will determine what the set of
algorithms practically needed for Web compat ends up being.

So to go from research to a product, a project like Servo would then,
instead of referring to the spec, have to examine the sets of algorithms
present by default in popular browsers, take the intersection of those sets
as a lower bound and their union as an upper bound of what needs to be
implemented, and reconcile the difference between these sets by examining
how often broken sites are encountered.
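As a minimal sketch of that estimate (TypeScript; the per-browser algorithm
sets are invented for illustration and are not any engine's actual
defaults), it could look something like this:

```typescript
// Lower-bound/upper-bound estimate described above: intersect the
// per-browser algorithm sets for a lower bound, union them for an
// upper bound. The data below is hypothetical.
const shippedByDefault: Record<string, Set<string>> = {
  browserA: new Set(["AES-GCM", "AES-CBC", "RSA-OAEP", "ECDSA", "SHA-256"]),
  browserB: new Set(["AES-GCM", "RSA-OAEP", "ECDH", "SHA-256"]),
  browserC: new Set(["AES-GCM", "ECDSA", "SHA-256"]),
};

const sets = Object.values(shippedByDefault);

// Lower bound: algorithms every surveyed browser ships by default.
const lowerBound = sets.reduce(
  (acc, s) => new Set([...acc].filter((alg) => s.has(alg))),
);

// Upper bound: algorithms at least one surveyed browser ships by default.
const upperBound = sets.reduce((acc, s) => new Set([...acc, ...s]));

console.log("lower bound (intersection):", [...lowerBound]);
console.log("upper bound (union):", [...upperBound]);
```

The reconciliation step - checking how often sites actually break without a
given algorithm - would still have to be done against real UAs by hand.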
I'll concede that the political issues probably make the WG ineffective at
overtly saying what will end up being determinable (as a lower bound and an
upper bound, per the estimate above) for practical purposes from what the
WG participants end up doing, each in their own corner, so I won't debate
this further.

--
You are receiving this mail because:
You are on the CC list for the bug.
Received on Thursday, 31 July 2014 11:35:30 UTC