- From: Mike Bishop <Michael.Bishop@microsoft.com>
- Date: Wed, 2 Nov 2016 18:16:34 +0000
- To: Patrick McManus <pmcmanus@mozilla.com>, Cory Benfield <cory@lukasa.co.uk>
- CC: "Roy T. Fielding" <fielding@gbiv.com>, Julian Reschke <julian.reschke@gmx.de>, Kazuho Oku <kazuhooku@gmail.com>, HTTP Working Group <ietf-http-wg@w3.org>
- Message-ID: <BN6PR03MB2708F3F550D3E40958D9B55B87A00@BN6PR03MB2708.namprd03.prod.outlook.com>
Patrick, thank you for helping to maintain balance here.

An aside: my favorite anti-ossification UA-sniffing encounter is this one: "Edge, could you please stop claiming to be that particular version of Chrome? That version of Chrome had a bug and we sent different HTML to work around it, but you don't have that bug and can handle the normal content. So now I have to special-case your lie out of the special-case for Chrome."

I'm personally a fan of "follow the spec and let broken code fall where it may." And professionally, I spend a lot of my day discussing how not to make the broken code freak out when we fix a bug. So I understand the tension very well.

While I agree with Roy that probably the first thing that should have been implemented in any given library is "unknown NXX is treated as N00; make sure I have a processing path for N00," N=1 is a rare case that could easily have been skipped in a pragmatic implementation. I'm in favor of the same flow that currently happens for 100: clients can indicate when they think it would be useful, and servers MAY send unsolicited, but with a compat warning that will likely keep them from actually doing so without the hint. And maybe we should note for a future HTTP/1.1 update that this should be made an explicit pattern for 1XX codes, since it's become the de facto norm.

From: Patrick McManus [mailto:pmcmanus@mozilla.com]
Sent: Wednesday, November 2, 2016 8:17 AM
To: Cory Benfield <cory@lukasa.co.uk>
Cc: Roy T. Fielding <fielding@gbiv.com>; Julian Reschke <julian.reschke@gmx.de>; Kazuho Oku <kazuhooku@gmail.com>; HTTP Working Group <ietf-http-wg@w3.org>
Subject: Re: New Version Notification for draft-kazuho-early-hints-status-code-00.txt

[co-chair hat on]

On Wed, Nov 2, 2016 at 5:59 AM, Cory Benfield <cory@lukasa.co.uk> wrote:

> > On 1 Nov 2016, at 22:50, Roy T. Fielding <fielding@gbiv.com> wrote:
> >
> > No. What I've learned is that every feature in every protocol is poorly implemented by some poor soul who thinks they deserve special consideration for their inability to interoperate with the future. I have, in the past, consistently refused such considerations.
>
> I don't understand where you think anyone who wrote a broken implementation is asking for special consideration. The only HTTP implementation I fully wrote is an HTTP/2 implementation that can handle 1XX codes per the specification. I certainly don't need the special consideration for libraries I maintain, because I'll be shipping patches for them. I am simply informing the IETF that the vast majority of widely deployed HTTP client libraries today will fail in the face of the 103 status code. Since I looked at Python, I have gone back and looked at Ruby, JavaScript, and Go, and the same remains true there.

This is a normal and potentially helpful tension.

Cory has done some research involving running code that helps influence this discussion. The IETF values running code - thanks for the data and your implementations. Certainly interop is a valid consideration.

Roy is pushing back more or less with an ossification argument - we can't be afraid to use features that are well defined. That's also a valid consideration - thanks for the point, Roy.

Balancing this is the soup of early-draft rough consensus, right? (Please note that this isn't currently a WG draft - but it's certainly on topic to discuss as long as it doesn't drown out our adopted work.) I'm sure we can all keep it within the bounds of the normal professionalism this list typically exhibits.

> What that means is that the user-agent field will not be used to flag non-compliant implementations, it will be used to flag *compliant* ones, as there are vastly more of the former than the latter. That means that Chrome, Safari, Firefox, Opera (maybe), curl, and wget will all get a pass, and everyone else will be "guilty until proven innocent". That means that we are rolling out a feature that is expressly a "browsers-only" feature if we deploy it in this way.

I'd like to hear more opinions on this general topic - abstracted away from 103 if we can.

UA and Server were of course aimed at the whitelist/blacklist idea, but in reality, at least from my pov, they lead to a pretty terribly ossified world - full of sniffing and masquerading and out-of-date lists. Explicit negotiations have, at least recently, developed a better track record, and with h2's more efficient header encoding they seem to be a better practice (again from my pov).
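[Editor's note: the "unknown NXX is treated as N00" fallback and the interim-1XX handling discussed above can be sketched as below. This is a toy illustration under stated assumptions, not any real library's API: the function names and the byte-level parsing are invented here, and the parsing glosses over real-world details such as 101 Upgrade semantics and HTTP/2 framing.]

```python
def status_class_fallback(status: int) -> int:
    """Treat an unrecognized NXX status code as equivalent to N00.

    E.g. an unknown 103 falls back to 100, an unknown 226 to 200.
    """
    if not 100 <= status <= 599:
        raise ValueError(f"invalid status code: {status}")
    return (status // 100) * 100


def split_interim(raw: bytes):
    """Split an HTTP/1.1 reply stream into (interim 1XX codes, final status, rest).

    Interim responses carry no body, so each is just a status line plus
    headers terminated by a blank line; a client needs to keep reading
    until it sees a non-1XX status.
    """
    interim, rest = [], raw
    while True:
        head, _, rest = rest.partition(b"\r\n\r\n")
        status = int(head.split(b" ", 2)[1])
        if 100 <= status < 200:
            interim.append(status)  # e.g. 103: note the hints, keep reading
            continue
        return interim, status, rest


raw = (b"HTTP/1.1 103 Early Hints\r\n"
       b"Link: </style.css>; rel=preload\r\n"
       b"\r\n"
       b"HTTP/1.1 200 OK\r\n"
       b"Content-Length: 2\r\n"
       b"\r\n"
       b"ok")

print(status_class_fallback(103))  # -> 100
print(split_interim(raw))          # -> ([103], 200, b'ok')
```

A client built this way never needs to know 103 specifically exists: the interim response is consumed and the final 200 is delivered as usual, which is exactly the "processing path for N00" property under discussion.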
Received on Wednesday, 2 November 2016 18:17:11 UTC