Re: FYI... Binary Optimized Header Encoding for SPDY

On 2012/08/04 4:43, Mike Belshe wrote:
> On Fri, Aug 3, 2012 at 10:33 AM, James M Snell wrote:

>> On Fri, Aug 3, 2012 at 10:15 AM, Mike Belshe wrote:
>>> OK.  More concrete use cases for why we need this would help me
>>> understand better.
>>>  From the User Agent's perspective - how would it know whether a given
>>> HTTP/2 server would be able to grok its UTF8 headers?  Just try and fail?
>> No different than what the user-agent does today: attempt to send the
>> message and be prepared to deal with failure. Currently, there's no
>> guarantee that an origin server supports the various Allow-* headers...
>> some do, some don't. That's ok.

Well, actually, if HTTP/2 says headers can contain UTF-8, then every 
HTTP/2 server must grok them, no? Of course that doesn't mean that 
every protocol element in every header can use UTF-8, or that, where 
UTF-8 is allowed by a header's syntax, every value will be accepted 
by the server. But the same is true for ASCII now.

> OK - so this means browsers won't use it :-)

I'm rather sure that browser makers would prefer to use UTF-8 rather 
than some obscure encoding if HTTP allowed it. In fact, that would 
already have been the case 10 or more years ago.

> So now we need to know which use cases *would* use it :-)

Look e.g. at the existing mechanisms for encoding non-ASCII characters 
in header values, which are just a hopeless mess.

> Overall, its a fine idea, I just don't think we need it.

You probably don't think you need it because you are not a user of HTTP 
who has to deal with non-ASCII characters day in, day out.
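To make the contrast concrete, here is a minimal Python sketch (the filename and header are just illustrative examples, not from this thread) of what a sender must do today under RFC 5987 to carry a non-ASCII parameter value, versus what raw UTF-8 in header values would allow:

```python
# Sketch: encoding a non-ASCII parameter value in an HTTP header.
from urllib.parse import quote

filename = "résumé.pdf"

# Today: RFC 5987 ext-value syntax (charset'lang'percent-encoded octets),
# e.g. for the filename* parameter of Content-Disposition.
ext_value = "UTF-8''" + quote(filename, safe="")
header_today = f"Content-Disposition: attachment; filename*={ext_value}"

# If header values could simply contain UTF-8, this would suffice:
header_utf8 = f'Content-Disposition: attachment; filename="{filename}"'

print(header_today)  # filename*=UTF-8''r%C3%A9sum%C3%A9.pdf
print(header_utf8)   # filename="résumé.pdf"
```

The percent-encoded form is what recipients actually have to parse and decode today; the second form is what the proposal under discussion would make legal on the wire.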

If HTTP used something like Baudot code, and occasionally a bit of ASCII 
or EBCDIC (potentially identified as such, but potentially not) 
depending on various circumstances, I guess you'd have an easier time 
understanding the need.

Regards,    Martin.

Received on Monday, 6 August 2012 02:42:21 UTC