Can HTTP headers encode enough URLs? (was: XDomainRequest Integration with AC)

Julian Reschke wrote:
> 
> Ian Hickson wrote:
>> On Mon, 21 Jul 2008, Julian Reschke wrote:
>>> Ian Hickson wrote:
>>>> ...
>>>> ...which basically just says it's a valid URL if it's a valid URI or 
>>>> IRI
>>>> (with some caveats in the case of IRIs to prevent legacy encoding 
>>>> behaviour
>>>> from handling valid URLs in a way that contradicts the IRI spec). This
>>>> doesn't allow spaces.
>>>> ...
>>> Correct. But it does allow non-ASCII characters. How do you put them 
>>> into an HTTP header value?
>>
>> Presumably HTTP defines how to handle non-ASCII characters in HTTP as 
>> part of its error handling rules, no?
> 
> Non-ASCII characters in header values are by definition ISO-8859-1. Yes, 
> that sucks. It's not sufficient to encode all IRIs, thus you need to map 
> IRIs to something you can use.
> 
> And no, that has nothing to do with error handling.

It sounds like you are asking whether HTTP headers can encode all the 
values for 'url' that we need. This is different from my original 
concern, but it is certainly a valid question.
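(For context, the usual way to fit a non-ASCII IRI into the ASCII range is 
to UTF-8-encode it and then percent-encode the resulting octets, per the 
IRI-to-URI mapping in RFC 3987. A minimal illustrative sketch, not part of 
the original thread:

```python
from urllib.parse import quote

# Map a non-ASCII IRI path component to an ASCII URI form by
# UTF-8 encoding followed by percent-encoding (RFC 3987 mapping).
iri_path = "/caf\u00e9"            # "/café", hypothetical example value
uri_path = quote(iri_path, safe="/")
print(uri_path)                    # /caf%C3%A9
```

The result is plain ASCII and therefore safe to carry in a header value.)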

Given that we don't need to encode all possible paths, since paths are 
disallowed entirely, is there still a concern? People would have to use 
punycode to encode non-ASCII characters if they are part of the domain 
name, which is unfortunate, but hopefully tooling will help here.
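(To illustrate the punycode step: IDNA converts each non-ASCII label of a 
hostname into an ASCII "xn--" form. A hedged sketch using Python's 
standard-library IDNA codec, with a hypothetical hostname, not a value 
from this thread:

```python
# IDNA applies Punycode (RFC 3492) label by label, producing an
# ASCII-only hostname that fits in an HTTP header.
host = "b\u00fccher.example"       # "bücher.example", hypothetical
ascii_host = host.encode("idna").decode("ascii")
print(ascii_host)                  # xn--bcher-kva.example
```

This is the kind of conversion tooling would ideally perform for authors 
automatically.)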

/ Jonas

Received on Monday, 21 July 2008 07:44:05 UTC