Re: Globalizing URIs

At 12:37 PM 8/9/95, Martin J Duerst wrote:
>Keith Moore wrote, on a posting of mine:
>>...
>>And there is a strong argument that (human-meaningful) names and
>>(machine-meaningful) addresses should be kept separate anyway.  Make
>>the document titles human meaningful, let's build search services that
>>understand various character sets, and let the search services resolve
>>into pure-ASCII URIs.
>
>I have no problem with that if you restrict URIs in such a way (e.g.
>just allowing numbers or such) that even in the English-speaking
>part of the world, there is no danger that builders of search
>services think that the URL contains meaningful information.
>Currently they do, and that's one of the reasons we are thinking
>about the problem at hand.

Martin,
 
    People are going to try to decompose URLs despite all of our
    warnings to the contrary.  The way to get the desired behavior
    is to create alternatives (such as URNs and rich directories)
    that perform better.
 
    While I'd like Internet services to provide equal functionality
    to users of all languages, determining the best layer at which
    to support such functionality is a difficult question.  I believe
    that we can learn quite a bit from the internationalization of
    Internet email, where (with the right implementations) it is now
    easy for users of all languages to exchange messages.  Note that
    this was accomplished without reengineering the domain name
    system, and I believe that we can accomplish similar goals for
    HTML documents (and the linkages displayed to the user) without
    destabilizing the current URL specification.
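
    To make the analogy concrete, here is a minimal sketch (in
    present-day Python, purely as illustration) of the layering I
    have in mind: the non-ASCII title stays in the document and in
    what is displayed to the user, while what travels over the
    existing infrastructure (an encoded-word in a mail header, or a
    percent-encoded query aimed at a search service) remains pure
    ASCII.  The UTF-8 charset, the sample title, and the search
    service address are assumptions made up for the example, not
    part of any specification.

        from email.header import Header
        from urllib.parse import quote

        # Hypothetical non-ASCII document title, chosen only for illustration.
        title = "Bücherverzeichnis"

        # Mail-header analogy: the title travels as a MIME encoded-word,
        # which is pure ASCII, so the mail transport needs no changes.
        encoded_subject = Header(title, charset="utf-8").encode()
        print(encoded_subject)   # e.g. =?utf-8?b?QsO8Y2hlcnZlcnplaWNobmlz?=

        # URL analogy: the title is percent-encoded into a pure-ASCII
        # query for a hypothetical search service, which would resolve
        # it to an ordinary ASCII URL or URN.
        search_url = ("http://search.example.org/find?title="
                      + quote(title, safe=""))
        print(search_url)        # ...?title=B%C3%BCcherverzeichnis

    Nothing in this sketch touches the domain name system or the URL
    syntax itself; the human-meaningful form lives in the document and
    the user interface, and only opaque ASCII crosses the wire.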

    While this may not be the optimal overall approach to the problem,
    it does follow the Internet tradition of running code and
    incremental evolution.

/John

Received on Sunday, 13 August 1995 15:51:21 UTC