
Re: Globalizing URIs

From: Gavin Nicol <gtn@ebt.com>
Date: Wed, 2 Aug 1995 23:48:32 -0400
Message-Id: <199508030348.XAA22475@ebt-inc.ebt.com>
To: masinter@parc.xerox.com
Cc: glenn@stonehand.com, html-wg@oclc.org, http-wg%cuckoo.hpl.hp.com@hplb.hpl.hp.com

>I don't like this model, but prefer another one:
>Let me explain this via an 'ftp' example.
>The FTP protocol doesn't care what character set your file system
>uses. You open a 8-bit connection and send US-ASCII characters to the
>server. If you want to retrieve a file, you send 'RETR xxxx' and when
>you want to store a file, you send 'STOR xxxx', where 'xxxx' are
>characters *NOT* in the native character set of the file system, but
>rather in whatever transcription of that character set is made
>available by the FTP server.
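One way to read "transcription" here is the %-escaping convention URLs already use (RFC 1738): each byte outside safe ASCII is written as a percent sign plus two hex digits. A minimal sketch of that idea (the transcribe helper is my illustration, not something any real FTP server is specified to do):

```python
def transcribe(name: bytes) -> str:
    """%-escape every byte that is not safe printable ASCII,
    so arbitrary filename bytes survive an ASCII-only channel."""
    safe = set(range(0x21, 0x7F)) - {0x25}  # printable ASCII minus '%' itself
    return "".join(chr(b) if b in safe else "%%%02X" % b for b in name)

# The EUC bytes discussed later in this message:
print(transcribe(b"\xb0\xf5\xba\xfe.html"))  # -> %B0%F5%BA%FE.html
```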

I don't understand how this works, especially the "some transcription"
part. How is the receiving server to know what name to store the file
under? Is "%B0%F5%BA%FE.html" translated back to insatsu.html for storage?
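Mechanically, undoing the escapes is easy; what the %XX form cannot tell the server is which character set the recovered bytes were in. A sketch of the decode step (a hypothetical helper, assuming plain %XX escapes):

```python
def unescape(s: str) -> bytes:
    """Turn %XX escapes back into raw bytes; other characters pass through as ASCII."""
    out, i = bytearray(), 0
    while i < len(s):
        if s[i] == "%":
            out.append(int(s[i + 1:i + 3], 16))  # two hex digits -> one byte
            i += 3
        else:
            out.append(ord(s[i]))
            i += 1
    return bytes(out)

print(unescape("%B0%F5%BA%FE.html"))  # b'\xb0\xf5\xba\xfe.html'
```

The server gets the original bytes back, but nothing in the URL says what character set those bytes were in.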

I have three problems with this model:

1) It would probably require some kind of database to map local
   filenames to their HTTP representations, because transcription
   collisions are possible, and because HTTP is stateless.
2) Without some standard mapping, it seems difficult for a browser
   to decide what to send to the server. Yes, I know people will say
   that the server decides, because it makes the URLs available in
   the first place, but what happens if a server sends me an EUC URL,
   and I send it a SJIS one back?
3) URLs are *not* used solely in HTTP transactions.
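Problem 2 is concrete: the same Japanese word escapes to different URLs depending on the character set it was encoded in. Assuming the bytes in the example above are the EUC-JP encoding of 印刷 (insatsu, "printing"), a sketch:

```python
def escape(name: bytes) -> str:
    """Naive %XX transcription of every byte (illustration only)."""
    return "".join("%%%02X" % b for b in name)

word = "印刷"  # "insatsu"
print(escape(word.encode("euc_jp")))     # -> %B0%F5%BA%FE
print(escape(word.encode("shift_jis")))  # different bytes, so a different URL
```

The server that published the EUC form has no stateless way to recognize the SJIS form as the same name.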
Received on Wednesday, 2 August 1995 20:45:55 UTC