Re: Globalizing URIs

I don't like this model; I prefer a different one:

Let me explain this via an 'ftp' example.

The FTP protocol doesn't care what character set your file system
uses. You open an 8-bit connection and send US-ASCII characters to the
server. If you want to retrieve a file, you send 'RETR xxxx', and when
you want to store a file, you send 'STOR xxxx', where 'xxxx' are
characters *NOT* in the native character set of the file system, but
rather in whatever transcription of that character set is made
available by the FTP server.
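
To make this concrete, here is a rough sketch in present-day Python;
the host name and directory are invented for illustration. Only
US-ASCII bytes ever go over the control connection, and the name the
client sends back in RETR is whatever transcription the server chose
to show in its listing:

    from ftplib import FTP

    # Hypothetical anonymous server, purely for illustration.
    ftp = FTP("ftp.example.org")
    ftp.login()

    # The server decides how native file names are transcribed for the
    # listing; the client simply echoes that transcription back in RETR.
    names = ftp.nlst("/pub")
    with open("local-copy", "wb") as out:
        ftp.retrbinary("RETR " + names[0], out.write)
    ftp.quit()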

The *PROTOCOL* doesn't define the mapping between the bytes sent and
the file system. This is completely up to the implementation of
the FTP server.

Now, the FTP URL scheme defines yet another mapping. If you happen to
want to send 'RETR /?#frob' to your FTP server, you actually have to
encode the '#' in the FTP URL. Thus, there's another level of encoding,
URL -> FTP-protocol, that sits on top of the encoding your
implementation chose for FTP-protocol -> file system.
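
In present-day Python terms (just a sketch of the URL-level escaping,
not of any particular client), that extra layer looks like this:

    from urllib.parse import quote, unquote

    path = "/?#frob"

    # The URL layer escapes reserved characters such as '#' (and '?'),
    # so the fragment delimiter doesn't swallow part of the file name.
    url_path = quote(path, safe="/")      # '/%3F%23frob'

    # A URL-aware FTP client undoes this layer before it speaks the
    # FTP protocol, sending 'RETR /?#frob' on the wire.
    assert unquote(url_path) == path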

The same situation holds with the HTTP protocol. Implementors of HTTP
servers that deal with files on file systems that allow file names to
be written in other character sets will have to choose some mapping
between those files and the HTTP protocol. That mapping is *not* part
of the HTTP protocol, and I don't think it should be.
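
As a sketch of what such an implementation-chosen mapping might look
like (the path and character sets here are hypothetical), two servers
could decode the very same request bytes into different native file
names:

    from urllib.parse import unquote

    request_path = "/docs/caf%E9.html"   # octets as received in the request

    # One implementation might interpret the decoded octets as Latin-1,
    # another as some other local convention; HTTP itself doesn't say.
    name_latin1 = unquote(request_path, encoding="latin-1")  # '/docs/café.html'
    name_koi8   = unquote(request_path, encoding="koi8-r")   # a different name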

Received on Wednesday, 2 August 1995 19:39:11 UTC