
Re: SVG1.2 encoding for Sockets and URLRequests

From: Jan-Klaas Kollhof <jan@kollhof.net>
Date: Tue, 23 Nov 2004 13:26:20 +0100
Message-ID: <41A32C6C.3010101@kollhof.net>
To: Thomas DeWeese <Thomas.DeWeese@Kodak.com>, www-svg@w3.org


>    Well, no matter what you need to say what the encoding is.
> Since AFAIK a string in JavaScript uses at least a short, should we
> send those shorts as is?  

 From experience with HTTPRequest in IE, I have a feeling that a char is
sent as a single byte if ord(char) < 127, else as 16-bit.
Then there are limitations on which chars can be sent and which cannot.
This is not cool !!!! Maybe it was faulty testing on my side.

>    I suspect you intend this to either send just the low 8bits of
> a UTF16 encoded String (good for binary data), or send a UTF8 encoded
> string (good for text).  However, you _must_ specify this.

It would be good to be able to send both the low 8 bits of each char in a
string (to do binary stuff)
and text as UTF-8. I guess a setEncoding() would be useful.

What about data coming back from the server side?
How are the bytes to be interpreted and represented in script?

The way strings are to be sent, and how received bytes are to be
interpreted, must be specified so that one can implement and use the
sockets, or implement a server the socket connection can talk to.
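For the receiving side, the inverse mapping could look like the sketch below. Again, the function names and the assumption that incoming data arrives as an array of byte values are hypothetical; TextDecoder is a modern stand-in for whatever UTF-8 decoding an implementation would provide.

```javascript
// "binary" mode: each byte becomes one character (code points 0..255).
function decodeLow8(bytes) {
  return bytes.map(b => String.fromCharCode(b)).join("");
}

// "text" mode: interpret the bytes as a UTF-8 sequence.
function decodeUtf8(bytes) {
  return new TextDecoder("utf-8").decode(new Uint8Array(bytes));
}

// The byte sequence 0xC3 0xA9 is "Ã©" in binary mode but "é" in
// text mode - without a specified encoding, the script cannot know
// which interpretation the server intended.
```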



 Jan-Klaas Kollhof
 Founding Partner, Vectoreal
 phone: +49 174 968 9477
 email: jan@kollhof.net
 web: http://jan.kollhof.net
Received on Tuesday, 23 November 2004 12:22:53 UTC
