
Attempting to encode a bad DOMString with stringToBinary()

From: Cameron McCormack <cam@mcc.id.au>
Date: Thu, 17 Jul 2008 10:44:54 +1000
To: public-svg-wg@w3.org
Message-ID: <20080717004454.GA10624@arc.mcc.id.au>

In writing the test for stringToBinary(), I’m left wondering what to do
when a DOMString contains malformed UTF-16.  For example:

  stringToBinary('\ud800', 'UTF-16');

DOM 3 Core says that DOMStrings are represented using a sequence of
16-bit units which are interpreted as UTF-16.  ECMAScript doesn’t
require that a String contain valid UTF-16, though.  Should we add some
text to say that an ENCODING_ERR is thrown if the string passed in is
not valid UTF-16?
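
For reference, the check an implementation would need before raising such
an exception is just a scan for unpaired surrogates.  A rough ECMAScript
sketch (the helper name is my own, it doesn't come from any draft):

  // Returns true if the string is well-formed UTF-16, i.e. contains
  // no unpaired surrogate code units.
  function isWellFormedUTF16(s) {
    for (var i = 0; i < s.length; i++) {
      var c = s.charCodeAt(i);
      if (c >= 0xD800 && c <= 0xDBFF) {
        // High surrogate: must be immediately followed by a low surrogate.
        var next = i + 1 < s.length ? s.charCodeAt(i + 1) : 0;
        if (next < 0xDC00 || next > 0xDFFF) return false;
        i++;                      // skip the paired low surrogate
      } else if (c >= 0xDC00 && c <= 0xDFFF) {
        // Low surrogate with no preceding high surrogate.
        return false;
      }
    }
    return true;
  }

  isWellFormedUTF16('\ud800');        // false -- would raise ENCODING_ERR
  isWellFormedUTF16('\ud800\udc00');  // true  -- a valid surrogate pair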

Thanks,

Cameron

-- 
Cameron McCormack ≝ http://mcc.id.au/
