- From: Cameron McCormack <cam@mcc.id.au>
- Date: Thu, 17 Jul 2008 10:44:54 +1000
- To: public-svg-wg@w3.org
In writing the test for stringToBinary(), I’m left wondering what to do
when a DOMString contains malformed UTF-16, for example a lone high
surrogate:

  stringToBinary('\ud800', 'UTF-16');
DOM 3 Core says that DOMStrings are represented as a sequence of
16-bit units which are interpreted as UTF-16. ECMAScript doesn’t
require that a String contain valid UTF-16, though, so a script can
construct a string, like the one above, whose surrogate code units
aren’t properly paired. Should we add some text to say that an
ENCODING_ERR is thrown if the string passed in is not valid UTF-16?
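
For concreteness, here’s a rough sketch of the validity check I have
in mind, in plain ECMAScript (isValidUTF16 is just a hypothetical
helper name, not anything from a spec):

  // Returns true iff every surrogate code unit in the string is part
  // of a well-formed high/low surrogate pair.
  function isValidUTF16(s) {
    for (var i = 0; i < s.length; i++) {
      var c = s.charCodeAt(i);
      if (c >= 0xD800 && c <= 0xDBFF) {
        // High surrogate: must be immediately followed by a low one.
        var next = (i + 1 < s.length) ? s.charCodeAt(i + 1) : 0;
        if (next < 0xDC00 || next > 0xDFFF)
          return false;
        i++; // skip the low surrogate we just consumed
      } else if (c >= 0xDC00 && c <= 0xDFFF) {
        // Low surrogate with no preceding high surrogate.
        return false;
      }
    }
    return true;
  }

  isValidUTF16('\ud800');       // false — lone high surrogate
  isValidUTF16('\ud800\udc00'); // true  — valid pair (U+10000)

With a check along those lines, stringToBinary() could throw
ENCODING_ERR up front rather than leaving the behaviour undefined.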
Thanks,
Cameron
--
Cameron McCormack ≝ http://mcc.id.au/