- From: <bugzilla@jessica.w3.org>
- Date: Wed, 04 May 2011 22:55:30 +0000
- To: public-html-bugzilla@w3.org
http://www.w3.org/Bugs/Public/show_bug.cgi?id=12100

Ian 'Hixie' Hickson <ian@hixie.ch> changed:

               What    |Removed    |Added
    ----------------------------------------------------------------------------
                     CC|           |ian@hixie.ch

--- Comment #3 from Ian 'Hixie' Hickson <ian@hixie.ch> 2011-05-04 22:55:29 UTC ---

Agreed that DOM-tree-manipulating APIs should work in 16-bit code units. The intent of this change was to make sure that APIs like window.alert() worked with Unicode, and that all the various algorithms in the spec worked with Unicode, etc. We don't want algorithms that talk about doing things character by character to break every time an astral-plane character gets involved, because such characters get split in two.

Does anyone have any suggestions for how to do this without having to go through every single method and attribute, or every single algorithm, saying which ones operate in Unicode space and which operate in UTF-16 code-unit space?

--
Configure bugmail: http://www.w3.org/Bugs/Public/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the QA contact for the bug.
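As a minimal sketch of the splitting problem the comment describes (the sample string and variable names are illustrative, not from the bug report): in JavaScript/TypeScript, strings are sequences of UTF-16 code units, so an astral-plane character occupies two indices, and a naive per-index loop sees two lone surrogates instead of one character.

```ts
// U+1F600 (GRINNING FACE) lies outside the Basic Multilingual Plane,
// so a JS string stores it as a surrogate pair (two UTF-16 code units).
const s = "a\u{1F600}b";

console.log(s.length); // 4 code units, though a human sees 3 characters

// Naive "character by character" loop: indexes by code unit and
// splits the astral character into two unpaired surrogates.
for (let i = 0; i < s.length; i++) {
  console.log(s.charCodeAt(i).toString(16)); // 61, d83d, de00, 62
}

// Code-point-aware iteration keeps the astral character whole.
for (const ch of s) {
  console.log(ch.codePointAt(0)!.toString(16)); // 61, 1f600, 62
}
```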
Received on Wednesday, 4 May 2011 22:55:33 UTC