- From: L. David Baron <dbaron@dbaron.org>
- Date: Thu, 6 Aug 2009 10:13:41 -0700
- To: www-style@w3.org
On Thursday 2009-08-06 18:41 +0200, Bert Bos wrote:

> I don't think the comment parsing is a problem for a modern computer
> either. If the thing you're parsing is indeed a style sheet with a
> typo, rather than a stream of random bytes whose CSS parse tree doesn't
> interest anybody anyway, then buffering the unclosed comment will maybe
> cost you a few tens of kilobytes of memory. Not something to worry
> about. (If you already don't have enough memory to store the style
> sheet as a text string, you're unlikely to have enough for its DOM...)

The real issue isn't the buffering; it's getting consistent behavior
(including making different parts of the spec say the same thing).

> I (reluctantly) agreed to the change, but I think I only agreed to
> change the grammar of comments. That was also where some browsers we
> tested differed from the spec. Most browsers did not exhibit the same
> bug for unclosed URLs. (Which makes sense: Zack's change allows you to

For what it's worth, after implementing a fix for Zack's testcase
yesterday, I'm no longer confident that it's really testing
tokenization; in Mozilla's case what it was really testing was whether
we were doing proper parenthesis-matching in the parser, not whether
the tokenizer backtracked or not. (That said, we actually implement the
tokenization using a strategy rather like what I described in
http://lists.w3.org/Archives/Public/www-style/2009Aug/0098.html .)

I'm not sure I want the url() change anymore either (though I think the
comments part is right). But I'm also not confident the current spec is
internally consistent, though I'd like to think about it further.

-David

--
L. David Baron                           http://dbaron.org/
Mozilla Corporation                      http://www.mozilla.com/
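(To make the parenthesis-matching point concrete: the sketch below is only a toy version of bracket-balancing error recovery when skipping a malformed construct, with single characters standing in for CSS tokens and invented names throughout; it is not Mozilla's parser.)

```cpp
// Toy sketch only: bracket-balancing skip of a malformed declaration.
// Characters stand in for CSS tokens; names are invented, not Mozilla's.
#include <cassert>
#include <cstddef>
#include <vector>

// Skip forward from `start` until a ';' or '}' that is not nested inside any
// open (), [], or {} pair.  Returns the index of that stopping token, or
// tokens.size() if the input ends first (e.g. an unclosed bracket at EOF).
std::size_t SkipMalformedDeclaration(const std::vector<char>& tokens,
                                     std::size_t start) {
  std::vector<char> open_brackets;
  for (std::size_t i = start; i < tokens.size(); ++i) {
    const char t = tokens[i];
    if (t == '(' || t == '[' || t == '{') {
      open_brackets.push_back(t);
    } else if (t == ')' || t == ']' || t == '}') {
      const char matching = (t == ')') ? '(' : (t == ']') ? '[' : '{';
      if (!open_brackets.empty() && open_brackets.back() == matching) {
        open_brackets.pop_back();
      } else if (t == '}' && open_brackets.empty()) {
        return i;  // an unnested '}' also ends the declaration block
      }
      // A mismatched closer is ignored here; real recovery is subtler.
    } else if (t == ';' && open_brackets.empty()) {
      return i;  // end of the malformed declaration
    }
  }
  return tokens.size();
}

int main() {
  // The ';' at index 3 sits inside parentheses, so the skip continues past it
  // and stops at the unnested ';' at index 7.
  const std::vector<char> toks = {'a', '(', 'b', ';', 'c', ')', 'd', ';', 'e'};
  assert(SkipMalformedDeclaration(toks, 0) == 7);
  return 0;
}
```

The distinction Zack's testcase was actually exercising is that this kind of balancing lives in the parser's error recovery, independent of whether the tokenizer backtracks on an unclosed url().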
Received on Thursday, 6 August 2009 17:14:18 UTC