- From: Bert Bos <bert@w3.org>
- Date: Thu, 6 Aug 2009 18:41:32 +0200
- To: www-style@w3.org
On Thursday 06 August 2009, Andrey Mikhalev wrote:
> On Wed, 5 Aug 2009, fantasai wrote:
> > - RESOLVED: Proposal accepted for CSS2.1 Issue 129 (Backup in
> >   Tokenizer) http://wiki.csswg.org/spec/css2.1#issue-129
>
> Objection.
> The newly introduced URL tokenization contradicts the core grammar
> prose, and the backward-compatibility parsing rules are broken at
> the specification level.
>
> The current grammar is perfectly clear: an invalid URL is recognized
> as a function token and processed by the parser. No reason for
> changes.

Did we really agree to the whole proposal? I think we talked about
comments only. The URLs weren't seen as a problem.

I don't think the comment parsing is a problem for a modern computer
either. If the thing you're parsing is indeed a style sheet with a
typo, rather than a stream of random bytes whose CSS parse tree
doesn't interest anybody anyway, then buffering the unclosed comment
will maybe cost you a few tens of kilobytes of memory. Not something
to worry about. (If you don't already have enough memory to store the
style sheet as a text string, you're unlikely to have enough for its
DOM...)

I (reluctantly) agreed to the change, but I think I only agreed to
change the grammar of comments. That was also where some of the
browsers we tested differed from the spec. Most browsers did not
exhibit the same bug for unclosed URLs. (Which makes sense: Zack's
change allows you to optimize the parsing of comments by throwing
away each character almost as soon as it is read, but the contents of
a URI token obviously have to be buffered anyway. Zack's change
doesn't gain you anything there.)

Bert
--
Bert Bos ( W 3 C ) http://www.w3.org/
http://www.w3.org/people/bos W3C/ERCIM
bert@w3.org
2004 Rt des Lucioles / BP 93
+33 (0)4 92 38 76 92
06902 Sophia Antipolis Cedex, France
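[Editor's illustration, not part of the original message.] The asymmetry Bert describes can be sketched as follows: a tokenizer can consume a /* ... */ comment in constant memory, discarding each character as soon as it is read, whereas the body of a url(...) token has to be buffered regardless, because its text becomes the token's value. The function names and the character-iterator interface below are illustrative assumptions, not from the spec or the thread.

```python
def skip_comment(chars):
    """Consume characters following '/*' until '*/', buffering nothing.

    `chars` is an iterator of single characters positioned just past
    the opening '/*'. Returns True if the comment was closed, False on
    end of input (an unclosed comment) -- either way, no character was
    retained, which is the optimization comments permit.
    """
    prev = ''
    for c in chars:
        if prev == '*' and c == '/':
            return True
        prev = c
    return False


def read_url_body(chars):
    """Consume an unquoted url(...) body up to ')'.

    Unlike a comment, the characters must be accumulated, because the
    tokenizer has to hand the URL text to the parser as the token's
    value. Returns the buffered value, or None if input ends before
    the closing ')'.
    """
    buf = []
    for c in chars:
        if c == ')':
            return ''.join(buf).strip()
        buf.append(c)
    return None  # unclosed url(): the buffer was still required
```

The point of the sketch: `skip_comment` holds only one character of state, so the length of the comment never matters, while `read_url_body` grows its buffer with the token, so the comment-only optimization buys nothing for URLs.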
Received on Thursday, 6 August 2009 16:42:10 UTC