- From: Eduard Pascual <herenvardo@gmail.com>
- Date: Tue, 20 Jan 2009 16:33:44 +0000
- To: "François REMY" <fremycompany_pub@yahoo.fr>
- Cc: "David Woolley" <forums@david-woolley.me.uk>, www-style@w3.org
On Mon, Jan 19, 2009 at 8:39 PM, François REMY <fremycompany_pub@yahoo.fr> wrote:
> I think the pros and cons of the ":matches" selector have already been discussed.
>
> I wrote a document about it. It's only my view of the situation, and you
> may not agree with me (please say so, if that's the case).
> I hope this document is well-balanced and shows the enormous advantages and
> problems of the selector.
> This document is not an essay about the problem, but I think it's a good
> start for discussing the subject with full(er) knowledge.
>
> Fremy

I have read your document and agree with most of the ideas it presents, but I think it misses some substantial points. Basically, it boils down to "browser vendors should rewrite their browsers' engines", which is probably true; but it overlooks the fact that browser vendors will not rewrite their browsers' engines (unless there is a compelling enough reason for them to do so, of course), which is known to be true. So the CSS WG has two ways to go: it can produce a utopian standard that offers the best solution to each problem but will be useless because nobody implements it; or it can produce a realistic standard that can't solve every problem but at least solves some, and is implemented by enough parties for its solutions to be usable by authors. Although I would love to have the utopian standard implemented and to use it on the content I publish, I have to stay in touch with reality. So I think we should try to get as near to the utopia as possible without losing touch with reality. That's why I tried to figure out how much can reasonably be achieved before running into tough issues.

It appears to me (please, someone correct me if I am wrong) that the main concern applies to page loading, rather than to arbitrary modifications from script: the current logic and limitations of CSS are well suited to progressive rendering, so removing those essential limitations would adversely affect such algorithms. After all, during page loading browsers just add children to existing elements, one after another, so all the information needed to evaluate today's selectors for a given element is already available when the element is first added.

Once the page is loaded, modifications to the DOM are arbitrary (i.e. adding/moving/removing things anywhere, compared to always appending "last children" during load), so something like a full :matches wouldn't really hurt: scripts can in any case tweak the class or id of the root element. Actually, if authors resort to JavaScript emulation of the :matches capability, the impact is even greater: not only does the script do the same re-checking across the entire tree, it does so through an interpreted language engine rather than natively (see the sketch below). Hence, unless I'm missing something (please don't hesitate to point it out if you think I am), once the page has loaded a full :matches would have a positive impact, allowing the browser to natively perform styling tasks that would otherwise be handled by scripts.

Putting it all together: ":matches" is horrible during loading, but it's not so bad (actually it's even good) after loading. So simply ignore ":matches" until the page is loaded.
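To make the cost of the script emulation concrete, here is a rough sketch of what I mean. Everything in it is made up for illustration: the hypothetical selector "section:matches(p.warning)", the substitute class "emulated-match", and the function name refreshMatches(); it is written with today's DOM APIs for brevity.

    // Goal (hypothetical syntax, for illustration only): style every
    // <section> that contains a <p class="warning">, i.e. something like
    //   section:matches(p.warning) { ... }
    // Since CSS cannot express this, a script has to walk the tree itself
    // and flag matching elements with a substitute class.

    function refreshMatches() {
      // Re-check the whole document: this is the cost referred to above,
      // paid in script instead of natively inside the engine.
      var sections = document.getElementsByTagName('section');
      for (var i = 0; i < sections.length; i++) {
        var matched = sections[i].querySelector('p.warning') !== null;
        // "emulated-match" is a made-up class that the style sheet targets
        // in place of the unsupported :matches selector.
        sections[i].classList.toggle('emulated-match', matched);
      }
    }

    // After load, repeat the full re-check on every DOM mutation.
    // (MutationObserver is today's API; in 2009 the equivalent would have
    // been the DOM mutation events.)
    window.addEventListener('load', function () {
      refreshMatches();
      new MutationObserver(refreshMatches).observe(document.body, {
        childList: true,
        subtree: true,
        attributes: true
      });
    });

A native :matches evaluated only after load could do the same work inside the engine, without paying the script round-trip on every mutation.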
Or, in more formal terms (these questions are addressed to implementors):

- If the blocks whose selectors contain ":matches" were ignored during initial page loading, would the concerns still be there?
- Would it actually be feasible to skip such blocks during the initial load, and "review" them once the page has been loaded?

I'm eager to hear some implementors' voices on these two questions.
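For what it's worth, here is a rough idea of how an author could approximate that "skip during load, review after load" behaviour today, again purely as an illustration: the style sheet "matches-rules.css" and its id are made up.

    // Keep the rules that would need ":matches" (or its script emulation)
    // in their own style sheet, leave it disabled while the page is
    // streaming in, and only enable it once loading has finished:
    //
    //   <link rel="stylesheet" href="matches-rules.css"
    //         id="matches-rules" disabled>

    window.addEventListener('load', function () {
      var deferred = document.getElementById('matches-rules');
      if (deferred) {
        deferred.disabled = false;  // "review" the skipped rules only now
      }
    });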
Received on Tuesday, 20 January 2009 16:34:20 UTC