RE: [parser-api] Polyfilling CSS

From: François REMY <francois.remy.dev@outlook.com>
Date: Sat, 4 Apr 2015 16:36:00 +0200
Message-ID: <DUB130-W633FCB3189757C8817D205A5F00@phx.gbl>
To: Paul Irish <paul.irish@gmail.com>, "public-houdini@w3.org" <public-houdini@w3.org>
Hi Paul,

> If you look at most of the polyfills listed in the CSS section here:
> https://github.com/Modernizr/Modernizr/wiki/HTML5-Cross-Browser-Polyfills#css-core-modules

I didn't know about that polyfill list; it's amazing work!


> … you'll find that nearly all of them ship with their own CSS parser, 
> written in JS. (Typically using either glazman's or Tab's). 
> 
> Naturally, they also have to XHR in current stylesheets and deal 
> with CORS and all that. :)
> 
> You've probably heard of the most popular of these: Respond.js. 
> This stuff is so common that Philip Walton wrote a single library to 
> automate it: http://philipwalton.github.io/polyfill/ But obviously, the 
> total filesize and execution overhead for all this work is considerable. 
> 
> So houdini has an opportunity to address this.  
> 
> Brian Kardell pointed me to the Sydney discussions on a CSS Parser API 
> which has me quite excited. Much of the discussion was around tooling
> needs from the parser, but I wanted to share this developer-driven need.
> 
> Primary usecase: I want to keep all my style information in the stylesheet, 
> even things not consistently supported across all browsers. I want to 
> drop in a JS polyfill for those. I need a hook for unrecognized 
> items like rulesets, @at-rules, and selectors. For rulesets, I want a 
> mechanism to address matching elements.
> 
> Thanks!

If I can share my two cents here: while I believe a native CSS Parser API 
would be nice to have (I hope work is still going on following the 
last Houdini meeting), I can also share with you that the number one 
reason people don't want to use my polyfills is that they parse 
the stylesheets client-side and do heavy work to make sense of them. 
The number two reason is that this slows down their page load, and the 
number three is that they serve their CSS without CORS and run into 
issues for that reason.

While having a native API would help, it would not be sufficient, as 
cross-origin stylesheets cannot be exposed to arbitrary JS code without 
security concerns. 

For this reason, I've been convinced for a few months now that the 
solution is a Grunt pre-processing tool which would parse the stylesheets 
ahead of time and output a list of operations to execute to polyfill that 
particular stylesheet, instead of a generic engine that parses the 
stylesheets client-side and tries to evaluate them.
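To make the idea concrete, here is a minimal sketch of what such a build 
step could emit. Everything here is hypothetical (the function name, the 
"emulate-sticky" action, and the naive regex-based scanning are mine, for 
illustration only); a real preprocessor would of course use one of the 
proper JS CSS parsers mentioned above rather than regexes.

```javascript
// Hypothetical build-time step: scan a stylesheet once, and instead of
// shipping a parser to the client, emit a plain list of operations
// ("tasks") for the page to execute at load time.
function extractPolyfillTasks(css) {
  var tasks = [];
  // Strip comments, then split into "selector { declarations }" blocks.
  css = css.replace(/\/\*[\s\S]*?\*\//g, '');
  var rule = /([^{}]+)\{([^{}]*)\}/g;
  var m;
  while ((m = rule.exec(css)) !== null) {
    var selector = m[1].trim();
    m[2].split(';').forEach(function (decl) {
      var parts = decl.split(':');
      if (parts.length < 2) return;
      var prop = parts[0].trim();
      var value = parts.slice(1).join(':').trim();
      // Example unsupported feature: position sticky (assumed here).
      if (prop === 'position' && value === 'sticky') {
        tasks.push({ selector: selector, action: 'emulate-sticky' });
      }
    });
  }
  return tasks;
}

var tasks = extractPolyfillTasks('.nav { position: sticky; top: 0; }');
// tasks → [{ selector: '.nav', action: 'emulate-sticky' }]
```

The client-side runtime then only has to iterate over that task list and 
apply each action to the matching elements; no parsing, no XHR of the 
stylesheet, no CORS issue.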

The performance gain from this approach would be enormous, similar to how 
parsing your Ember/Angular/Knockout templates should be done ahead of 
time rather than repeated client-side on each page load. We don't convert 
LESS to CSS client-side, nor do we run the Traceur ES6 converter on each 
page load; I'm more and more convinced we shouldn't do it for futurizing 
CSS files either.

I still haven't made plans for next week (I'm finally on vacation for 5 
days!), but one of the things at the top of my mind would be to make a 
first implementation of such a preprocessor (though it may not fit in the 
limited number of days, with all the other things I want to do in that 
time; I need to apply more scheduling here). Anyway, the point is that 
when/if I get something done, I will share the results here as a 
comparison point.

Best regards,
François
Received on Saturday, 4 April 2015 14:36:28 UTC
