Re: Extensible, performant Web

On Wed, Jun 26, 2013 at 6:45 PM, Yoav Weiss <yoav@yoav.ws> wrote:
> Hey,
>
> I'm new to this group; I joined after the Manifesto was published. I'd like to
> start out by saying that I'm pretty excited about the leading principles of
> enabling low-level APIs and prototyping high-level features in JavaScript,
> which would let us iterate on them and revert bad design decisions
> (something that is very hard today).

Yes.  I'm glad you're excited and that you joined - help spread the
excitement.  The more folks we get to sign, the more members we have,
and the more this becomes a movement with some weight behind it.


> # But...
>
> At the same time, there are a few things about that model that scare me -
> problems that I believe are ignored in the myriad of blog posts and
> presentations on the subject. I think it'd be best to tackle them
> upfront, so I'll start by outlining them.
>
> ## Loading a lot of blocking JS
>
> The approach of defining new platform capabilities in JS is extremely
> powerful, and has been proven successful. The poster child for this
> approach is the jQuery selector engine, which was eventually standardized
> (kind of) as querySelectorAll.
>
> But when looking at that example, we should remember that for a long while
> (and for some, still), using jQuery's selector engine meant loading jQuery up
> front, before your application code could start running.

> When we're talking about defining new HTML elements in JS, we need to load
> those scripts before the elements are usable.
> This will have performance implications, and we need to do our best to
> mitigate them upfront.
>

This is taking place in Web Components and HTML Imports - you should
read through the current drafts and share your thoughts on where that
work stands.
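
Just to make the cost concrete: with custom elements, the definition
itself lives in script, so it looks roughly like this (the registration
API names are still moving targets - treat this as a sketch, not the
final syntax):

  // Sketch only - nothing <x-gallery> in the page is usable until this
  // script has been fetched, parsed and executed, which is exactly the
  // loading concern you're raising.
  var GalleryProto = Object.create(HTMLElement.prototype);
  GalleryProto.createdCallback = function () {
    this.innerHTML = '<div class="strip"></div>';
  };
  document.registerElement('x-gallery', { prototype: GalleryProto });

HTML Imports is the piece aimed at letting the browser discover and
fetch those definitions declaratively, so a lot of the loading
discussion is happening over there.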


> ## Resource loading
>
> Another high-profile example of the polyfill approach is picturefill, which
> polyfills the proposed picture element. But when people say "There was a
> responsive images problem, and then Web devs solved it with picturefill",
> they ignore the performance implications of using JS to load resources.

> Currently, polyfills suck at resource loading. When we're using polyfills to
> download resources, we're hindering the preloader, and our resources start
> loading later than they would if they were declared in markup. That is
> true for HTTP/1.1, and significantly more so for SPDY & HTTP/2.0.
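
To make that concrete for anyone following along: the typical
responsive-images polyfill pattern looks something like the strawman
below (markup attributes and names invented here, not picturefill's
actual API).  The real image URL hides in a data-* attribute, so the
preload scanner never sees it, and the request can only start after the
script has downloaded, parsed and walked the DOM.

  // Strawman of the usual JS-driven image loading pattern.
  document.addEventListener('DOMContentLoaded', function () {
    var spans = document.querySelectorAll('span[data-src]');
    for (var i = 0; i < spans.length; i++) {
      var img = document.createElement('img');
      var hd = spans[i].getAttribute('data-src-hd');
      img.src = (window.devicePixelRatio > 1 && hd) ||
                spans[i].getAttribute('data-src');
      spans[i].parentNode.replaceChild(img, spans[i]);
    }
  });

Compare that with a plain <img src> in the markup, which the preloader
can start fetching before the body has even finished parsing.
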
>
> ## NavigationController & First page load
>
> When discussing polyfilling current required features that involve resource
> loading (e.g. CSP, Responsive images), NavigationController comes up as a
> candidate low-level API for that purpose.
>
> The NavigationController was designed as a low-level API for offline Web
> apps, and as such it is great. But it was not designed to answer the needs
> of polyfilling other resource-loading features, and in particular it is not
> designed to operate on the initial page load.
>
> With ~20% of page views coming without the site's resources in the cache[1],
> we cannot rely on NavigationController to protect users from XSS, nor to
> serve them the right images.
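
For context, the shape of a controller is roughly the following - the
event and method names here are made up for illustration, not the
actual draft API:

  // Illustration only.  A controller script, once installed, sits
  // between the page and the network, so a polyfill could rewrite
  // image requests or refuse disallowed ones (a CSP-ish policy).
  this.addEventListener('request', function (e) {
    var url = e.url.replace(/\.jpg$/, '@2x.jpg');   // hypothetical rewrite
    e.respondWith(loadFromNetwork(url));            // hypothetical helper
  });
  // The catch Yoav points out: none of this runs on the very first,
  // uncontrolled page load.
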
>
>
> # What can we do about that?
>
> I'm not claiming to have *the* solution to these issues, but I have an idea:
> We need a way to "install" controllers/polyfills.
>
> IMO, if we had a way to install controllers/polyfills/frameworks once and
> reuse them across sites, that'd be a good start to solving most of the above
> problems.
> It'd give us the ability to share JS code between sites, and the performance
> cost of blocking JS at the top of the page would hit the user only once.
> It might also enable NavigationController to support a controller on the
> first page load (even though I may be ignoring other issues this could
> raise).
>
> I know what you're saying: "That's what the browser's cache is for". I don't
> want to be blunt, but you're wrong :P
> We've seen this approach fail with jQuery[2][3]: the myriad of versions Web
> devs use, multiplied by the available CDN options, results in so much
> fragmentation that any JS code sharing across sites is coincidental.
>
> # How?
>
> A natural model is the Debian/Ubuntu repository model, only with the browser
> as the OS, and JS code as the packages.
>
> Imagine a world where browsers come with all the versions of popular Web
> frameworks installed, and when a new version comes out, it is automatically
> fetched & installed. An evergreen JS framework landscape, if you will.
> Web pages will have script tags that include a URL, but also include a
> "package name", which the browser can use if it has that package installed.
> If it doesn't, it can install it on the spot, or later on (based on
> performance considerations).
> Installed JS code will not come from the sites themselves, but from the
> repositories, which are deemed secure.
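
So concretely, a page would carry something like this? (the attribute
name is invented here, just to picture it):

  <script src="https://cdn.example.com/jquery-2.0.2.min.js"
          package="jquery@2.0.2"></script>
  <!-- If the browser has "jquery@2.0.2" installed from a trusted
       repository it uses the local copy; otherwise it falls back to
       the URL and can install the package for next time. -->
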
>
> Who will control these repositories? In the Debian model, the user decides
> which repositories to use, which in practice means the OS decides for them.
> In the Web's case, that'd be the browser. If you trust your browser vendor
> with auto-updates, you should be able to trust it with JS code as well.
>
> # Is it realistic?
>
> I don't know. You tell me.
>
>
> I'd appreciate any thoughts on these issues and this or any other solution.
>
> Thanks,
> Yoav
>
> [1] http://www.stevesouders.com/blog/2012/03/22/cache-them-if-you-can/
> [2] http://statichtml.com/2011/google-ajax-libraries-caching.html
> [3] http://www.stevesouders.com/blog/2013/03/18/http-archive-jquery/
>
>
>

Mostly random, rambling comments below... sorry, it's been a long day.

I've had some similar thoughts in the past - but... It might be easier
to tweak systems we already have to improve the caching problem than
to convince everyone to adopt a new sort of 'trusted cache' of common
libs.  It also seems like it would game the system toward libs that
already have uptake, and you'd need a fallback mechanism anyway for
where the packages don't exist (older browsers) - which starts to look
a lot like a cache...

We have already had discussions about setting up a repository for
prollyfills where we could list them, and where they could grow and
get peer review, etc.  Perhaps if we do that right we could convince
folks to use it as a neutral, trusted source with special cache
treatment for things that are past a certain maturity point... like,
just never evicting them.

This could improve things, but - that said... I don't know that, for
example, loading jQuery has ever been that huge a problem in my
experience.  It's crazily widely used, and most parts of the Web stack
also have to load, parse, etc., and are treated in much the same way
before you can move on.  When I look at a great number of sites, the
CSS takes longer to download than jQuery and has way, way less chance
of being in the cache across sites (though maybe there is something
you could do there for Web Components too).

I think the bigger difference, when you look at it, is where you can
plug into the lifecycle... CSS starts applying as the tree is
constructed while, generally speaking, jQuery needs to wait until the
initial DOM is constructed.  On a lot of web sites - even without a
cache - you could have downloaded a few images for the body by that
point already.  Web Components improves that story for custom
elements, I think, because it does a lot of the stuff we used to do
with script on the fly, and it encapsulates things in a smaller scope
with better lifecycle events that let you run DOM-dependent code much
earlier in the process.  I'd love to get the same kinds of hooks into
CSS - in fact, I had a proposal and I will be working on exactly those
sorts of things.
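
To put that lifecycle point in code terms - createdCallback below
stands in for the custom element hook in the current drafts, and
upgrade() is just a placeholder for whatever per-element work you need
to do:

  // Common pattern today: nothing runs until the whole initial DOM
  // has been built.
  document.addEventListener('DOMContentLoaded', function () {
    var els = document.querySelectorAll('x-gallery');
    for (var i = 0; i < els.length; i++) upgrade(els[i]);
  });

  // With a per-element lifecycle hook, each element can do its work as
  // soon as the parser creates it - no waiting on the rest of the tree.
  var proto = Object.create(HTMLElement.prototype);
  proto.createdCallback = function () { upgrade(this); };
  document.registerElement('x-gallery', { prototype: proto });

  function upgrade(el) { /* placeholder for per-element setup */ }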

We probably also have to be careful about where we draw comparisons
between things and how we measure 'acceptable' performance.  For
example, native matching is way faster than jQuery's (at least for
most queries) when you measure across weirdly arbitrary benchmarks.
In the real world, I don't think I've ever seen a site where jQuery's
selector performance was really the bottleneck.  Not saying there
aren't any - but I don't think it is as common as some people think.
Also, native matching is only *that* fast because it has to be - if
the use cases for native selector matching were 'a few queries at a
time after the DOM is ready', I doubt the vendors would put so much
pressure on making it so fast... I could be wrong.



--
Brian Kardell :: @briankardell :: hitchjs.com

Received on Thursday, 27 June 2013 01:43:32 UTC