Re: Extensible, performant Web

> > When we're talking about defining new HTML elements in JS, we need to
> > load these scripts before these elements can be usable.
> > This will have performance implications, and we need to do our best to
> > mitigate them upfront.
> >
>
> This is taking place in Web Components and HTML Import - you should
> read through and share your thoughts on where that is at.
>
>
I can see how HTML Imports can help here - the JS essential to a
particular element will probably be loaded as part of its import and will
block only that element, rather than the entire page.
I'm still not sure about the blocking relationships between the
parent HTML, the element's HTML, and the element's JS. Who is blocking
whom, and why?
I guess I'll have to read the spec with a close eye and figure this out.


>
> > ## Resource loading
> >
> > Another high-profile example of the polyfill approach is picturefill,
> > polyfilling the picture element. But when people say "There was a
> > responsive images problem, and then Web devs solved it with picturefill",
> > they ignore the performance implications incurred by using JS to load
> > resources.
>
> > Currently polyfills suck at resource loading. When we're using polyfills
> > to download resources, we're hindering the preloader, and our resources
> > start loading later than they would if they were declared in markup. That
> > is true for HTTP/1.1, and significantly more so for SPDY & HTTP/2.0.
> >
> > ## NavigationController & First page load
> >
> > When discussing polyfills of currently required features that involve
> > resource loading (e.g. CSP, responsive images), NavigationController
> > comes up as a candidate low-level API for that purpose.
> >
> > The NavigationController was designed as a low-level API for offline Web
> > apps, and as such it is great. But it was not designed to answer the
> > needs of polyfilling other resource-loading features, and in particular
> > it is not designed to operate on the initial page load.
> >
> > With ~20% of page views coming without the site's resources in the
> > cache[1], we cannot rely on NavigationController to protect users from
> > XSS, nor to display the right images to them.
> >
> >
> > # What can we do about that?
> >
> > I'm not claiming to have *the* solution to these issues, but I have an
> > idea: we need a way to "install" controllers/polyfills.
> >
> > IMO, if we had a way to install controllers/polyfills/frameworks once
> > and reuse them across sites, that'd be a good start to solving most of
> > the above problems.
> > It'd give us the ability to share JS code between sites, and the
> > performance implications of blocking JS at the page's top would bother
> > the user only once.
> > It may also enable NavigationController to support a controller on the
> > first page load (even though I might be ignoring other issues this may
> > provoke).
> >
> > I know what you're saying: "That's what the browser's cache is for". I
> > don't want to be blunt, but you're wrong :P
> > We've seen this approach fail with jQuery[2][3], because the myriad of
> > versions Web devs use and the multiple CDN options resulted in
> > fragmentation, making any JS code sharing across sites coincidental.
> >
> > # How?
> >
> > A natural model is the Debian/Ubuntu repository model, only with the
> > browser as the OS and JS code as the packages.
> >
> > Imagine a world where browsers come with all the versions of popular Web
> > frameworks installed, and when a new version comes out, it is
> > automatically fetched & installed. An evergreen JS framework landscape,
> > if you will.
> > Web pages will have script tags that include a URL, but also include a
> > "package name", which the browser can use if it has that package
> > installed. If it doesn't, it can install it on the spot, or later on
> > (based on performance considerations).
> > Installed JS code will not come from the sites themselves, but from the
> > repositories, which are deemed secure.
> >
> > Who will control these repositories? In the Debian model, the user
> > decides which repositories to use, which basically means the OS decides
> > for them. In the Web's case, that'd be the browser. If you trust your
> > browser vendor with auto-updates, you should be able to trust it with JS
> > code as well.
> >
> > # Is it realistic?
> >
> > I don't know. You tell me.
> >
> >
> > I'd appreciate any thoughts on these issues and this or any other
> > solution.
>
>
> Mostly random, rambling comments below... sorry, it's been a long day.
>
> I've had some similar thoughts in the past - but... It might be easier
> to tweak systems we already have to improve the caching problem than
> to convince everyone to adopt a new sort of 'trusted cache' of common
> libs.  It does seem to maybe game the system toward libs that already
> have uptake and it seems that you'd need a fallback mechanism anyway
> for where they don't exist (older browsers) - which starts to look a
> lot like cache...
>
>
I agree that it creates a bias towards established libs, since it gives
them a performance advantage. This is a problem that needs addressing.
I also agree that we will need to provide a URL along with the package name
for each library. It can then be used both for older browsers and as a
fallback in case the repository is down. Not an issue IMO.
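As a rough sketch of what that resolution could look like (the attribute
name, the naming scheme, and the cache location are all invented for
illustration; nothing here is specced):

```javascript
// Hypothetical resolver for the proposed "package name + URL" script tag,
// e.g. <script src="https://cdn.example.com/jquery-1.10.2.js"
//              package="jquery@1.10.2"></script>.
// All names and shapes here are assumptions, not any actual browser API.

// Packages the browser has already installed from its trusted repository.
const installedPackages = new Map([
  ["jquery@1.10.2", "browser-cache://packages/jquery@1.10.2.js"],
]);

function resolveScript(packageName, fallbackUrl) {
  // Prefer the locally installed, repository-verified copy.
  if (packageName && installedPackages.has(packageName)) {
    return { source: "repository", location: installedPackages.get(packageName) };
  }
  // Otherwise fall back to the author-supplied URL - which also covers
  // older browsers that ignore the package attribute entirely, and the
  // case where the repository is down.
  return { source: "network", location: fallbackUrl };
}
```

The point is that the fallback URL makes the scheme degrade gracefully:
unknown packages and old browsers just load from the network as today.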

It does look like cache, but the unbundling of the resource from the URL is
significant. IMO, this is not something we can tweak current caches to do.
I also think that if we let any Web page install its JS in the browser
keyed by some cryptographic hash (which is an option I contemplated), the
day that hash function breaks (and it eventually will), all hell breaks
loose.
This is the advantage of a central repository: code can be verified, and
Web sites can't install their own arbitrary code.

> We have already had discussions on setting up a repository for
> prollyfills where we could list them, they could grow and get peer
> review, etc.  Perhaps if we do that right we could convince folks to
> use that as a neutral, trusted source with special cache treatment for
> things that are past a certain point... like, just avoid expelling it.
>

Past experience shows that this approach failed with jQuery. Maybe we can
do things differently this time, and convince devs to use a single URL for
each prollyfill, but I'm not sure we can.
OTOH, if we succeed, we will have created a single point of failure for the
Web :(


>
> This could improve things, but - that said... I don't know that, for
> example, loading jQuery has ever been that huge a problem in my
> experience.   It's crazy widely used and most parts of the Web stack
> also have to load, parse, etc. and are treated in much the same way
> before you can move on.   When I look at a great number of sites - the
> CSS takes longer to download than jQuery and has way, way less chance
> of being in cache across sites, etc.  (though maybe there is something
> you could do there for Web Components too).
>

I don't have data on the specific performance impact of re-downloading
jQuery over and over again (anecdotally, I've seen a significant impact on
a number of occasions in the past).
Let's just say that it's certainly not as fast as it could be. If we want
to add more JS to the Web, and rely heavily on it for longer periods of
time (since we say that standardization can wait until the JS polyfill
battle settles), we need to find a way to optimize that performance
bottleneck.
Regarding CSS, the same problem exists with Bootstrap, so maybe the package
idea can be expanded to any resources shared between sites (CSS,
Components, fonts).
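A hypothetical repository manifest extended beyond JS might look like this
(names, versions, URLs, and fields are all invented for illustration):

```javascript
// Sketch of a repository manifest covering resources shared across sites,
// not just JS - everything here is a made-up example, not a real format.
const repositoryManifest = {
  "jquery@1.10.2":   { type: "js",   url: "https://repo.example.org/jquery-1.10.2.min.js" },
  "bootstrap@3.0.0": { type: "css",  url: "https://repo.example.org/bootstrap-3.0.0.min.css" },
  "open-sans@1.0":   { type: "font", url: "https://repo.example.org/open-sans.woff" },
};

// A browser could, e.g., prefetch every popular package of a given type.
function packagesOfType(manifest, type) {
  return Object.keys(manifest).filter((name) => manifest[name].type === type);
}
```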

>
> I think a bigger difference when you look at it is where you can plug
> into the lifecycle... One big difference between them though is that
> CSS starts applying as the tree is constructed while, generally
> speaking, jQuery needs to wait until the initial DOM is constructed.
> In a lot of web sites - even without cache, you could have downloaded
> a few images for the body at that point already.  Web
> Components improves that story for custom elements I think because it
> is doing a lot of the stuff we used to do with script on the fly, and
> it is encapsulating things in a smaller scope with better lifecycle
> events that allow you to run things that require DOM to run much
> earlier in the process.  I'd love to get the same kinds of hooks in
> CSS - in fact - I had a proposal and I will be working on exactly
> those sorts of things.
>
> We also have to be careful about where we make the comparisons between
> things probably and how we measure 'acceptable' performance.  For
> example, native is way faster at matching than jQuery (at least for
> most queries) when you measure across weirdly arbitrary benchmarks.
> In the real world, I think I've never seen anyone's site where
> selector performance of jQuery was really the bottleneck.  Not saying
> there aren't any - but I don't think that is as common as some people
> think it is.  Also, it's only *that* fast natively because it has to
> be - if the use cases for native selector matching were 'a few queries
> at a time after the dom is ready' - I doubt the vendors would put so
> much pressure on making them so fast... I could be wrong.
>
We need to differentiate between areas where native code will *run* faster
(the querySelector example) and areas where native will result in overall
better performance (mostly because of better network performance - e.g. the
preloader working properly, better resource download prioritization, etc.
Blocking page parsing/rendering can also play a significant role).

For the first case, the difference is usually not huge, and it gets smaller
with general JS engine optimizations.
For the latter, at least in some cases, native is inherently significantly
faster. I'm not saying that all new features fall into that category, but
those that do should move to native faster, unless we can come up with
primitives that will enable us to get rid of these performance bottlenecks.

But we're deviating from my primary concerns into my secondary ones :)

Received on Friday, 28 June 2013 15:16:20 UTC