- From: <bugzilla@jessica.w3.org>
- Date: Wed, 21 Sep 2011 19:20:13 +0000
- To: public-html-bugzilla@w3.org
http://www.w3.org/Bugs/Public/show_bug.cgi?id=14194

--- Comment #8 from Boris Zbarsky <bzbarsky@mit.edu> 2011-09-21 19:20:11 UTC ---

> But that type of delay, especially on mobile (limited CPU, etc) basically can
> act to decapitate the whole purpose for preloading

Yes, that was exactly my point. And yet mobile is also where memory pressure is the biggest problem.

> Are you suggesting that this delay would still be synchronous upon attempt to
> execute the preloaded script?

It'd have to be, to give you sync-execute semantics, right?

> Even on large JS files like 300k+, the most I've ever seen the "processing"
> stage of a script take is a couple hundred ms.

300K is a medium-sized JS file, from what I've seen... Heck, jQuery is about 100K even minified, before compression. The difference between several hundred ms and 1 s is just a matter of running on a different device. But yes, the point was just that there would be a quite noticeable pause; 200 ms is well above the threshold of human perception.

I just tried decompressing gzipped minified jQuery on a fairly new tablet, and it took about 100 ms. On a slower device it would of course take longer, and there are plenty of JS files larger than minified jQuery out there.

> It seems like keeping preloaded scripts compressed in memory is missing

The point is that in low-memory situations we'd like to be able to fall back on discarding all that compiled code, static-analysis information, decompressed data, etc.

> Are you planning to compress such uncompressed scripts before keeping them in
> memory

In low-memory conditions, probably yes. We're already moving toward compressing scripts before putting them in our memory and disk caches, no matter how the server sent them.
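The 100 ms tablet measurement is easy to reproduce. Here is a minimal sketch of that experiment, assuming Node.js and a gzipped copy of minified jQuery on disk (the file path is hypothetical):

```ts
// Time how long it takes to inflate a gzipped, minified script,
// roughly the decompression cost discussed in the comment above.
import { readFileSync } from "node:fs";
import { gunzipSync } from "node:zlib";
import { performance } from "node:perf_hooks";

const compressed = readFileSync("jquery.min.js.gz"); // hypothetical path

const start = performance.now();
const source = gunzipSync(compressed);
const elapsed = performance.now() - start;

console.log(
  `inflated ${compressed.length} -> ${source.length} bytes in ${elapsed.toFixed(1)} ms`
);
```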
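To make the memory-pressure trade-off concrete, here is a sketch of one possible design (an illustration, not Gecko's actual cache code): the cache keeps the compressed bytes as the entry of record, so the decompressed source can be discarded under memory pressure and re-inflated synchronously when the script is finally executed.

```ts
import { gzipSync, gunzipSync } from "node:zlib";

// Sketch of a preload-cache entry. Only the compressed bytes are always
// retained; everything derivable from them is discardable.
class PreloadedScript {
  private compressed: Buffer;            // always retained
  private source: Buffer | null = null;  // derived, discardable

  constructor(rawSource: Buffer) {
    // Compress even scripts the server sent uncompressed, mirroring the
    // cache behavior described in the comment.
    this.compressed = gzipSync(rawSource);
  }

  // Synchronous on first use after a purge -- this is the pause being
  // debated in the thread.
  getSource(): Buffer {
    if (this.source === null) {
      this.source = gunzipSync(this.compressed);
    }
    return this.source;
  }

  // Called under memory pressure: drop everything re-derivable.
  purgeDecompressed(): void {
    this.source = null;
  }
}
```

In a real engine the purge would also drop compiled code and static-analysis data, which is why the re-execute pause can be much larger than the gunzip time alone.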
Received on Wednesday, 21 September 2011 19:20:14 UTC