- From: Boris Zbarsky <bzbarsky@MIT.EDU>
- Date: Mon, 15 Jul 2013 09:30:17 -0400
- To: whatwg@lists.whatwg.org
On 7/15/13 3:42 AM, Bruno Racineux wrote:
> Wouldn't browsers be able to store "pre-parsed/compiled" scripts in a
> separate "byte code" cache,

You mean like https://bugzilla.mozilla.org/show_bug.cgi?id=679942 ?

There's some discussion in there about whether this is a worthwhile
optimization with modern JS engines, which do lazy compilation and even
lazy parsing to a large extent. Measurement is needed of the various
pieces discussed in
<https://bugzilla.mozilla.org/show_bug.cgi?id=679942#c6> and
<https://bugzilla.mozilla.org/show_bug.cgi?id=679942#c7>.

In either case, note that there is already caching of this sort to some
extent in SpiderMonkey; see
https://bugzilla.mozilla.org/show_bug.cgi?id=883154, which is fixed.
Note that in the context of that bug a "script" is what you probably
think of as a "function body": that is the fundamental unit of
compilation in SpiderMonkey now that we do lazy compilation.

> i.e. Why do we have to keep re-parsing and re-evaluating the very same
> scripts, especially CDN libraries and social APIs largely shared among
> websites, over and over?

The re-evaluating is because evaluation has little things like
side effects. ;)

> the whole parse/evaluation time could be cut 'partially', yet very
> significantly

Some hard data would be useful here, as I said above.

-Boris
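To make the side-effects point concrete, here is a minimal sketch (the file name and function are hypothetical, not from the thread): a byte-code cache could avoid re-parsing and re-compiling a shared CDN library, but the library's top-level evaluation has observable effects that still have to run on every page that loads it.

```ts
// analytics.ts -- hypothetical shared CDN script, for illustration only.

let pageViews = 0; // state created as a side effect of evaluating the script

// Under lazy compilation, a function body like this is the unit that gets
// compiled, and only when the function is actually called.
function trackView(page: string): void {
  pageViews += 1;
  console.log(`view #${pageViews}: ${page}`);
}

// Top-level code: running it is an observable side effect, so a cached
// compiled form can save parse/compile time but cannot replace this
// evaluation step on each page load.
trackView("/index.html");
```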
Received on Monday, 15 July 2013 13:30:54 UTC