- From: Chris Holland <frenchy@gmail.com>
- Date: Mon, 15 Jun 2009 12:34:34 -0700
As an alternative, common libraries could get shipped as browser plugins, allowing developers to leverage "local" URIs such as "chrome://" in XUL/Mozilla/Firefox apps. This would only work effectively if:

- all vendors define the same local URI prefix. I do like "chrome://"; the Mozilla folks were always light-years ahead in all forms of cross-platform app development with XUL.
- all vendors extend their existing plugin architecture to accommodate this URI and referencing from network-delivered pages.
- some form of discovery exists, with the ability to fall back to network transport: "use the chrome URI if it exists, use the http URI if not" (a rough sketch of this fallback follows the quoted reply below).

Library vendors would then ship their releases as browser plugins, using existing discovery mechanisms as well as software update mechanisms.

-chris

On Jun 15, 2009, at 11:55, Oliver Hunt <oliver at apple.com> wrote:

>> Pros:
>> - Pre-Compiled: By bundling known JS libraries with the browser, the browser could store a more efficient representation of the file, for instance pre-compiled into bytecode or something else browser-specific.
>
> I think something needs to be clarified wrt compile times and the like. In the WebKit project we do a large amount of performance analysis, and except in the most trivial of cases compile time just doesn't show up as being remotely significant in any profiles. Additionally, the way JS works, certain forms of static analysis result in behaviour that cannot reasonably be cached. Finally, the optimised object lookup and function call behaviour employed by JavaScriptCore, V8 and (I *think*) TraceMonkey is not amenable to caching, even within a single browser session, so for modern engines I do not believe caching bytecode or native code is really reasonable -- I suspect the logic required to make this safe would not be significantly cheaper than just compiling anyway.
>
>> - Fewer HTTP Requests / Cache Checks: If a library is in the repository, no request is needed and cache checks don't need to be performed. Also, for the 100 sites you visit that all send you the equivalent jquery.js, you would now send 0 requests. I think this would be enticing to mobile browsers, which would benefit from this space vs. time tradeoff.
>
> I believe HTTP can specify how long you should wait before validating the cached copy of a resource, so I'm not sure this is a real win, but I'm not a networking person so am not entirely sure of this :D
>
>> - Standardizing an Identifier for Libraries: Providing a common identifier for libraries would be open for discussion. The best idea I've had would be to provide the SHA1 hash of the desired release of a JavaScript library. This would ensure a common identifier for the same source file across browsers that support the feature. This would be useful for developers as well: a debug tool can indicate to a developer that the script they are using is available in the browser repository under a certain identifier.
>
> This isn't a pro -- it's additional work for the standards body.
>
>> Cons:
>>
>> - May Not Grow Fast Enough: If JS libraries change too quickly, the repository won't get used enough.
>> - May Not Scale: Are there too many JS libraries, versions, etc., making this unrealistic? Would storage become too large?
> - Adds significant spec complexity.
> - Adds developer complexity: imagine a developer modifies their server's copy of a given script but forgets to update the references to the script; now they get inconsistent behaviour between browsers that support this feature and browsers that don't.
>
> --Oliver
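A minimal sketch of the "local URI if present, HTTP otherwise" fallback described above. The chrome:// path, the CDN URL, and the loadLibrary helper are all hypothetical; no browser exposes bundled libraries to web pages this way today, so this only illustrates the discovery behaviour being proposed:

    // Hypothetical fallback loader: try a locally bundled copy first,
    // then fall back to the network-delivered copy if the local URI fails.
    function loadLibrary(localUri, httpUri, onLoad) {
      var head = document.getElementsByTagName("head")[0];
      var script = document.createElement("script");
      script.onload = onLoad;
      script.onerror = function () {
        // Local copy missing or unsupported: load over HTTP as usual.
        var fallback = document.createElement("script");
        fallback.src = httpUri;
        fallback.onload = onLoad;
        head.appendChild(fallback);
      };
      script.src = localUri;
      head.appendChild(script);
    }

    loadLibrary(
      "chrome://libs/jquery-1.3.2.js",          // hypothetical bundled copy
      "http://example.com/js/jquery-1.3.2.js",  // network fallback
      function () { /* library is ready to use */ }
    );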
Received on Monday, 15 June 2009 12:34:34 UTC