- From: Robin Berjon <robin@w3.org>
- Date: Tue, 23 Jun 2015 10:48:55 +0200
- To: "Liam R. E. Quin" <liam@w3.org>, Liza Daly <liza@safaribooksonline.com>
- CC: Dave Cramer <dauwhe@gmail.com>, W3C Digital Publishing IG <public-digipub-ig@w3.org>
On 23/06/2015 00:09, Liam R. E. Quin wrote:
> Invalidating the entire cache for a book might be a pain, though, if
> the book is, say, a gigabyte in total size.

Invalidating the manifest for a book does not necessarily invalidate your whole cache. Resources that were cached as part of the book can be re-checked individually without necessarily being redownloaded.

The problem you'll have with a gigabyte book is that it won't ever get cached anyway. You'll have hit hard-to-predict quota limits at least one order of magnitude before that, possibly two or three.

Some of the most vocal complaints I've heard about AppCache (during the "Fixing AppCache" session) came from Pearson, more specifically FT Labs, who handle the FT and the Economist. You can read Andrew Betts' summary of those meetings: http://labs.ft.com/2012/08/fixing-app-cache/.

One of the scary things with AppCache is that you can get a room full of the brightest in Web application and browser engine development, and no one in the room can fit the full mechanics of how AppCache works in their head. This doesn't mean you can't make it work (some people do), but you can imagine that at this degree of complexity things could perhaps get a bit brittle.

In my experience AppCache is like magic in Buffy. You can use it, but it will exact an unpredictable cost. It will deliver everything you want in the worst way possible.

--
Robin Berjon - http://berjon.com/ - @robinberjon
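[A minimal sketch of the per-resource re-check described above, assuming a Service Worker using the Cache API and ETag-based conditional requests; the cache name and function name are hypothetical, and the server is assumed to honour If-None-Match with 304 Not Modified.]

```ts
// Re-check one cached book resource without redownloading it.
// 'book-v1' and revalidate() are illustrative names, not part of any spec.
async function revalidate(request: Request): Promise<Response> {
  const cache = await caches.open('book-v1');
  const cached = await cache.match(request);

  if (!cached) {
    // Not cached yet: fetch it once and keep a copy for next time.
    const fresh = await fetch(request);
    await cache.put(request, fresh.clone());
    return fresh;
  }

  // Re-check only this resource: ask the server whether it changed.
  const etag = cached.headers.get('ETag');
  const conditional = etag
    ? new Request(request, { headers: { 'If-None-Match': etag } })
    : request;
  const response = await fetch(conditional);

  if (response.status === 304) {
    return cached; // Unchanged: serve the cached bytes, nothing redownloaded.
  }
  await cache.put(request, response.clone()); // Changed: update just this resource.
  return response;
}
```

An updated manifest can trigger this check per resource, so only the entries that actually changed are fetched again rather than the whole book.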
Received on Tuesday, 23 June 2015 08:49:08 UTC