- From: Adrien W. de Croy <adrien@qbik.com>
- Date: Thu, 17 Jan 2013 00:12:05 +0000
- To: "HTTP Working Group" <ietf-http-wg@w3.org>
- Message-Id: <em7dfc0da1-1b9a-4534-be5f-8e7cb235179d@bombed>
Hi all,

p6 and RFC 2616, when discussing heuristic caching, place a SHOULD-level limit on heuristic freshness (10% of the time elapsed since Last-Modified), but only when there is a Last-Modified header. However, it has been noted that some caches will store and reuse responses that have no validators at all.

Obviously it only makes sense to cache something if it can be re-used, and without validators it can't be re-validated with the origin server, so the only way such a resource can be re-used is if the cache makes some assumption about freshness.

In previous versions, we had heuristics for minimum effective freshness based on content type. This caused all manner of problems, so we dropped it for our current version. However, we're finding relatively poor cacheability across the internet, and so the value of caching is proving to be limited.

Since we can't wait forever for web site operators to consider and roll out reasonable caching directives, I believe that in order to provide some real benefit from caching we need to take a much more aggressive stance. This will require at least heuristic caching, and I'm fairly certain also heuristic caching of responses that don't have any validators.

Are there any guidelines for this? Since we're a shared intermediary cache, we can't do things like cache for the duration of the browser session, so it's going to come down to a table of minimum freshness per content type (where there are no validators).

Does this issue deserve any discussion in the RFC / p6? It's very light on heuristic freshness - probably for a reason.

Regards

Adrien
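For concreteness, the two heuristics discussed above could be sketched as follows. The 10% rule mirrors the SHOULD-level limit in p6 / RFC 2616; the per-content-type minimums are purely illustrative placeholders, not recommended values, and the function name is hypothetical:

```python
from datetime import datetime, timedelta

# Hypothetical minimum-freshness table per content type, consulted only
# when a response carries no validators (no Last-Modified, no ETag).
# The values are illustrative assumptions, not recommendations.
MIN_FRESHNESS_BY_TYPE = {
    "image/png": timedelta(hours=1),
    "text/css": timedelta(minutes=30),
    "text/html": timedelta(0),  # i.e. don't heuristically cache HTML
}

def heuristic_freshness(date, last_modified, content_type):
    """Return a heuristic freshness lifetime for a response that has no
    explicit expiration. If Last-Modified is present, apply the
    SHOULD-level limit: 10% of the interval since Last-Modified.
    Otherwise fall back to the assumed per-content-type minimum."""
    if last_modified is not None:
        # timedelta supports multiplication by a float in Python 3
        return (date - last_modified) * 0.1
    return MIN_FRESHNESS_BY_TYPE.get(content_type, timedelta(0))
```

For example, a response whose Last-Modified is 10 days before its Date would be considered fresh for 1 day; a validator-less image/png response would fall back to the table's 1-hour minimum.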
Received on Thursday, 17 January 2013 00:12:56 UTC