Re: dont-revalidate Cache-Control header

> On Jul 14, 2015, at 7:53 PM, Ilya Grigorik <> wrote:
> On Tue, Jul 14, 2015 at 5:19 PM, Roy T. Fielding <> wrote:
> On Jul 14, 2015, at 3:33 PM, Ilya Grigorik <> wrote:
>> On Tue, Jul 14, 2015 at 3:03 AM, Ben Maurer <> wrote:
>> That said, this doesn't feel like a great thing for us to promote as a web performance best practice. "If you use long cache lifetimes for your static content, the dont-revalidate cache control header will reduce the cost of client reloads" seems like a piece of advice folks might take, as would "Use the <meta> tag 'dont-reload-non-expired-resources' to avoid browsers revalidating your content when the user presses reload". On the other hand "you should find every image, script, stylesheet, etc and set the fetch option on each to say force-cached" feels more tedious and unlikely to be used.
>> To this point, the HTTP mechanism is something that FEO / optimization proxies can do on your behalf - e.g. rewrite and/or bundle resources, add version fingerprint, append the HTTP header we're discussing here. By comparison, rewriting markup (HTML, CSS, JS) is significantly harder and very expensive. Which is to say.. +1 for HTTP directive over markup.
> No, it would be managed in the CMS along with all of the other decisions that led to a static version. Sane folks don't manage their content in an optimization proxy.
> Most every CDN has an FEO product that performs resource optimization (minification, obfuscation, bundling, fingerprinting + cache extension, and more). PageSpeed modules [1] alone, which I'm most familiar with myself, power many hundreds of thousands of sites. Which is to say, "sane folks" do deploy such tools and with great success.
> [1] <>
Umm, I don't consider mod_pagespeed to be an optimization proxy, but I guess it can be configured that way
in combination with mod_proxy. Managing configuration of CDNs is a common feature for a CMS.

In any case, the place where an output filter like pagespeed should add the static indicator is the same
place it is already modifying the URL reference by adding a content hash: the HTML element attributes.
The filter is making a decision to force the reference to a static representation, and we want that decision
to have an impact on the page rendering algorithm of a browser, so the static association belongs with the
reference itself. That way it is retained regardless of how many other protocols might be used to deliver,
cache, or otherwise distribute that modified content.
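As a sketch of what that might look like in markup (the `static` attribute below is purely hypothetical, invented here for illustration; no such attribute is defined anywhere), the output filter would rewrite a fingerprinted reference along these lines:

```
<!-- Hypothetical markup: the content hash is embedded in the URL by the
     output filter, and an invented "static" attribute marks the reference
     as having one immutable representation for all time. -->
<script src="/js/app.3f9ab2c1.js" static></script>
<link rel="stylesheet" href="/css/site.8d04e7aa.css" static>
```

Because the marker travels with the reference in the HTML, it survives however the modified content is later delivered, cached, or redistributed.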

OTOH, if we want to have metadata that indicates a given resource has one and only one representation
for all time, regardless of the use context, that metadata would not belong in Cache-Control either (because
it isn't about controlling a cache -- it is asserting some knowledge about the resource that can be used
by any recipient, regardless of cache behavior). That could be defined as a new header field or as a
relation for Link (where the link could point to the original resource that isn't static).
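A sketch of the Link-relation variant (the relation name "original" and all header values below are hypothetical, chosen only to show the shape such an assertion might take):

```
HTTP/1.1 200 OK
Cache-Control: max-age=31536000
Link: </js/app.js>; rel="original"
Content-Type: application/javascript
```

Here the fingerprinted static response points back, via a made-up relation, to the non-static resource it was derived from. Being an assertion about the resource rather than an instruction to caches, it stays out of Cache-Control and is usable by any recipient.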

We would then be left with the question of when the page rendering process has sufficient
confidence that it has the right representation, such that it can avoid making a conditional request for a
given static resource.  I think having either some sort of Content-Hash (or similar) [also orthogonal
to cache-control], or properly marking incomplete responses as suggested by HTTP/1, would be
sufficient for browsers to make their own decision to optimize away that request.
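To illustrate the shape of that check (the `Content-Hash` field name, its syntax, and the digest value below are all hypothetical; nothing here is a defined header field), a browser holding a cached response such as:

```
HTTP/1.1 200 OK
Cache-Control: max-age=31536000
Content-Hash: sha-256=2c26b46b68ffc68f
Content-Length: 48213
```

could recompute the digest over the stored body and, if it matches and the stored length equals Content-Length (i.e. the response is demonstrably complete rather than truncated), decide on its own to skip the conditional revalidation request entirely.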


Received on Thursday, 16 July 2015 22:46:47 UTC