the case for multiple entities per URL

- web browser capability will vary
- content providers know this
- content providers will want to optimize content to match
  browser capability
- currently this happens by user-agent negotiation
- user-agent negotiation may continue, but if we're lucky, some
  of this will happen by content-type and feature negotiation
  instead (a rough sketch follows this list)
- a single cache *will* be used simultaneously by multiple
  users with different capabilities
- many content-providers will be doing dynamic construction
  of pages based on perception of user agent capabilities
  without wanting the separate pages to have separate URLs
...
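
To make the negotiation point concrete, here is a rough sketch of a
server picking a representation from the Accept request header rather
than from the User-Agent string; the variant table, media types, and
helper name are invented for illustration:

  # Hypothetical server-side helper: choose a representation by
  # content-type negotiation instead of sniffing User-Agent.
  VARIANTS = [
      ("image/png", "logo.png"),
      ("image/gif", "logo.gif"),
  ]

  def select_variant(request_headers):
      accept = request_headers.get("Accept", "")
      for media_type, filename in VARIANTS:
          if media_type in accept:
              return media_type, filename
      # Fall back to the most widely supported representation.
      return VARIANTS[-1]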

some more steps here ... but ultimately leading to the conclusion
that:

THEREFORE the protocol must support caching of multiple entities for
the same URL: a proxy may return different fresh entities for the same
URL as long as the proxy determines that the request headers of a
subsequent request match the appropriate request headers of the
original request that evoked the original entity.
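
In code, the rule might read roughly as follows. This is only a
sketch: the "vary" list (which request headers select an entity) is
an assumption about how a response would name its selecting headers,
not a worked-out protocol element.

  class Cache:
      def __init__(self):
          self.entries = {}    # url -> list of (selecting, entity)

      def store(self, url, request_headers, vary, entity):
          # Remember the values of the selecting headers from the
          # request that evoked this entity.
          selecting = {h: request_headers.get(h) for h in vary}
          self.entries.setdefault(url, []).append((selecting, entity))

      def lookup(self, url, request_headers):
          # Return a cached entity only if the subsequent request
          # matches on every selecting header.
          for selecting, entity in self.entries.get(url, []):
              if all(request_headers.get(h) == v
                     for h, v in selecting.items()):
                  return entity
          return None          # miss: forward the request upstream

Two users behind the same proxy who send different Accept headers
would then populate, and later hit, different cached entities for the
same URL.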

I think the only difficulty comes when a resource has several (stale)
entities associated with it in a given cache and the cache receives a
new fresh entity: it isn't clear which of the old stale entities
should be discarded. Personally, I think this is a cache optimization
issue and not a protocol correctness issue; I can think of several
heuristics that a cache might employ to do a reasonable job in such a
case.
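
For example, one such heuristic (purely illustrative, reusing the
entry shape from the sketch above) is to discard only those stale
entities whose selecting headers match the request that produced the
fresh entity, on the theory that the new entity supersedes exactly
that variant:

  def replace_stale(entries, request_headers, vary, fresh_entity):
      # Keep stale variants whose selecting headers differ; drop the
      # one the fresh entity supersedes.
      selecting = {h: request_headers.get(h) for h in vary}
      kept = [(sel, ent) for sel, ent in entries if sel != selecting]
      kept.append((selecting, fresh_entity))
      return kept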

Larry
