Re: Multiple Content-Location headers

[Oops, sent this out before filling in the citation to the paper.]

    Sorry if I am missing something, but is duplicate suppression really a
    big issue? And for what reason?
    
    Last time I checked (which was perhaps in summer '96, looking at
    cache performance) on our proxy, the number of duplicate objects
    was very small -- at least by size; I think it was maybe between
    1 and 2% of the total cache size.

A recent paper showed that 18% of the references in a trace were
for duplicated content (sorry, I don't have figures based on # of
bytes).  See
	http://www.usenix.org/publications/library/proceedings/usits97/douglis_rate.html

Also, the DRP people are thinking in terms of doing software
distribution via HTTP.  A lot of programs are composed of
a small core plus a lot of library modules; they would like to
avoid retransmitting the same library over the network more than
once, while still ensuring that different versions of a library
are properly managed.  (That is, the duplicate
suppression mechanism should not substitute one version of a library
for another, since this substitution might break the program that
uses the library.)   If this use of HTTP becomes popular, the number
of bytes of duplicated content could increase dramatically.
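
To make the versioning point concrete, here is a rough sketch of
duplicate suppression keyed on a content digest (Python; the
DigestCache class, its fetch() method, and the choice of MD5 are all
just my illustration, not anything specified by DRP or HTTP):

import hashlib

class DigestCache:
    """Toy digest-indexed cache: identical payloads are fetched and
    stored only once, while two different versions of a library have
    different digests and so can never be substituted for each other."""

    def __init__(self):
        self._by_digest = {}          # digest -> payload bytes

    @staticmethod
    def digest(payload):
        # MD5 here is only an example of a content digest.
        return hashlib.md5(payload).hexdigest()

    def fetch(self, wanted_digest, download):
        # Hit the network only if we have never seen these exact bytes.
        if wanted_digest not in self._by_digest:
            payload = download()
            if self.digest(payload) != wanted_digest:
                raise ValueError("payload does not match advertised digest")
            self._by_digest[wanted_digest] = payload
        return self._by_digest[wanted_digest]

A package manifest would then list each module's name, version, and
digest; the client walks the manifest and calls fetch() for each entry,
so a library shared by many packages crosses the network only once,
while a new release (new bytes, hence a new digest) is always
downloaded rather than silently replaced by the old copy.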

This is still in the early stages of discussion, so we probably
shouldn't use up more bandwidth on the HTTP-WG mailing list with it
at this point.

-Jeff

Received on Monday, 12 January 1998 13:55:17 UTC