
Re: Require UAs and intermediary caches to assume Vary: User-Agent

From: Amos Jeffries <squid3@treenet.co.nz>
Date: Thu, 18 Oct 2012 10:03:05 +1300
To: <ietf-http-wg@w3.org>
Message-ID: <5f99215ce00479775f9461103705fa5c@treenet.co.nz>
On 18.10.2012 00:30, Phillip Hallam-Baker wrote:
> How much of the Web actually passes through client side proxies these 
> days?
> Of those, how many have caches?
>

Unknown, but still a large enough portion of the population to matter.

There are still countries like .au and .nz which have very high 
international transit costs or low bandwidth, and which run large proxy 
farms in the major telcos. The historical reasons for caching have not 
disappeared; if anything, the bandwidth increases of the last few 
decades have just spoilt the user base and added to the need. 
Educational institutions and corporate sites also still make heavy use 
of proxies as gateway control servers.

Ubuntu and Debian provide software install counters, and these tell me 
there are over half a million Squid, Varnish, HAProxy, Polipo and 
similar instances still installed and running around the world - that's 
anything up to 500 M users right there. Other more popular distros, 
other proxies, unreported installs - who knows how big the total 
actually is.


> There was a good reason for caching proxies in 1993. I don't see much
> justification or utility in fiddling with that part of the spec now. 
> The
> content that is relevant for caching these days is huge chunks of 
> video,
> audio and images, the part that is generated is text.

With most user-generated web traffic being video and other media, the 
case for caching those still stands, and doing so is gaining popularity 
amongst sysadmins.

AYJ
Received on Wednesday, 17 October 2012 21:03:32 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Wednesday, 17 October 2012 21:03:35 GMT