Re: compression of HTML would save a lot of money

I'm a little late to this discussion, but I was actually going to make  
the same suggestion before I saw this thread.

> From what I know of W3C (which is a reasonable amount), I am
> certain this will consume valuable time, and do not expect it to save
> any money ...

Bandwidth is fairly cheap nowadays. Server load might decrease thanks  
to reduced connection durations and fewer retransmissions, but it  
might also increase due to the burden of compression (though use of a  
good reverse-proxy/web accelerator cache like Squid or Varnish can  
largely solve this problem, and reduce load on the main servers at the  
same time).
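
For what it's worth, if adding a separate proxy is too much trouble, Apache's own cache modules can take much of the same load off the backend. A minimal sketch, assuming Apache 2.2 with mod_cache and mod_disk_cache loaded (the cache path is illustrative):

# cache responses on disk so the backend does not have to
# regenerate them on every request
CacheEnable disk /
CacheRoot /var/cache/apache2
# fall back to a one-hour lifetime when a response carries no explicit expiry
CacheDefaultExpire 3600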

However, there is also the time of site readers to consider. If it  
takes X seconds for a web professional to download a page, that is X  
seconds they could spend more productively. Cutting that in half -  
along with reducing the number of packets on the network by the same  
amount - seems like a good deal. Perhaps this is worth mentioning if  
the issue of budgets arises...

Several W3C specifications are large, and not easy to download on a  
mobile or other low-bandwidth connection. Consider, for example, the  
draft HTML 5 spec, which is 4 MB uncompressed, but a svelte 680 KB  
compressed:
http://dev.w3.org/html5/spec/Overview.html
Admittedly that spec has a multi-page option, but it's not the one  
that comes up when you search for it.
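
To put the saving in perspective: on, say, a 384 kbit/s mobile  
connection, 4 MB is roughly a minute and a half of downloading, while  
680 KB is closer to fifteen seconds.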

> I think if it is reasonably easy it is worth doing as part of  
> showing how
> to make the web better for the world ...

Honestly, W3C would merely be catching up to standards already in  
widespread use. It has been over ten years since HTTP/1.1 (RFC 2068,  
back in 1997) specified gzip as a standard content-coding. All of the  
top ten websites use it, and have for some time. As for user-agent  
support, Lynx implemented gzip in 1997; multiplatform graphical  
browsers have supported it well for at least the last five years. And  
bear in mind that gzipped content is only served to those agents that  
request it.
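
The mechanism is plain HTTP content negotiation - roughly, with  
headers trimmed for brevity:

GET /html5/spec/Overview.html HTTP/1.1
Host: dev.w3.org
Accept-Encoding: gzip

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip
Vary: Accept-Encoding

An agent that doesn't send Accept-Encoding simply gets the  
uncompressed version, as before.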

Both Google (http://code.google.com/speed/articles/) and Yahoo  
(http://developer.yahoo.com/performance/rules.html#gzip) have been  
pushing gzip as a standard means of improving the performance of the  
web. If W3C isn't going to lead, it should at least follow.

Implementing gzip is easy - in its simplest form, it's a matter of  
loading/compiling in mod_deflate and adding one line in a config file,  
like this:
AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/javascript

Gzip can be enabled selectively for certain directories, which may be  
useful if there are critical services that cannot be safely  
compressed. However, I suspect there will be few (if any) situations  
in which this is necessary.
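
For example (the paths here are purely hypothetical), compression can  
be switched off for one location while staying on everywhere else, or  
enabled only where it is wanted:

# mod_deflate honours the no-gzip environment variable,
# so this service stays uncompressed
<Location /critical-service/>
    SetEnv no-gzip 1
</Location>

# or, going the other way, compress only one directory
<Directory /var/www/specs>
    SetOutputFilter DEFLATE
</Directory>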

-- 
Laurence "GreenReaper" Parry
http://greenreaper.co.uk/ - http://wikifur.com/
"Eternity lies ahead of us, and behind. Have you drunk your fill?"
