
RE: compression of HTML would save a lot of money

From: FinanzNachrichten.de, Markus Meister <markus.meister@finanznachrichten.de>
Date: Wed, 5 Aug 2009 21:45:08 +0200
To: <w3c-ac-forum@w3.org>, <site-comments@w3.org>
Message-ID: <046f01ca1605$3d2a4b70$b77ee250$@meister@finanznachrichten.de>
Thank you for the good summary, Rotan.
I fully agree with that. For me, the key point was that I thought compression
would offer a possibility to save some money. Since we have a flat rate for
traffic anyway, and I expect that e.g. 30% less traffic (in TB) would not
change anything for us, we can close this discussion and keep this issue in
mind for the coming years.

Best regards,

Markus Meister

All news on stocks, markets and finance!
DER SPEKULANT - The market newsletter for clever investors

From: w3c-ac-forum-request@w3.org [mailto:w3c-ac-forum-request@w3.org] On
Behalf Of Rotan Hanrahan
Sent: Wednesday, 5 August 2009 20:30
To: w3c-ac-forum@w3.org
Subject: RE: compression of HTML would save a lot of money

Because nothing appears to break when the feature is missing, nobody cares.
To the best of my knowledge, there are no IP limitations that prevent the
incorporation of gzip in servers or browsers. Many servers and browsers
already have the capability. Some that have the capability are flawed in
their implementation. The absence of motivation to fix the feature is
explained by the absence of apparent impact.
However, as Chaals points out, there is a potential saving that could be
achieved globally if it were universally adopted. The cost would be
threefold: the development/test of the solution, the deployment to the Web
community and the running cost. The last item refers to the additional
processing overheads of the gzip compression/decompression. The saving would
be in reduced consumption of bandwidth (possibly offsetting all of the codec
overheads) and in less wasted user time (because response times would
improve). Is this cost-effective?
Of course, in-band network compression technologies already move a lot of
these uncompressed packets around the 'Net in a transparent manner so we're
probably not saving the operators/ISPs as much as we think, but the first
and last miles will still benefit.
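As a rough illustration of the bandwidth saving under discussion, the sketch below uses Python's standard gzip module on a made-up, repetitive HTML payload (a stand-in for a typical page, not a real W3C page); the codec overhead is the cost of the compress/decompress calls:

```python
import gzip

# Hypothetical HTML page; repetitive markup is typical of real pages
# and is exactly what gzip exploits.
html = ("<!DOCTYPE html><html><head><title>Example</title></head>"
        "<body>" + "<p>Lorem ipsum dolor sit amet.</p>" * 200 +
        "</body></html>").encode("utf-8")

# Compress (what a server would do before sending Content-Encoding: gzip)
compressed = gzip.compress(html)

# Decompress (what a browser does on receipt)
restored = gzip.decompress(compressed)

ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, "
      f"gzipped: {len(compressed)} bytes (ratio {ratio:.2%})")
```

In HTTP this negotiation happens via the client's Accept-Encoding request header and the server's Content-Encoding response header, so the compression is transparent to the page itself.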
As for whether or not the W3C should do it, I have already argued that it
should not be done unless we are careful about legacy/buggy browsers, Ted
has said that we won't benefit financially and Chaals suggests that in fact
it would probably cost the Team. So the only motivation that has any merit,
as far as I can see, is to set an example and establish a "best practice". I
agree with this, but think that in the current financial constraints there
are probably more deserving initiatives to which the W3C can direct its
limited and dwindling resources.
Let's just keep the idea on the back burner for now.

From: David Ezell
Sent: Wed 05/08/2009 18:58
To: w3c-ac-forum@w3.org
Subject: Re: compression of HTML would save a lot of money
Charles McCathieNevile wrote:
>I think if it is reasonably easy it is worth doing as part of showing how
>to make the web better for the world, but I agree with David that beyond
>that expression of desire, I think this can be safely left in the capable
>hands of the team to make an intelligent decision based on all the facts.

I agree 100% that the team is making the best possible operating decisions.
Full stop.

But there's a lot of value in the question if we pose it as:

"This clearly useful and resource-saving feature doesn't have universal
availability.  What's wrong with it?"

The answer may be "nothing", but it could also be IP issues across browser
and server providers.  Or something else.

In any case, we, the AC, should pay attention to the considered practices of
our team in implementing the web.  We'll undoubtedly learn things.

Best regards,
David Ezell (NACS)
Received on Wednesday, 5 August 2009 19:45:56 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 21:15:40 UTC