Re: Support for compression in XHR?

On 10 Sep 2008, at 15:47, Dominique Hazael-Massieux wrote:

>
> Le mercredi 10 septembre 2008 à 06:31 -0700, Sullivan, Bryan a écrit :
>> Re " surely this is relevant to the "delivery of Web applications on
>> mobile devices", and thus to the Mobile Web Application Best  
>> Practices
>> ", I agree and that's why it is already present in the Mobile Web
>> Applications Best Practices.
>
> Right, with the caveat "The Working Group is researching the conditions
> under which compression should be recommended on mobile devices and is
> looking for feedback on this topic."
>
>> Re the use cases you mention in which compression could be avoided:
>>
>> (1) Small files are the strongest case, and should be easy to code for
>> (?). What would be a good lower threshold to recommend?
>
> My early research showed that under 1K, the benefits of compression are
> in most real cases negligible. I think 2K is probably a good threshold.
>
>> (2) Progressive-rendered pages are unlikely I think in the mobile case
>> (if what you mean is a large page that is partially presented before
>> the rest is received) since I think (at least for XHTML) the whole
>> base page is needed (to be validated) before anything is presented.
>
> In practice, that isn't true though - most mobile browsers simply don't
> validate (or check well-formedness) before rendering the page. And in
> fact, one of the arguments raised against XHTML (served as
> application/xhtml+xml, i.e. supposedly parsed as XML) is that it
> prevents (or should prevent) progressive rendering.
>

Indeed, one area I have a lot of experience in is web compatibility (it
is my main job). We often have to add browser-side JavaScript patches to
our mobile browsers to make a page behave as though it is serving HTML
instead of XML: a site will claim to be XML and yet not be well formed
(or will be missing something important), so without spoofing we would
just get an error page. Major Google services had this issue recently
(luckily I found the right people at Google and got them to fix it), as
they didn't close their meta tags correctly. With sites that allow
user-generated content, it becomes a mess fast.
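To make the well-formedness point concrete, here is a minimal Python sketch (the markup and variable names are mine, not taken from any actual site) showing how an unclosed meta tag is fatal to a strict XML parser while a forgiving HTML parser accepts it without complaint:

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

# A page claiming to be XHTML, but with an unclosed <meta> tag
# (so it is not well-formed XML).
page = '<html><head><meta charset="utf-8"></head><body><p>hi</p></body></html>'

# A strict XML parser must reject it -- this is the "error page" case.
try:
    ET.fromstring(page)
    xml_ok = True
except ET.ParseError:
    xml_ok = False

# A forgiving HTML parser accepts the same markup without complaint.
parser = HTMLParser()
parser.feed(page)  # no exception raised

print(xml_ok)  # False: the unclosed <meta> breaks XML parsing
```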
>> (3) Not compressing based upon the speed of the serving network is an
>> interesting optimization, but testing should validate the benefit,
>
> FWIW, I did some rough testing that I reported upon a few weeks ago:
> http://lists.w3.org/Archives/Public/public-mwts/2008Jun/0025.html
> http://lists.w3.org/Archives/Public/public-bpwg/2008Jul/0002.html
> http://www.w3.org/2008/06/gzip-mobile/results.php (which I'm afraid is
> not very easily interpretable)
>
> I agree that it is a difficult optimization to make on a
> request-per-request basis, but surely we could tell developers that if
> most of their users come from one type of network then they should do
> one thing, and if from another type of network, another. I don't
> expect that advice to be practical for every developer out there, but
> hopefully it will be practical enough for a fair number of them.
>
> But maybe this is going into too low-level detail for the document
> we're developing; I personally think we ought to do that analysis
> work, since it is not obvious that compress-by-default is necessarily
> a good thing.
>
> Dom
>
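In server-side terms, the size threshold Dom suggests amounts to something like the following Python sketch (the function name and the 2K cut-off are illustrative, not from any spec): compress only when the body is over the threshold and the compressed form is actually smaller.

```python
import gzip

THRESHOLD = 2048  # Dom's suggested ~2K lower bound (illustrative)

def maybe_compress(body: bytes) -> tuple[bytes, bool]:
    """Return (payload, was_compressed)."""
    if len(body) < THRESHOLD:
        # Too small: the gzip header/overhead dominates any savings.
        return body, False
    compressed = gzip.compress(body)
    if len(compressed) >= len(body):
        # Incompressible content (e.g. already-compressed images).
        return body, False
    return compressed, True

small, c1 = maybe_compress(b"tiny page")
big, c2 = maybe_compress(b"<p>hello</p>" * 1000)
print(c1, c2)  # False True
```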

David Storey

Chief Web Opener,
Product Manager Opera Dragonfly,
Consumer Product Manager Opera Core,
W3C Mobile Web Best Practices Working Group member

Consumer Product Management & Developer Relations
Opera Software ASA
Oslo, Norway

Mobile: +47 94 22 02 32
E-Mail: dstorey@opera.com
Blog: http://my.opera.com/dstorey

Received on Wednesday, 10 September 2008 14:04:11 UTC