- From: Kris Zyp <kris@sitepen.com>
- Date: Tue, 9 Sep 2008 16:41:50 -0600
- To: "Jonas Sicking" <jonas@sicking.cc>, "Geoffrey Sneddon" <foolistbar@googlemail.com>
- Cc: "Dominique Hazael-Massieux" <dom@w3.org>, "Boris Zbarsky" <bzbarsky@MIT.EDU>, <public-webapps@w3.org>
>>> Well, at least when an outgoing XmlHttpRequest goes with a body, the
>>> spec could require that upon setting the Content-Encoding header to
>>> "gzip" or "deflate", that the body be adequately transformed. Or is
>>> there another e.g. to POST a gzip request with Content-Encoding?
>>
>> Why can it not just be added transparently by the XHR implementation?
>
> I doubt that it could. An UA implementation won't know which encodings the
> server supports.
>
> I suspect compression from the UA to the server will need support on the
> XHR object in order to work. I don't think the right way to do it is
> through setRequestHeader though, that seems like a hack at best.

I would have thought this would be negotiated by the server sending an
Accept-Encoding header to indicate what forms of encoding it could handle
for request entities. XHR requests are almost always preceded by a separate
response from the server (the web page) that can indicate the server's
ability to decode request entities. However, the HTTP spec is rather vague
about whether the header can be used in this way (a server providing it to
indicate acceptable encodings for requests, rather than the typical usage of
a client providing it to indicate acceptable encodings for responses), but
it certainly seems like it should be symmetrical. Perhaps this vagueness is
why no browser has ever implemented such compression, or maybe it is due to
lack of demand? IMO, a server-provided Accept-Encoding header would be the
best way to encourage a browser to compress a request:

Accept-Encoding: gzip;q=1.0, identity;q=0.5

Perhaps we should talk to the HTTP group about clarifying the specification.
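To make that concrete, here is a rough sketch (written as TypeScript, purely
for illustration) of what the negotiation would amount to, whether the UA did
it automatically or a script did it by hand. The gzipCompress() helper is
hypothetical, since there is no standard in-browser gzip API, and how the
script would learn the server-advertised Accept-Encoding value is likewise an
assumption:

// Illustrative sketch only. gzipCompress() is a hypothetical helper (no
// standard browser API produces a gzip entity body), and the mechanism by
// which the server-advertised Accept-Encoding value reaches the script is
// assumed here, not specified anywhere.
declare function gzipCompress(body: string): ArrayBuffer;

function postWithRequestCompression(
  url: string,
  body: string,
  serverAcceptEncoding: string | null  // e.g. "gzip;q=1.0, identity;q=0.5"
): void {
  const xhr = new XMLHttpRequest();
  xhr.open("POST", url);

  // Only compress if the server advertised gzip for request entities
  // (q-value handling omitted for brevity).
  if (serverAcceptEncoding && /\bgzip\b/.test(serverAcceptEncoding)) {
    xhr.setRequestHeader("Content-Encoding", "gzip");
    xhr.send(gzipCompress(body)); // compressed request entity
  } else {
    xhr.send(body);               // fall back to identity
  }
}

Kris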
Received on Tuesday, 9 September 2008 22:43:06 UTC