- From: Lloyd Wood <l.wood@eim.surrey.ac.uk>
- Date: Mon, 31 Jan 2000 16:01:11 +0000 (GMT)
- To: Bless Terje <link@rito.no>
- cc: www-validator@w3.org
On Mon, 31 Jan 2000, Bless Terje wrote:

> Lloyd Wood <l.wood@eim.surrey.ac.uk> wrote:
> >
> >On Mon, 31 Jan 2000, Bless Terje wrote:
> >>
> >>Well, I happen to agree with you about Case Sensitivity,
> >>but, unfortunately, for XML Applications (such as XHTML 1.0)
> >>that is the way it is. If I'd had any say in the matter I
> >>would probably have fought this decision (as I've
> >>seen no sane argument in favour of it),
> >
> >compression.
> >
> >eventually, we'll have transparent http compression between client and
> >server; tags in a single consistent case compress better than those in
> >mixed case.
>
> That is, at best, a weak argument.
>
> Given that even with current round-peg-in-square-hole attempts at HTTP
> compression case difference has negligible impact on compression
> efficiency -- due to the ratio of markup to data and the tendency of
> authors and authoring tools to be internally consistent

okay, so far.

Unfortunately, database-generated output mixes and matches content from
a variety of sources; compare ad-banner insertion code, templates and
content on any online magazine site you like.

> -- and that
> any specialized HTTP compression scheme should contain optimizations
> for the kind of data that is likely to occur,

the optimisations you appear to be considering would damage the
integrity of the original bytestream.

The only specialised text compression schemes I'm aware of are bizarre
proprietary ones to get more out of GSM short message services...

L.

<L.Wood@surrey.ac.uk>PGP<http://www.ee.surrey.ac.uk/Personal/L.Wood/>
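[Editor's aside: the compression claim above is easy to sanity-check with a
sketch. Python's zlib (a DEFLATE implementation, standing in for whatever a
hypothetical transparent HTTP compression layer might use) is applied to an
invented markup sample, once with consistently lower-cased tags and once with
per-character randomised case, as database-assembled pages might mix:]

```python
import random
import zlib

random.seed(0)

# An invented markup fragment, repeated as a stand-in for a page of
# template-generated HTML with consistently lower-cased tags.
line = "<div><p><span>hello world</span></p></div>\n"
consistent = (line * 200).encode("ascii")


def mix_case(text):
    """Randomise the case of each letter, simulating markup stitched
    together from sources with inconsistent casing conventions."""
    return "".join(
        c.upper() if c.isalpha() and random.random() < 0.5 else c
        for c in text
    )


mixed = mix_case(line * 200).encode("ascii")

# DEFLATE finds long repeated strings in the consistent version; the
# randomised casing breaks those matches and inflates the output.
print("consistent case:", len(zlib.compress(consistent)), "bytes")
print("mixed case:     ", len(zlib.compress(mixed)), "bytes")
```

[The consistent version collapses to a handful of bytes because every line
is byte-identical; the mixed version carries the extra entropy of the random
case choices. This illustrates the direction of the effect only; whether the
difference is "negligible" on real pages, as argued above, depends on the
markup-to-data ratio.]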
Received on Monday, 31 January 2000 11:01:17 UTC