- From: Christian Mayer <Vader@t-online.de>
- Date: Sat, 10 Jun 2000 10:36:23 +0200
- To: www-xml-query-comments@w3.org
Hi,

I was wondering if there shouldn't be an official standard way to compress XML. Currently people use gzip (or something similar), which has good compression rates and is freely available. But I'm sure that someone who knows a bit about compression algorithms could come up with an algorithm that gives much higher compression rates, since every XML document looks a bit the same. (BTW: there are already algorithms around that give higher compression rates for any sort of input data - e.g. bzip2 or rar.)

Such a compression standard should:

- be the official XML compression standard (so that as many readers and editors as possible support it),
- be easy to add to your programs (e.g. including zlib - the gzip compression library - is very easy),
- be free of patent issues (never let the GIF disaster happen again), and
- allow you to read parts of the file w/o un'zip'ing it completely (like zlib lets you do).

What do you think about that? IMHO there should be a special committee that creates such a standard.

CU,
Christian
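As a rough illustration of the "easy to add" and "read parts without unzipping it completely" points, here is a minimal sketch in Python using its standard zlib binding (the XML sample and names are made up for the example): it compresses a repetitive XML document, and then uses incremental decompression to recover only the first 100 bytes of output without inflating the whole thing.

```python
import zlib

# A small, repetitive XML document: tag names repeat heavily, which is
# exactly the redundancy a general-purpose compressor exploits.
xml = "<items>" + "".join(
    "<item id='%d'><name>widget</name></item>" % i for i in range(100)
) + "</items>"
data = xml.encode("utf-8")

# Compress at the highest level; the result is a small fraction of the input.
compressed = zlib.compress(data, 9)
print("original:", len(data), "compressed:", len(compressed))

# Decompression restores the document byte-for-byte.
assert zlib.decompress(compressed) == data

# Incremental decompression: ask for only the first 100 bytes of output
# instead of inflating the whole document.
d = zlib.decompressobj()
head = d.decompress(compressed, 100)
assert head.startswith(b"<items>")
print(head.decode("utf-8"))
```

Note that deflate streams only support this kind of sequential partial reading; true random access into the middle of a file would need the standard to define block boundaries or an index on top of the raw compression.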
Received on Saturday, 10 June 2000 04:53:51 UTC