
Compressed XMLs

From: Christian Mayer <Vader@t-online.de>
Date: Sat, 10 Jun 2000 10:36:23 +0200
Message-ID: <3941FE07.BCEFFED2@christianmayer.de>
To: www-xml-query-comments@w3.org

I was wondering if there shouldn't be an official standard way to
compress XML documents.

Currently people use gzip (or something similar), which has good
compression rates and is freely available. But I'm sure that someone who
knows a bit about compression algorithms could come up with one that
achieves much higher compression rates, since every XML document looks
somewhat alike. (BTW: there are already algorithms around that give
higher compression rates for any sort of input data - e.g. bzip2 or rar.)
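To illustrate the point, here is a minimal Python sketch (the sample document is invented) comparing how two general-purpose compressors do on repetitive XML markup:

```python
import bz2
import zlib

# Hypothetical sample: many near-identical elements, as in typical XML data.
xml = b"<items>" + b"".join(
    b"<item id='%d'><name>widget</name></item>" % i for i in range(1000)
) + b"</items>"

gz = zlib.compress(xml, 9)   # DEFLATE, as used by gzip
bz = bz2.compress(xml, 9)    # bzip2, often tighter on repetitive text

print(len(xml), len(gz), len(bz))
```

Both shrink the document dramatically because the tag structure repeats; an XML-aware scheme could in principle exploit that structure even further.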

Such a compression standard should be the official XML compression
standard (so that as many readers and editors as possible support it). It
should be easy to add to your programs (e.g. including zlib - the gzip
compression library - is very easy), free of patent issues (never let
the GIF disaster happen again), and it should allow you to read parts of
a file without un'zip'ing it completely (as zlib lets you do).
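The partial-read point can be sketched with zlib's streaming interface, here via Python's bindings (the sample document is again invented): only a prefix of the compressed stream is consumed, and output is capped so the whole document is never inflated at once.

```python
import zlib

# Hypothetical sample document.
xml = b"<root>" + b"<p>hello</p>" * 5000 + b"</root>"
compressed = zlib.compress(xml)

# Stream-decompress from the start of the compressed data,
# producing at most 100 bytes of output instead of the full document.
d = zlib.decompressobj()
head = d.decompress(compressed[:512], 100)
print(head[:20])
```

Note this only gives sequential access from the beginning; true random access into the middle of a file would need an indexed or block-based format, which is one more thing a dedicated standard could specify.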

What do you think about that? IMHO there should be a special committee
that creates such a standard.

Received on Saturday, 10 June 2000 04:53:51 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 20:21:11 UTC