- From: Gavin Nicol <gtn@eps.inso.com>
- Date: Wed, 21 May 1997 09:28:08 -0400
- To: pflynn@curia.ucc.ie
- CC: w3c-sgml-wg@w3.org
>The more I look at the arguments, the more convinced I am that
>XML is not likely to be a meaningful format for distributing
>database-resident bulk data. This shouldn't surprise anyone: it
>was never designed to do this. You want to ship bulk data, you
>send a Java applet followed by a gzipped datafile, and let the
>applet generate XML locally for the display.
I do not think this is the case. You can use XML to mark up
database records for distribution or interchange, and one would
intuitively expect the compressed file to differ little in size
from the compressed raw data, because the markup tends to form a
very regular input pattern that is easily compressed.
To test this, I performed another little experiment. I created
a 1024x1024 table of small, random integer values in both
comma-delimited and XML form, and then compressed both files.
The results follow:
Before compression:

    foo.txt      11,873,460
    foo.xml      17,123,519

Markup adds about 44% to the size of the data.

After compression (using gzip):

    foo.txt.gz    5,323,235
    foo.xml.gz    5,704,385

Markup adds about 7% to the compressed size.
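
For anyone who wants to reproduce this, a rough sketch of the
experiment in Python follows. The element names (<row>, <cell>) and
the integer range are assumptions on my part, since the original
markup is not shown above, so the absolute sizes and the exact
overhead ratios will differ, but the compression behaviour should
be comparable.

    import gzip
    import os
    import random
    import shutil

    ROWS = COLS = 1024

    # Write the same random table in comma-delimited and XML form.
    # Element names and the 0..9999 range are guesses, not the
    # original poster's choices.
    with open("foo.txt", "w") as txt, open("foo.xml", "w") as xml:
        xml.write("<table>\n")
        for _ in range(ROWS):
            row = [random.randint(0, 9999) for _ in range(COLS)]
            txt.write(",".join(str(n) for n in row) + "\n")
            xml.write("<row>"
                      + "".join("<cell>%d</cell>" % n for n in row)
                      + "</row>\n")
        xml.write("</table>\n")

    # Compress each file with gzip and report before/after sizes.
    for name in ("foo.txt", "foo.xml"):
        with open(name, "rb") as src, gzip.open(name + ".gz", "wb") as dst:
            shutil.copyfileobj(src, dst)
        print("%-12s %10d -> %10d" % (name, os.path.getsize(name),
                                      os.path.getsize(name + ".gz")))

This uses Python's gzip module rather than the command-line gzip
tool, but the output should be equivalent for the purposes of the
size comparison.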