- From: len bullard <cbullard@hiwaay.net>
- Date: Wed, 21 May 1997 19:31:00 -0500
- To: Gavin Nicol <gtn@eps.inso.com>
- CC: pflynn@curia.ucc.ie, w3c-sgml-wg@w3.org
Gavin Nicol wrote:
>
> To test this, I performed another little experiment. I created
> a 1024x1024 table of (random, integer, small) numeric data in both
> comma-delimited and XML form, and then compressed both files.
> The results follow:
>
> Before compression:
>   foo.txt     11,873,460
>   foo.xml     17,123,519
>
> Markup adds about 44% to the size of the data.
>
> After compression (using gzip):
>   foo.txt.gz   5,323,235
>   foo.xml.gz   5,704,385
>
> Markup adds about 7% to the compressed size.

Gavin, doesn't this look pretty much like the arguments in VRML about
binary formats, where it was decided that modems and gzip did the job
about as well with regard to transmission size?

I don't think I buy the "transmission size will be a hazy memory"
argument, because even where a complete infrastructure change is needed
(why not dump TCP/IP while we're at it?), it won't happen quickly, nor
in all of the places XML should be able to reach. I think it likely to
happen for the well-heeled in the short term, but that's life.

OTOH, reduction of end tagging just isn't an issue with enough arguments
one way or the other to convince me that it deserves a lot of time right
now. It should be reconsidered in XML 2.0 if there is sufficient evidence
that it gives the application vendors grief (such as their ignoring it
and doing it anyway).

BTW: on conformance, anyone looking at that issue now should look at
Mary Brady's site for VRML 2.0 at NIST. Top marks!

len
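A minimal sketch of the kind of comparison quoted above, assuming Python,
three-digit random integers, and a simple <table>/<row>/<cell> markup; the
actual element names, value range, and tooling Gavin used are not given in
the message, so these are illustrative choices only.

```python
# Sketch of the quoted experiment: a 1024x1024 table of small random
# integers, serialized as comma-delimited text and as a simple XML form,
# then gzip-compressed so the plain and compressed sizes can be compared.
import gzip
import random

ROWS, COLS = 1024, 1024


def make_table(rows, cols, low=0, high=999):
    """Random small integers (value range is an assumption)."""
    return [[random.randint(low, high) for _ in range(cols)] for _ in range(rows)]


def to_csv(table):
    # One comma-delimited line per row.
    return "\n".join(",".join(str(v) for v in row) for row in table) + "\n"


def to_xml(table):
    # Hypothetical markup; any per-cell tagging gives comparable overhead.
    rows = (
        "<row>" + "".join("<cell>%d</cell>" % v for v in row) + "</row>"
        for row in table
    )
    return "<table>\n" + "\n".join(rows) + "\n</table>\n"


def sizes(text):
    raw = text.encode("ascii")
    return len(raw), len(gzip.compress(raw))


if __name__ == "__main__":
    table = make_table(ROWS, COLS)
    for name, serializer in (("csv", to_csv), ("xml", to_xml)):
        plain, packed = sizes(serializer(table))
        print("%s: %d bytes plain, %d bytes gzipped" % (name, plain, packed))
```

The exact byte counts will differ from the quoted figures, but the relative
pattern (markup overhead large before compression, small after gzip) is the
point of the comparison.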
Received on Wednesday, 21 May 1997 20:31:23 UTC