RE: question: Increasing factor for XML vs Binary

At the strong risk of stating the patently obvious -- it seems to me
that how much good deltas are going to do depends a LOT on the usage
scenario.  Of those with which I am personally familiar, I think it
might be a huge winner in the one where you have Point of Sale
information coming to backoffice systems from a large number of
retailers, and would do absolutely no good whatsoever with seismic data.
Seems to me that identifying the usage scenarios in which deltas are
likely to be useful, and of course the potential impact of those use
cases, would be a good idea. 

-----Original Message-----
From: public-xml-binary-request@w3.org
[mailto:public-xml-binary-request@w3.org] On Behalf Of Stephen D.
Williams
Sent: Thursday, November 18, 2004 10:43 AM
To: Mike Champion
Cc: Silvia.De.Castro.Garcia@esa.int; public-xml-binary@w3.org
Subject: Re: question: Increasing factor for XML vs Binary


One thing that is missing from a lot of these analyses is what could be
saved by being able to do deltas.  In a situation where there is any
kind of repetition, such as protocol messages (as in XMPP), records of
some kind in a stream or file, or a request/response pair, the ability
to send only what's different may use less CPU and produce smaller
output than even schema-based solutions.
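
(A rough, purely illustrative sketch of the idea, not the scheme I have
in mind: given two successive messages that differ in only a few field
values, a byte-level delta of the serialized forms can be much smaller
than either full message.  The element names below are made up, and
Python's standard difflib just stands in for a real delta encoder.)

    import difflib

    prev = "<msg><id>1001</id><price>12.50</price><qty>3</qty></msg>"
    curr = "<msg><id>1002</id><price>12.50</price><qty>5</qty></msg>"

    # Opcodes describe how to rebuild 'curr' from 'prev'; only the
    # non-'equal' payloads would need to travel on the wire.
    sm = difflib.SequenceMatcher(None, prev, curr)
    delta = [(op, prev[i1:i2], curr[j1:j2])
             for op, i1, i2, j1, j2 in sm.get_opcodes()
             if op != "equal"]

    payload = sum(len(new) for _, _, new in delta)
    print(delta)                       # the differing fragments only
    print(payload, "of", len(curr))    # bytes needed vs. the full message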

I plan to benchmark and demonstrate this kind of solution soon.  There
is a way to use the idea of a delta that is very schema-like but isn't
so firmly tied to a schema.  Using it in a 'header compression' style
is even more powerful, although it is somewhat more entangled in the
semantics of the application.
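
(Purely to illustrate the 'header compression' style, not the actual
scheme I plan to benchmark: both sides agree on a message template out
of band, after which each message on the wire is only the tuple of
values that actually vary.  The template and field names here are
hypothetical.)

    # Hypothetical template shared in advance by sender and receiver.
    TEMPLATE = "<msg><id>{id}</id><price>{price}</price><qty>{qty}</qty></msg>"
    FIELDS = ("id", "price", "qty")

    def encode(values):
        # On the wire: just the varying field values, no XML scaffolding.
        return "|".join(str(values[k]) for k in FIELDS)

    def decode(wire):
        # Receiver reconstructs the full XML document from the template.
        return TEMPLATE.format(**dict(zip(FIELDS, wire.split("|"))))

    wire = encode({"id": 1002, "price": "12.50", "qty": 5})
    print(wire)            # "1002|12.50|5"
    print(decode(wire))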

sdw

Mike Champion wrote:

>
>Sigh most of that was lost somewhere ... I'm on a handheld ...
>
>I'll interpret this as 'how much of a compression factor can be
>achieved by using a binary vs XML encoding of the same data.'  The
>usual answer, I'm afraid: it depends.  As best I recall from a
>literature survey:
>
>larger docs compress better than small,
>
>you can get more compression if you use more CPU (and hence battery) 
>power,
>
>you can get very good compression if you assume that the schema is
>known to both sides and docs are valid instances.
>
>My recollection is that 5:1 compression is realistic for arbitrary XML
>and 10:1 and higher is feasible with shared schemas.
>
>
>-----Original Message-----
>From:  Silvia.De.Castro.Garcia@esa.int
>Date:  11/4/04 8:56 am
>To:  public-xml-binary@w3.org
>Subj:  question: Increasing factor for XML vs Binary
>
>Hi all,
>        I would like to know the estimated order of magnitude of the
>size increase for the XML format with respect to the equivalent binary
>product; that is, what is the order of the overhead that using XML
>instead of a binary format would impose?
>
>Thank you very much,
>Best regards,
>
>Silvia de Castro.
>
>
>  
>


--
swilliams@hpti.com http://www.hpti.com Per: sdw@lig.net http://sdw.st
Stephen D. Williams 703-724-0118W 703-995-0407Fax 20147-4622 AIM: sdw

Received on Friday, 19 November 2004 16:35:15 UTC