
Web services and huge data volumes

From: Friedrich-Carl von Gneisenau <Friedrich-Carl_von_Gneisenau@msg.de>
Date: Mon, 4 Jun 2007 12:38:49 +0200
To: www-ws@w3.org
Message-ID: <OF56B0DA18.CEB11259-ONC12572F0.0035520B-C12572F0.003A7C47@notes.msg.de>
Hi,

Some questions about web services and huge data volumes. When sending, for
example, more than 20,000 Java beans with a nesting depth of two levels, the
Axis framework needs more than 3 minutes to serialize them into XML (on a
Pentium 4 with 2 GB of RAM). The result is a document larger than 40 MB,
although the raw data occupies only 4 MB in the database. Some frameworks
(e.g. IBM's) also stop parsing data that is nested more than 100 elements
deep.
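To illustrate the markup overhead I mean, here is a toy sketch in plain Java (the bean and field names are made up for illustration; this is not our Axis code) that compares the size of element-per-field XML with the size of the raw field values alone:

```java
// Toy sketch: how verbose element markup inflates a payload of many
// small two-level "beans" compared to the raw field values.
public class XmlSizeDemo {

    // Hypothetical bean with one nested level, serialized by hand.
    static String beanToXml(int id) {
        return "<customer><id>" + id + "</id>"
             + "<address><city>Munich</city><zip>80331</zip></address>"
             + "</customer>";
    }

    // Total size of the XML document for n beans.
    public static int xmlSize(int n) {
        StringBuilder sb = new StringBuilder("<customers>");
        for (int i = 0; i < n; i++) {
            sb.append(beanToXml(i));
        }
        sb.append("</customers>");
        return sb.length();
    }

    // Total size of just the field values, with no markup at all.
    public static int rawSize(int n) {
        int size = 0;
        for (int i = 0; i < n; i++) {
            size += Integer.toString(i).length()
                  + "Munich".length() + "80331".length();
        }
        return size;
    }

    public static void main(String[] args) {
        int n = 20000;
        System.out.println("xml bytes=" + xmlSize(n)
                         + " raw bytes=" + rawSize(n));
    }
}
```

Even in this stripped-down form the markup dwarfs the data, which matches the 4 MB vs. 40 MB ratio we see.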
My questions are:
- Which techniques are available for sending so many elements?
- Can web services cache XML documents that have already been serialized?
- Is there a way to compress the size of such a document?
- How can a tree with a depth of more than 100 levels be sent via web
services and XML?
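As a data point for the compression question, a quick test with plain Java (java.util.zip only, no web-service framework involved) suggests that such repetitive XML compresses very well with gzip; the item structure below is made up, but shaped like our payload:

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPOutputStream;

// Quick experiment: gzip a repetitive XML document of many small
// elements and compare compressed vs. uncompressed size.
public class GzipXmlDemo {

    // Gzip a byte array in memory.
    public static byte[] gzip(byte[] data) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }
        return bos.toByteArray();
    }

    // Returns {uncompressed size, compressed size} for n items.
    public static int[] sizes(int n) throws Exception {
        StringBuilder sb = new StringBuilder("<items>");
        for (int i = 0; i < n; i++) {
            sb.append("<item><id>").append(i)
              .append("</id><name>bean</name></item>");
        }
        sb.append("</items>");
        byte[] xml = sb.toString().getBytes("UTF-8");
        byte[] packed = gzip(xml);
        return new int[] { xml.length, packed.length };
    }

    public static void main(String[] args) throws Exception {
        int[] s = sizes(20000);
        System.out.println("xml bytes=" + s[0] + " gzip bytes=" + s[1]);
    }
}
```

So transport-level compression looks promising, but I do not know which frameworks support it out of the box for SOAP messages.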


Thanks,
Gneisenau
Received on Monday, 4 June 2007 16:00:51 GMT
