- From: Assaf Arkin <arkin@intalio.com>
- Date: Tue, 7 Jan 2003 00:50:28 -0800
- To: "Jacobs,David B." <djacobs@mitre.org>, "Ugo Corda" <UCorda@SeeBeyond.com>, <www-ws-arch@w3.org>
> I would think existing web application builders would find this much
> easier to wrap themselves around. And in fact make it much easier to
> support the web site for humans and machine agents at the same time.
> This also solves the problem of workflow coordination for web services,
> because the web service tells you when you can execute different
> methods (just like it does for people).

If you designed an Amazon that was intended for machine consumption, you would make different design choices. A lot of the information there is useful for human readers but not very useful for machines. Some of the information is presented in a particular way to make the page more appealing, improve perceived response time, etc.

For example, if a book has ten reviews, Amazon will only list two on the first page. But if a machine were reading the information, you could cut out all the fluff (images, tables, redundant links) and provide all ten reviews in one page that is actually smaller than the one containing two reviews + fluff. The software would then be able to retrieve all ten reviews in one HTTP request.

You would also simplify the steps required to place an order. A lot of the steps were introduced to assist people, but if you automate the process then a lot of the text/options are no longer necessary and you can cut it down to a single page.

arkin

> > Am I missing something fundamental?
> >
> > David
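A minimal sketch of the single-request idea above, assuming a hypothetical machine-oriented book resource; the URL and the XML element names are invented for illustration, not anything Amazon actually exposed:

```python
# Sketch: a machine-oriented representation of a book that inlines all
# reviews, so a client needs one HTTP GET instead of paging through the
# human-oriented review screens. The endpoint and element names below
# are hypothetical, chosen only to illustrate the shape of the exchange.
import urllib.request
import xml.etree.ElementTree as ET

BOOK_URL = "http://example.com/books/0201633612"  # hypothetical resource

with urllib.request.urlopen(BOOK_URL) as resp:
    book = ET.parse(resp).getroot()

# A single response carries the whole resource -- title, price, and
# every review -- with none of the images, tables, or redundant links
# that exist only for human readers.
title = book.findtext("title")
reviews = [r.findtext("text") for r in book.findall("reviews/review")]
print(f"{title}: {len(reviews)} reviews fetched in a single GET")
```

The same reasoning would collapse the ordering flow: rather than walking a multi-screen checkout, a machine client could POST one complete order document to a single resource.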
Received on Tuesday, 7 January 2003 03:51:15 UTC