Re: Myth of loose coupling

I agree that things can be simplified when designing for machine 
reading.  But wouldn't it be nice to get your site machine-accessible 
for very little work, and then optimize it to your heart's desire?  Of 
course, how far you take that will depend on how much of your client 
population is machine-based, and on how much the lack of optimization 
adds machine load and extra work for those clients.

David

Assaf Arkin wrote:

>>I would think existing web application builders would find this much
>>easier to wrap themselves around.  And in fact it would make it much
>>easier to support the web site for humans and machine agents at the
>>same time.  This also solves the problem of workflow coordination for
>>web services, because the web service tells you when you can execute
>>different methods (just like it does for people).
>
>If you designed an Amazon that was intended for machine consumption, you
>would make different design choices. A lot of the information there is
>useful for human readers but not very useful for a machine. Some of the
>information is presented in a particular way to make the page more
>appealing, improve perceived response time, etc.
>
>For example, if a book has ten reviews, Amazon would only list two on the
>first page. But if a machine were reading the information, you could cut
>out all the fluff (images, tables, redundant links) and provide all ten
>reviews in one page that is actually smaller than the one containing two
>reviews plus fluff. The software would then be able to retrieve all ten
>reviews in one HTTP request.
>
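
To make that concrete, here is a minimal sketch of what the client side
could look like, assuming a purely hypothetical machine-oriented endpoint
and XML shape (neither comes from Amazon's actual interface):

    import urllib.request
    import xml.etree.ElementTree as ET

    # Hypothetical endpoint that returns all reviews for a book as one
    # compact XML document, with no images, tables, or redundant links.
    URL = "http://example.com/books/0596000278/reviews"

    with urllib.request.urlopen(URL) as response:
        root = ET.parse(response).getroot()

    # Assumed shape: <reviews><review rating="5">text</review>...</reviews>
    for review in root.findall("review"):
        print(review.get("rating"), review.text)

One GET, one small document, all ten reviews.
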
>You would also simplify the steps required to place an order. A lot of the
>steps were introduced to assist people, but if you automate the process then
>much of the text and many of the options are no longer necessary, and you
>can cut the whole thing down to a single page.
>
>arkin
>
>>Am I missing something fundamental?
>>
>>David
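
P.S. On my earlier point about the web service telling you when you can
execute different methods: one way to do that is for each response to list
the operations that are currently valid. A rough sketch, assuming an
entirely hypothetical order resource and XML format:

    import xml.etree.ElementTree as ET

    # Hypothetical response for an in-progress order; the <link> elements
    # advertise which operations the service currently permits.
    response_body = """
    <order id="123" status="open">
      <link rel="add-item" href="/orders/123/items"/>
      <link rel="checkout" href="/orders/123/checkout"/>
    </order>
    """

    root = ET.fromstring(response_body)
    allowed = {link.get("rel"): link.get("href")
               for link in root.findall("link")}

    # The client only attempts what the service has advertised, so the
    # service itself coordinates the workflow step by step.
    if "checkout" in allowed:
        print("Order can be checked out via", allowed["checkout"])

The client never has to hard-code the order of steps; once the order is
closed, the links simply disappear from the response.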
