- From: Mark Nottingham <mnot@yahoo-inc.com>
- Date: Fri, 1 Sep 2006 13:21:52 -0700
- To: public-web-http-desc@w3.org
Imagine that you're going to take a trip this weekend (especially relevant for the US folks). You can either:

a) Get onto your local freeway and hope to get where you want by following the signs, based on what you know.

b) Navigate only using a map (maybe you've got a fancy in-car GPS), with your headlights off.

c) Plan your trip on a map, but navigate by signs and landmarks you see while you're on the road.

(a) is how the Web works for people, but it isn't really practical for machines (yet). (b) is how most WS-* tools work. (c) is what I'd like to encourage (without explicitly disallowing (b), although I'm not sure I want to be on the same road as them!).

People will always write broken tools, but what I'm really concerned about is that people will write Web applications that *require* a description file to navigate/use; i.e., there will be (meta-)data in there that won't be in representations. For example, people might start wondering why they need all of those pesky <link> elements and other metadata in Atom documents when they've got a description file that contains it all.

What might help is concrete guidance about what shouldn't be in a WADL file, and what shouldn't be only in a WADL file. Does that seem reasonable?

Also, I've heard lots of people talk about consuming WADL on the client side, but no concrete proposals for what that use would entail. Are people assuming that they'll look at a WADL, generate some code, and then blindly make requests for URLs according to it? Or would the WADL be re-checked at runtime? Or...?

Cheers,

--
Mark Nottingham
mnot@yahoo-inc.com
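To make option (c) concrete, here is a minimal sketch of a client that may have planned against a description ahead of time, but at runtime navigates by the <link> metadata carried in the Atom representations it actually receives. The feed URL, the "next" link relation, and the use of Python's standard library are illustrative assumptions, not anything prescribed by WADL or by this thread:

```python
# Sketch of option (c): navigate by the links found in representations
# at runtime, rather than computing URLs from a description file.
# The entry-point URL below is hypothetical.
from urllib.request import urlopen
from xml.etree import ElementTree

ATOM_NS = "{http://www.w3.org/2005/Atom}"

def fetch_feed(url):
    """Retrieve and parse an Atom feed document."""
    with urlopen(url) as response:
        return ElementTree.parse(response).getroot()

def follow_rel(feed, rel):
    """Return the href of a <link> with the given relation, if present."""
    for link in feed.findall(ATOM_NS + "link"):
        if link.get("rel") == rel:
            return link.get("href")
    return None

# Start from one known entry point and page through the collection by
# following whatever rel="next" links each representation advertises.
url = "http://example.org/feed"
while url:
    feed = fetch_feed(url)
    for entry in feed.findall(ATOM_NS + "entry"):
        print(entry.findtext(ATOM_NS + "title"))
    url = follow_rel(feed, "next")
```

The point of the sketch is that the paging URLs never appear in the client's code or in any out-of-band description; if the server reorganizes its URI space, the links in the representations are still the authoritative "signs on the road."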
Received on Friday, 1 September 2006 20:22:23 UTC