- From: Chris Weber <chris@lookout.net>
- Date: Fri, 12 Aug 2011 10:17:13 -0400
- To: public-iri@w3.org
Hello Randall,

On 8/8/2011 12:56 AM, Randall Sawyer wrote:
> Hello, All!
>
> Only recently have I stumbled upon the need to parse and normalize
> URLs for a couple of projects I'm working on. In doing my research,
> including reading all of RFC 3986 and part of A. Barth's "Parsing
> URLs for Fun and Profit", I find the amount of effort required to
> anticipate and correct malformed URLs frustrating. I have a
> suggestion as to how content providers and client developers may
> voluntarily make their services and products work better together.
> [I have searched the archives for something like this and have not
> found anything so far.]
>
> What I have in mind is something comparable to SGML/XML validation.
> Just as a *ML document may contain a declaration at the top stating
> that it is compliant with a specific template, what if we made it
> possible for an organization to declare that every existing path on
> its site is compliant with a specific path-syntax template?

Did you find the URI Template specification
<http://code.google.com/p/uri-templates/> useful? I know there are
some other ideas and Internet-Drafts in concept stages attempting to
define limited scheme-specific canonicalization in normative terms
for implementations that want to be strict.

Best regards,
Chris Weber
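P.S. To make the template idea concrete, here is a rough sketch in
Python of how a site-published path-syntax template could be checked
against actual paths. The '/articles/{year}/{slug}' template is
hypothetical, and only the bare {var} form from the URI Template
draft is handled; the reserved and explode operators are out of
scope for this sketch.

    import re

    def template_to_regex(template):
        """Convert a simple level-1 URI Template (bare {var} only),
        e.g. '/articles/{year}/{slug}', into an anchored regex."""
        pattern = ''
        pos = 0
        for match in re.finditer(r'\{(\w+)\}', template):
            # Literal text between expressions is matched verbatim.
            pattern += re.escape(template[pos:match.start()])
            # Each variable matches exactly one path segment.
            pattern += '(?P<%s>[^/?#]+)' % match.group(1)
            pos = match.end()
        pattern += re.escape(template[pos:])
        return re.compile('^%s$' % pattern)

    # Hypothetical template a site might declare for its articles.
    article = template_to_regex('/articles/{year}/{slug}')

    for path in ['/articles/2011/uri-templates', '/articles/2011']:
        m = article.match(path)
        print(path, '->', m.groupdict() if m else 'does not comply')

A client that fetched such a declaration could reject or repair
non-conforming paths before ever issuing a request, which is roughly
the validation step Randall's proposal would enable.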
Received on Friday, 12 August 2011 14:17:21 UTC