- From: Kingsley Idehen <kidehen@openlinksw.com>
- Date: Thu, 15 Nov 2012 16:50:43 -0500
- To: Alexandre Bertails <bertails@w3.org>
- CC: Andrei SAMBRA <andrei.sambra@gmail.com>, "public-webid@w3.org" <public-webid@w3.org>, "public-rww@w3.org" <public-rww@w3.org>
- Message-ID: <50A563B3.6030109@openlinksw.com>
On 11/15/12 4:31 PM, Alexandre Bertails wrote:
> On 11/15/2012 12:39 PM, Kingsley Idehen wrote:
>> On 11/15/12 12:14 PM, Andrei SAMBRA wrote:
>>>
>>> On Thu, Nov 15, 2012 at 12:02 PM, Kingsley Idehen <kidehen@openlinksw.com <mailto:kidehen@openlinksw.com>> wrote:
>>>
>>> On 11/15/12 11:40 AM, Andrei SAMBRA wrote:
>>>>
>>>> Restricting ourselves to http, https URLs does make for a clearer spec, without creating interoperability issues. I can see that ftp and ftps would also work, but we would certainly have a more testable system if we limited ourselves at first.
>>>>
>>>> +1
>>>> We should remember that WebID is a _W3C_ group, not an IETF one.
>>>
>>> So you infer that URIs belong to the IETF and URLs to the W3C? At the same time, you assume this is architecture with real interoperability in mind.
>>>
>>> You are making an important point here, quite profound. I really need to know if this is the view shared by others.
>>>
>>> The most powerful virtue of the Web is its interoperability. That virtue is inextricably linked to URI abstraction.
>>>
>>> No, my point is that WebID URIs use HTTP(S) schemes. The point is to avoid ftp:// WebIDs (or any other scheme) in order to simplify the spec.
>>>
>>> Andrei
>>>
>>
>> Abstraction != Difficult. Please understand that the principle in play re. AWWW is "deceptively simple," which is a function of good abstraction. URI abstraction is a fine example of said principle. This is what makes the Web work.
>
> You're confusing "difficult" and "complex".

I am not.

> By being too abstract and general, you artificially increase the complexity of WebID by implicitly asking for support for many different implementations.

Wrong!

> By narrowing down the definition to very precise concepts, you define the minimal set of expectations that the system must support.

You can achieve that without compromising the AWWW.

> Also, examples are not acceptable as definitions, because they don't say anything about expectations.

And when did I imply to you that an example is a definition?

> You need to define the invariants of the system.

What are the invariants of the World Wide Web? The very system used by the entire world.

> That was the concern of the people who set the definition for WebID at TPAC.

Not wanting to go backwards instead of forwards. The conclusions at TPAC were simply wrong, albeit well intended.

> I don't understand why people are losing time with changing the definition.

Because any definition of WebID that includes specific references to hash URIs and Turtle is broken. Simple as that.

Kingsley

> Alexandre.
>
>> For WebID-based authentication to work, it doesn't need to compromise the virtues of URIs. Just use simple examples to make matters clearer.
>>
>> The solution to the problem is that you don't introduce technology via a technical spec. It's conventionally achieved as follows:
>>
>> 1. conceptual guide and overview
>> 2. technical specs
>> 3. implementation guides and examples -- this is where you can be specific about URLs, Turtle docs, etc., by using them in all the examples.
>>
>> When you start from #2 you are vulnerable to:
>>
>> 1. political distractions -- e.g., format (as opposed to semantics) oriented warfare
>> 2. FUD -- when the abstract nature isn't obvious, those threatened will come at you with FUD.
>>
>> We don't need to compromise the essence of the Web for all of this to work.
>>
>> Remember, HTML wasn't prescribed to the world en route to the WWW bootstrap; the "view source" pattern from early browsers enabled folks to cut and paste what was behind the page (which could have been anything) into new spaces, en route to understanding the implications of fusing Hypertext and TCP/IP.
>>
>> Standards are retrospective things; they are the result of coalescing around what works, so the sequence is always:
>>
>> 1. de facto standard -- common practice
>> 2. industry standard -- accepted best practice.
>>
>> Kingsley

--

Regards,

Kingsley Idehen
Founder & CEO
OpenLink Software
Company Web: http://www.openlinksw.com
Personal Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca handle: @kidehen
Google+ Profile: https://plus.google.com/112399767740508618350/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen
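A minimal sketch, in Python, of the narrowing debated above. The is_webid_uri helper and the example URIs are hypothetical, not anything defined by the WebID spec; the sketch only shows what "restricting WebIDs to http/https schemes" would mean operationally, using a hash URI of the kind the TPAC definition references:

    from urllib.parse import urlparse

    # The narrowed definition under discussion: only http/https WebIDs.
    # ftp, ftps, and every other scheme are excluded purely to keep the
    # spec small and testable, not because they could not dereference.
    ALLOWED_SCHEMES = {"http", "https"}

    def is_webid_uri(uri):
        """Hypothetical check for the narrowed WebID definition."""
        parsed = urlparse(uri)
        return parsed.scheme in ALLOWED_SCHEMES and bool(parsed.netloc)

    print(is_webid_uri("https://example.org/profile#me"))  # True: https scheme, hash URI
    print(is_webid_uri("ftp://example.org/profile#me"))    # False: scheme excluded by the narrowing

In these terms, Kingsley's counterargument is that hard-coding ALLOWED_SCHEMES into the definition itself, rather than into the examples and implementation guides, trades away the scheme-agnosticism that URI abstraction provides.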
Attachments
- application/pkcs7-signature attachment: S/MIME Cryptographic Signature
Received on Thursday, 15 November 2012 21:51:07 UTC