Re: New WebID spec on identity.

On Nov 19, 2012 1:09 PM, "Kingsley Idehen" <kidehen@openlinksw.com> wrote:
>
> On 11/19/12 12:00 PM, Andrei SAMBRA wrote:
>>
>>
>> On Mon, Nov 19, 2012 at 11:49 AM, Kingsley Idehen <kidehen@openlinksw.com>
wrote:
>>>
>>> On 11/19/12 10:23 AM, Henry Story wrote:
>>>>
>>>>
>>>> I have updated the picture and put Tim Berners Lee as the example.
>>>> I think it is really important to have a real person be the reference
of the WebID for explanatory reasons. People need to be able to do an http
GET on a real URI and see that it actually does work. They must also know
that the person in the real world exists, because otherwise we have to
create a fictional character, and there will be a tendency for that
fictional character to be thought of as just a diagrammatic person - making
it difficult to help people distinguish between symbolic elements and real
elements.
>>>>
>>>> Henry
>>>
>>>
>>> Now that we have the depiction in place, it's really important to use
this context to explain *indirection*.
>>>
>>> Note: the URIs in this document should be user agent accessible. Right
now, I can't access TimBL's WebID: <
http://www.w3.org/People/Berners-Lee/card#i> as shown via:
http://linkeddata.uriburner.com/about/html/https/dvcs.w3.org/hg/WebID/raw-file/tip/spec/identity-respec.html.
If done right, his URI/WebID would be exposed a value of sioc:links_to
property.
>>>
>>> Back to indirection.
>>> When used in the Linked Data context, a hash URI uses *implicit*
indirection to enable critical look-up association between a URI that
denotes an entity and the URL used to locate said entity's description
document. The same thing happens re. DBpedia's hashless URIs, but the
indirection is *explicit* and requires the user agent to handle 303
redirection to the URL of the entity description document. This is all
about abstraction and data access by reference. While the aforementioned
pattern is old, HTTP really brings it to the masses in manner that's a lot
easier to appreciate.
>>>
>>> An Identity Provider (the issuer of X.509 certificates) SHOULD be able
to mint hash or hashless HTTP URIs re. WebIDs placed in the SAN slot of an
X.509 certificate. That's the pattern in broad use today re. Linked Data,
as exemplified by most of the LOD cloud.
>>
>>
>> You're going back on what we've agreed on in the last teleconf.
>
>
> No I am not.
>
> Ted: made it clear we are raising an issue re. this matter. Please note,
in the transcript I made it crystal clear that my +1 was for an issue to be
opened.
>
>> The consensus was that all NEW WebIDs MUST contain the hash, but
verifiers should not fault on hashless ones.
>
>
> Yes, that's the essence of the matter.  SHOULD NOT FAULT is the heart of
the matter.
>
>
>> It's been marked in red as an issue, but it's difficult to spot at this
point (no HTML markup when looking at the hg raw file).
>
>
> Hence my response, if it isn't visible the instinct would be to lose the
critical nuance re. verifier behavior.

It was correctly marked, but it currently doesn't show up as such because
the js overlay is not applied to the hg raw file.

>>
>>>
>>> An Identity Verifier (what performs WebID authentication e.g., over
TLS) needs to be able to simply de-reference an HTTP URI (as other user
agents do e.g., browsers, curl etc.) . Having them only look for hash based
HTTP URIs is an unnecessary limitation.
>>
>>
>> Maybe this is something we should discuss further. How do we process
WebIDs? We could open an issue for it.
>
>
> It is ultimately affected by the definition of a WebID. If the definition
goes wrong we are going to repeat this process all over again when defining
the TLS based authentication protocol. Unfortunately, this issue is
inescapable which is why it needs to be sorted out right now. There is a
reason for my insistence.
>
>
>>
>>>
>>> A profile document publisher (who doesn't have to be an IDP per se.)
SHOULD be encouraged to use hash based HTTP URIs to denote entities
described by its profile documents since this style of URI inherits the
deployment cost effectiveness associated with *implicit* indirection re.,
Linked Data deployed using hash HTTP URIs.
>>>
>>> All:
>>>
>>> These nuances are important. The thing to be prevented, above all else,
is having WebID over TLS based verifiers coded to parse for hash based HTTP
URIs instead of HTTP URIs. This also means not treating 303 as a fault
since that's all about *explicit* redirection which can be used for the
very indirection required by the Linked Data concept.
>>>
>>> The performance headache (real or perceived) shouldn't be the basis for
making this kind of decision.
>>>
>>> Examples of the importance of these issues re. interoperability:
>>>
>>> 1. hashless URIs enable simply integration of Facebook, Twitter,
LinkedIn, and many other Web 2.0 data spaces into Linked Data -- today, any
Facebook, LinkedIn, Twitter etc. user can acquire a fully functional WebID
that verifies with the WebID authentication protocol via the click of a
button
>>
>>
>> Facebook has hash URIs. A _billion_ hash URIs.
>
>
> That's and over simplification that leads to problems down the line. To
save time, how do those Billion plus hash based WebID hook into the TLS
based authentication protocol?

They are valid WebIDs, though not using WebID-TLS at this point. I guess
you could consider them as using WebID-facebookConnect. That's the reason
why we've decoupled the authentication bit. :-)

Andrei

Here are the key issues here:
>
> 1. lip service to Facebook Turtle in the context of WebID is useless if
we can't use these profiles as part of the TLS authentication protocol
> 2. how do you get a public key into a Facebook profile document?
> 3. how do you achieve the above as a Facebook end-user as part of an
effort to regain control of your identity via TLS based authentication?
> 4. how do you build a FB end-user service that aids #3 ?
>
>
> As I said, for FB's turtle based profile docs to be useful and relevant,
you need a solution to the items above. We built that solution a while
back, and it requires hashless proxy/wrapper URIs. It's a middleware
solution that leverages URIs and HTTP.
>
>>
>>
>>>
>>>
>>>
>>> 2. there are already numerous WebIDs out in the field that are hashless
.
>>>
>>> The cost of hash specificity is too high and the reward too low. There
is a middle line that will work fine for everyone.
>
>
> So you end up with lip service claims re. FB profiles or you end up with
an immediately useful solution that makes the WebID and FB association
immediately meaningful and useful.
>
> To see what I mean, just go to: http://id.myopenlink.net/certgen . Click
on the Facebook button, See the end result i.e., test it with any of
today's WebID verifiers.
>
> Links:
>
> 1. http://bit.ly/U9HLEe -- Using Facebook as an Identity Provider for the
WebID authentication protocol .
>
> Kingsley
>>>
>>> --
>>>
>>> Regards,
>>>
>>> Kingsley Idehen
>>> Founder & CEO
>>> OpenLink Software
>>> Company Web: http://www.openlinksw.com
>>> Personal Weblog: http://www.openlinksw.com/blog/~kidehen
>>> Twitter/Identi.ca handle: @kidehen
>>> Google+ Profile: https://plus.google.com/112399767740508618350/about
>>> LinkedIn Profile: http://www.linkedin.com/in/kidehen
>>>
>>>
>>>
>>>
>>>
>>
>
>
> --
>
> Regards,
>
> Kingsley Idehen
> Founder & CEO
> OpenLink Software
> Company Web: http://www.openlinksw.com
> Personal Weblog: http://www.openlinksw.com/blog/~kidehen
> Twitter/Identi.ca handle: @kidehen
> Google+ Profile: https://plus.google.com/112399767740508618350/about
> LinkedIn Profile: http://www.linkedin.com/in/kidehen
>
>
>
>

Received on Monday, 19 November 2012 21:05:14 UTC