
Re: Another Translator for RDF

From: Kingsley Idehen <kidehen@openlinksw.com>
Date: Fri, 30 Dec 2011 12:57:17 -0500
Message-ID: <4EFDFB7D.6040405@openlinksw.com>
To: Peter Williams <home_pw@msn.com>
CC: "public-xg-webid@w3.org" <public-xg-webid@w3.org>, imitko@openlinksw.com
On 12/30/11 1:59 AM, Peter Williams wrote:
>
> Let's take this rather more slowly (or perhaps have a phone call with 
> folks at your level). This is way over my head. I'm down at compiling 
> against crypt32.dll, marshalling bytes across DLL calls so .NET can 
> call Windows binary crypto and cert libs (from the 1995 era). That's all 
> I'm good for, these days.
>
> To me, I saw a translator of formats, from RDFa serialization to TTL 
> serialization. This was a layer-6 function, from one presentation data 
> value to another, in a different syntax/format. It made output that 
> visually looked just like an old LDIF file. It was very natural. There 
> were facts about the object, and facts about the container. The object 
> has an X.500-like object class (with some schematically correct 
> attributes, with types and syntaxes), and the container has an implied 
> object class (container). Furthermore, the schema of object classes is 
> also in the directory (I mean web). One could even determine that 
> cert:certificate is an improper attribute for the object class of the 
> entry whose distinguished RDN is "me".
>
> This is how I think. It has nothing to do with how I'm supposed to 
> think (which is just too hard for me). But a million others and I can 
> think like this, now that TTL is making it all look like an LDIF file.
>
>
> Then there is the X.500 information model, which gets us into names vs. 
> addresses, and dualities. The X.500 world was easy: white pages with real 
> objects on the heap, and yellow-pages pointers on the stack making 
> references to the heap. That's it. The web is more complex in its 
> identity model, facilitating theorem proving. This is beyond me, these 
> days, however.
>
> When I considered putting the translator URI in the cert's SAN, 
> pointing to a TTL stream - rather than the pointer to an RDFa stream 
> - it was to engender clarity. I was not in fact intending to make 
> an identity statement about two SAN URIs, or about what such double 
> referencing SHOULD/MUST/OUGHT to mean (in an object-theory sense). I 
> think this is what you did, though.
>
> I think I have just learned (and I'm leaping here) that, just as in 
> C++, there is a magic #fragment name called "this". I don't have a 
> literal in my stream named #this, though - any more than I have such a 
> literal in a C++ structure. OK. Wonderful. That makes sense (by analogy 
> with C++). I now have a way to always name what is otherwise an 
> address (causing endless fuss).
>
> So, now I have a collection of SAN URIs in a signed cert blob. The 
> original idea was: Mr. Validator, pick one, any one, and do your 
> dereferencing trick.

Ah! And this is where the fragment identifier crosses paths with GET on 
the wire, right? If so, as indicated, our rewrite rule should still 
understand that you seek the description of the referent of the SAN URI - 
basically, that the URI is a name. We can certainly handle that by fixing 
our rules. The only issue is that others might not be able to do that *so 
easily*, especially if they don't have SPARQL to exploit on the server 
side, as we do.
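For illustration, the name/description split can be written down as a 
single triple (a sketch only; wdrs:describedby is the POWDER-S property, 
and the translator URLs are the ones quoted further down in this thread):

```turtle
@prefix wdrs: <http://www.w3.org/2007/05/powder-s#> .

# The #this URI names the subject; the fragment-less URI addresses the
# document describing it. A GET on the name fetches the description,
# because fragment identifiers never cross the wire.
<http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3#this>
    wdrs:describedby
    <http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3> .
```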

> Oh, and then do a SPARQL query to run a matching rule. In my mind, 
> this was no different from getting the X.500 object at a DN from a DSA 
> (that might chain off elsewhere), and running the X.500 matching rules 
> for the attribute once the entry was provided, where the value of said 
> attribute (cert:key) is a compound ASN.1 type of definition SEQ { int 
> mod, int exp }. The matching rule is able to match values of compound 
> types. You have on this list, in David Chadwick, a world expert in 
> just such definitions. To my mind, what Henry now has is directly 
> equivalent.

Yes.
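For concreteness, the RDF analogue of that X.500 matching-rule setup is a 
profile entry like the following (a sketch only; the cert vocabulary went 
through several drafts around this time, so the cert:modulus / 
cert:exponent property names and the key value shown are illustrative):

```turtle
@prefix cert: <http://www.w3.org/ns/auth/cert#> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .

# The "entry" a verifier dereferences, carrying the compound value
# that the matching rule compares against SEQ { int mod, int exp }
# from the presented certificate.
<http://yorkporc2.blogspot.com/#me>
    cert:key [
        cert:modulus  "c0ffee"^^xsd:hexBinary ;  # placeholder modulus
        cert:exponent 65537
    ] .
```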

>
> But you went further. You invite me, in my yorkporc entry, to assert an 
> owl:sameAs relation (hope that's the right term, and not "property") to 
> what I thought of as an on-the-fly translation of bit formats. But you 
> are saying, in web identity theory: no, go further. It's not just a 
> codec translation. What OPTIONALLY some validator (with the optional 
> OWL reasoner) can do is be more calculating. I can be making 
> equivalence assertions between identity names, and not just locating 
> service endpoints which transform bit formats, much like a UNIX pipeline.

Yes: having multiple URIs in the SAN amounts to a signed coreference 
claim. It also means that on the IdP side the data space can have 
coreference relations, or have all the SAN URIs associated with the 
public key. Going even further, the cert ontology used by WebID simply 
needs a little tweak:

<http://www.w3.org/ns/auth/cert#key> a owl:InverseFunctionalProperty .

Once the above is in place you have two sides of equivalence fidelity in 
play, i.e., equivalence by name or by value. Reasoners can take either 
route to deliver higher-level assurance.
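Spelled out, the tweak plus two SAN URIs publishing the same key is 
enough for a reasoner to derive the coreference by value (a sketch; the 
shared key is represented by a single blank node for brevity):

```turtle
@prefix cert: <http://www.w3.org/ns/auth/cert#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .

cert:key a owl:InverseFunctionalProperty .

# Both SAN URIs publish the same public key...
<http://yorkporc2.blogspot.com/#me> cert:key _:k .
<http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3&html=1#this> cert:key _:k .

# ...so an OWL reasoner can conclude the two names are coreferent,
# i.e. it infers an owl:sameAs relation between them.
```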

>
> OK, I can buy that. And I see how I'm getting what our OASIS friends 
> were offering with the XDI graph model when talking about XRI 
> synonyms on a non-RDF basis.

Yep!

G+, Facebook, etc. are all falling flat on their faces re. what is 
commonly referred to as the Nymwars. None of them comprehends synonyms.

> Here, we OPTIONALLY stay within (I think) the RDF theory of EAV, but 
> for security/key-management purposes we get to apply some of the 
> synonym theory (to help address practical life issues, and some of the 
> needs of security doctrine). Perhaps it will sort out the apparent 
> mess I described in my blog post recently, concerning the 7 layers of 
> identity following in the OpenID world. It becomes something that one 
> CAN reason with.
>
> So, if I add that owl:sameAs relation to the yorkporc2 entity, 
> pointing at (referring to?) the translation service's URI (augmented 
> by the magical #this), then a validation agent that is RUNNING OWL 
> gets to offer a higher-assurance validation logic. This goes beyond 
> (1) the matching rule of the spec, and beyond (2) Bergi's walking 
> a triple collection.

Yes, absolutely!
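Concretely, the single triple that unlocks the OWL route is the 
owl:sameAs relation itself (this restates the corrected form quoted 
further down in this message):

```turtle
@prefix owl: <http://www.w3.org/2002/07/owl#> .

# Coreference by name: an OWL-aware validation agent treats both
# URIs as one subject, beyond simple matching or triple-walking.
<http://yorkporc2.blogspot.com/#me>
    owl:sameAs
    <http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3&html=1#this> .
```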

This is also where the Semantic Web narrative gets into trouble. In the 
beginning it focused on the description-logics aspects of OWL modulo the 
existence of actual data, and compounded the incomprehension by making 
RDF/XML the be-all and end-all of the RDF narrative (even its underlying 
model was invisible).

In recent times, the Semantic Web narrative has done a 180, more or less 
tossing OWL aside and trying to rebuild off Linked Data.

The trouble with this kind of oscillation is that it misses the "magic 
middle". Thus, when people bite (as you are doing now) they eventually 
hit unexpected problem areas. Eventually, for you, it will be: why can't 
I make a signed claim about coreference in my info card, have that 
mirrored in my IdP space, and build on that re. higher-fidelity 
assurances? Re. the WebID spec, there will be push-back using the "keep 
it simple" stance.

The trouble with the "keep it simple" stance is that users and 
implementers aren't really simple people; they just seek "simple 
introductions" and, once engaged, up the ante re. complexity.

I refer to the AWWW endlessly because in its design I see exploitation 
of the "deceptively simple" doctrine, where provision for the complexity 
users and implementers encounter post-engagement is already in place.


> OK. Do we have 3 levels of identity assurance, executed at different 
> validation agents with different levels of sophistication in reasoning?

Yes. I think this is an inevitability.

> (Hmm. I lost a million dollars in the ValiCert startup for certs, 
> because evidently it was 7 years too early in its business plan!)

I can understand how that happened, for sure. The critical infrastructure 
and resulting ecosystem for such an endeavor wasn't in place.
>
> If ODS says it's valid, that is more assurance than if Bergi says it's valid.
>

At this point yes, but Bergi can evolve his verifier :-)

> Provided we augment the spec so the requestor of validation can KNOW 
> which class of assurance was used during validation, I'm sold.

We should certainly have a go at spec language for validators re. this 
matter. Ultimately, it serves WebID very well. In the future, my hope is 
that people will also see it as inheriting the "deceptively simple" DNA 
of the AWWW.

Linked Data and WebID are in my eyes the greatest demonstrators of the 
AWWW and its intrinsic virtues. It's also why I find this recent quote 
so poignant:

"The Web as I envisioned it, we have not seen it yet." -- Tim Berners-Lee


Kingsley
>
> > Date: Thu, 29 Dec 2011 22:28:54 -0500
> > From: kidehen@openlinksw.com
> > To: public-xg-webid@w3.org
> > CC: imitko@openlinksw.com
> > Subject: Re: Another Translator for RDF
> >
> > On 12/29/11 9:12 PM, Kingsley Idehen wrote:
> > > On 12/29/11 12:32 PM, Kingsley Idehen wrote:
> > >> On 12/29/11 12:26 PM, Kingsley Idehen wrote:
> > >>> I meant:
> > >>>
> > >>> <http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3#this>
> > >>> wdrs:describedby
> > >>> <http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3> .
> > >>>
> > >>> Put this in your browser for the implicit indirection effect:
> > >>> http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3#this
> > >> Peter,
> > >>
> > >> seeAlso:
> > >>
> > >> 1.
> > >> http://linkeddata.informatik.hu-berlin.de/uridbg/index.php?url=http%3A%2F%2Frdf-translator.appspot.com%2Fparse%3Furl%3Dhttp%253A%252F%252Fyorkporc2.blogspot.com%252F%26of%3Dn3%23this&useragentheader=&acceptheader=
> > >> -- URI Debugger output
> > >>
> > >> 2.
> > >> http://validator.linkeddata.org/vapour?vocabUri=http%3A%2F%2Frdf-translator.appspot.com%2Fparse%3Furl%3Dhttp%253A%252F%252Fyorkporc2.blogspot.com%252F%26of%3Dn3%23this&classUri=http%3A%2F%2F&propertyUri=http%3A%2F%2F&instanceUri=http%3A%2F%2F&defaultResponse=dontmind&userAgent=vapour.sourceforge.net
> > >> -- another debugger (but it suffers from RDF/XML specificity, so jump
> > >> to its conclusion re. Object/Entity Name / Descriptor Resource
> > >> Address disambiguation).
> > >>
> > > Peter,
> > >
> > > A correction to the above.
> > >
> > > URI in SAN:
> > > <http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3&html=1>#this
> > >
> > > Means that in your (X)HTML you need to add the relation:
> > >
> > > <http://yorkporc2.blogspot.com/#me> owl:sameAs
> > > <http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3&html=1>#this .
> >
> > First off, a cut-and-paste fix for the relation. It should be:
> >
> > <http://yorkporc2.blogspot.com/#me> owl:sameAs
> > <http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3&html=1#this> .
> >
> > >
> > > Please add that to your posts and repeat your tests with our
> > > validator. The relation above is crucial; I completely forgot to
> > > mention it :-(
> > >
> >
> > In addition to the above, if you have two HTTP URIs in SAN:
> >
> > 1. http://yorkporc2.blogspot.com/#me
> > 2. http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3&html=1#this
> >
> > A WebID verifier should really assume that you are claiming that:
> > <http://yorkporc2.blogspot.com/#me> and
> > <http://rdf-translator.appspot.com/parse?url=http%3A%2F%2Fyorkporc2.blogspot.com%2F&of=n3&html=1#this>
> > are co-references for the certificate's subject.
> >
> > Basically, that claim is signed and verifiable. One could go the extra
> > mile on the IdP side and also sign the claim. But that's for another
> > time, once we are beyond the basics :-)
> >
> >
> > Hopefully, no cut-and-paste craziness this time around.
> >
> >
> > Kingsley
> >
> >
> > --
> >
> > Regards,
> >
> > Kingsley Idehen
> > Founder & CEO
> > OpenLink Software
> > Company Web: http://www.openlinksw.com
> > Personal Weblog: http://www.openlinksw.com/blog/~kidehen
> > Twitter/Identi.ca handle: @kidehen
> > Google+ Profile: https://plus.google.com/112399767740508618350/about
> > LinkedIn Profile: http://www.linkedin.com/in/kidehen

-- 

Regards,

Kingsley Idehen	
Founder & CEO
OpenLink Software
Company Web: http://www.openlinksw.com
Personal Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca handle: @kidehen
Google+ Profile: https://plus.google.com/112399767740508618350/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen








Received on Friday, 30 December 2011 17:58:02 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Friday, 30 December 2011 17:58:05 GMT