RE: social recasting and webid - an aside on RE: Mandatory client supported serializations

Let me make sure I deliver the message. It tends to get lost in the technical know-how.

 

I can use a webid (i.e. an endpoint) that is more than a sparql query engine's endpoint (handling queries embedded in a URL querystring argument of the webid URI). The output can be in HTML (or another) format, and is itself a webid profile.

 

The particular sparql server can be more than a mere "remote evaluator" of queries over triples for some named subject. It can manufacture a webid profile on the fly, by managing multiple sources of triples about said subject, using additional document sources that are deemed equivalent to that of the canonical URI at the base of this.

 

To be specific, I can direct an ASK query, as specified in the webid validation agent spec, at a special kind of webid profile, constructed on the fly in the working memory of uriburner. Given a suitable URI, the agent pulls the source document from the canonical URI (say some blogspot RDFa triples, transformed on the fly into RDF/XML by an RDFa-transformer service) and makes a unique store in working memory. To the store it can add triples pulled in real time from related sources, as declared by the owl:sameAs predicates attached to the webid subject in the original stream (or a cached copy thereof, in virtuoso's crawling data space). Said sources may "augment" the data delivered on the wire from the original source.
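That flow can be sketched in pure Python (no RDF library; triples as plain tuples, with fetch_triples standing in for the real dereference-and-translate step — all names here are illustrative, not part of any product's API):

```python
# Minimal sketch of the "working-memory mashup" described above: triples are
# plain (s, p, o) tuples and fetch_triples stands in for the real
# dereference + RDFa-to-RDF/XML translation step.

OWL_SAME_AS = "http://www.w3.org/2002/07/owl#sameAs"

def build_profile_store(canonical_uri, fetch_triples):
    """Union the canonical document with each owl:sameAs-equivalent source."""
    store = set(fetch_triples(canonical_uri))
    seen = {canonical_uri}
    # Follow owl:sameAs links found in the original stream (one level deep
    # keeps the sketch simple; a real sponger may crawl further).
    for s, p, o in list(store):
        if p == OWL_SAME_AS and o not in seen:
            seen.add(o)
            store |= set(fetch_triples(o))
    return store

# Toy sources: a blog profile declaring an equivalent URI that contributes a key.
SOURCES = {
    "http://blog.example/#me": [
        ("http://blog.example/#me", OWL_SAME_AS, "http://other.example/#me"),
        ("http://blog.example/#me", "http://xmlns.com/foaf/0.1/name", "Peter"),
    ],
    "http://other.example/#me": [
        ("http://other.example/#me", "http://www.w3.org/ns/auth/cert#key", "k1"),
    ],
}

store = build_profile_store("http://blog.example/#me", SOURCES.__getitem__)
```

The augmentation is just a set union: the equivalent source's cert:key triple ends up in the same working store as the canonical document's triples.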

 

In particular, the equivalent sources providing "additional triples" may in practice deliver additional cert:certificate predicates and values (i.e. x.509 certs) to the "mashup" in the working store. If the other sources are N CA endpoints about #me, identified using owl:sameAs, with each minting, signing, and chaining a particular cert about a _common_ public key, I have a distributed cert store. It comes into being, in practice, as one de-references the particular sparql URI, along with its embedded query, with the markup that induces all this behaviour.
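A toy check of that "distributed cert store" property — several certs asserted about one subject, all wrapping a common public key. Triples are plain (s, p, o) tuples, and key_of is a hypothetical helper extracting the public key from a cert value (a real agent would parse the x.509 structure):

```python
# Several certs about one subject, one _common_ public key across all of them.
# key_of is a hypothetical stand-in for real x.509 public-key extraction.

CERT = "http://www.w3.org/ns/auth/cert#certificate"

def is_distributed_cert_store(store, subject, key_of):
    certs = [o for s, p, o in store if s == subject and p == CERT]
    keys = {key_of(c) for c in certs}
    # Distributed cert store: at least two certs, exactly one public key.
    return len(certs) >= 2 and len(keys) == 1
```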

 

If we can put a simpler proxy URI (that does document cleanup) into the SAN URI and FOAFSSL could handle it, I don't see why I cannot put such a sparql query into the SAN URI, too, and present it to FOAFSSL, expecting it to work.
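Mechanically, placing a query in the SAN is just percent-encoding an ASK into a SPARQL-protocol URL; a sketch (the endpoint path and profile URI are illustrative):

```python
# Sketch: encode an ASK query into a SPARQL-protocol URL of the kind proposed
# above for a certificate's subjectAltName field.
from urllib.parse import parse_qs, urlencode, urlparse

def san_query_uri(endpoint, ask_query):
    # urlencode percent-escapes the query, including any nested URLs inside it.
    return endpoint + "?" + urlencode({"query": ask_query})

ASK = "ASK FROM <http://example.org/profile> WHERE { ?s ?p ?o }"
uri = san_query_uri("http://uriburner.com/sparql", ASK)
```

A validation agent that dereferences such a URI recovers the original query unchanged, as a round-trip through parse_qs confirms.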

 

I noted that it seemed possible to have uriburner create a "tiny URI" that stored such a query, rather than stuffing it into a long URI in the SAN field. I will try to learn how to do that now. At the same time, it makes sense, in an assurance sense, to have the query in the signed cert, rather than relying on the redirector.

Date: Sat, 31 Dec 2011 13:27:01 -0500
From: kidehen@openlinksw.com
To: home_pw@msn.com
CC: public-xg-webid@w3.org
Subject: Re: social recasting and webid - an aside on RE: Mandatory client supported serializations

 On 12/31/11 1:03 PM, Peter Williams wrote: 
I'm going to edit my little validator agent experiment today, to make further use of the linkburner sparql-protocol endpoint, doing remote query execution in the course of enforcing the SSL handshake protocol. 
You mean: URIBurner, a Linked Data service that we offer to the public :-)


I assume it's owl-enabled, with a reasoner.
 You add this pragma to your SPARQL:
 DEFINE input:same-as "yes"

 Example from prior queries:

 ## owl:sameAs pragma. Comment out if you don't seek owl:sameAs reasoning applied to the SPARQL query .

 DEFINE input:same-as "yes"
 ## pragma use end

 ## HTTP GET (sponging) pragma, use "add" for progressive named graph population, so you can add relations to the source without overwriting 
## the local graph used by the Virtuoso server. Note, this is also a default cache invalidation scheme override

 DEFINE get:soft "add" 

## pragma use end. 

PREFIX : <http://www.w3.org/ns/auth/cert#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

ASK FROM <http://rdf-translator.appspot.com/parse?url=http%3a%2f%2fid.myopenlink.net%3a80%2fdataspace%2fperson%2fhome_pw&of=xml> WHERE {
  ?s :key [
    :modulus "c531b19280ed0e1a64d9cf327801296366657325ff08a35c93b406293429415430d6d832fa3694f05d05ace8a2ac95db5147feb1c19bc5eb7a80aedc510b79bbbe2ddce7badd9d00a36566445bba5065f66478ac2c4c24e1e8869f0a6eb7b9feef54a194c4f1e77d1918662967f02878e0f27e6880f93a1c32feac1a0861f349"^^xsd:hexBinary ;
    :exponent ?exp
  ] .
}
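The comparison this ASK effectively performs — matching the modulus of the key presented in the TLS handshake against the profile's cert:modulus value — can be sketched as a hex-string check. This is a sketch only: a real agent may instead compare the values as integers, which is what makes case, separators, and leading zero octets irrelevant.

```python
# Sketch of the modulus check behind the ASK query: compare the hexBinary
# modulus from the presented certificate with the profile's value, ignoring
# case, whitespace/colon separators, and leading zero octets.

def modulus_matches(cert_modulus_hex, profile_modulus_hex):
    def canon(h):
        return h.replace(" ", "").replace(":", "").lower().lstrip("0")
    return canon(cert_modulus_hex) == canon(profile_modulus_hex)
```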



I will need help formulating working queries that exercise some query ...working over 2 profiles, named using 2 SAN URIs from 1 cert, with 1 key. 
Your idp space simply needs to have owl:sameAs relations in place and the reasoner will handle the query solution. 



Your linkburner sparql server has an option to auto-pull owl:sameAs data from the triples *received* from the securely-named endpoint, when it's "useful" to do so, in the course of preparing data for local querying.
 The pragma: define get:soft "{add | replace | soft}" handles resource retrieval modalities.
 The pragma: define input:same-as "yes" handles enabling or disabling of the OWL reasoner. 


I want to see the utility of what we have been discussing. I want (as the world's worst programmer) to be able to do it, so we know it's real for all the class (not just those who get As).
 Great. 

Also note you could also test the inversefunctional property relation option by declaring cert:key as being an IFP and then making a custom inference rule that's invoked by a pragma. For the URIBurner instance, I'll have to set up that rule once you are ready to test that scenario. All you have to do is associate two URIs with the same public key in your idp space modulo owl:sameAs relation. 
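The inverse-functional-property idea above can be illustrated with a toy grouping: if cert:key is declared an IFP, any subjects sharing a key value denote the same individual, so grouping URIs by key yields the inferred owl:sameAs cliques (a sketch, not Virtuoso's actual rule machinery):

```python
# Toy IFP reasoning: group subject URIs by shared cert:key value; each bucket
# with more than one member is an inferred owl:sameAs equivalence clique.
from collections import defaultdict

def ifp_equivalence_classes(key_triples):
    """key_triples: iterable of (subject_uri, key_value) pairs."""
    by_key = defaultdict(set)
    for subject, key in key_triples:
        by_key[key].add(subject)
    return [uris for uris in by_key.values() if len(uris) > 1]
```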


Then the rest of the world can copy it. Using a remote sparql protocol server for this seems ideal, since it's now acting as a trust resolver (leveraging equivalence relations). Using a sparql server guarded by webid validation is even better, as a "system" of trusted agents starts to come into being.
Yes, we can WebID-protect the sparql endpoint. Basically, that's what we'll soon enact re. id.myopenlink.net; we've been waiting for an external QA tester like you for a few years now!


 

I want to be able to test for membership of two profiles in a given equivalence class. The class itself will be distinguished using a data URI with certain integrity properties that detect tampering, and webid validation agents (and the SSL handshake) will prove that the profiles do participate in that particular equivalence class. Since the class apparatus is based on the canonical nature of the public key in the data URI, it obviously scales to any size of equivalence set we want, to suit the million-person world of (wholly de-centralized) realty trust networks based on the individual (and her 20 local friends, neighbors, and business associates).
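One way such a tamper-evident class identifier could work is to derive it from a digest of the canonical public key, so any alteration of the key changes the URI. This is purely a sketch of the integrity property described above; the "di:" (digest URI) scheme used here is illustrative, not a requirement of any webid spec:

```python
# Sketch: a class identifier derived from a SHA-256 digest of the canonical
# public key bytes. Deterministic for the same key; changes if the key does.
import base64
import hashlib

def integrity_uri(public_key_der: bytes) -> str:
    digest = hashlib.sha256(public_key_der).digest()
    return "di:sha-256;" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
```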
 This is the game, right here!



-- 

Regards,

Kingsley Idehen 
Founder & CEO 
OpenLink Software 
Company Web: http://www.openlinksw.com
Personal Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca handle: @kidehen
Google+ Profile: https://plus.google.com/112399767740508618350/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen 		 	   		  

Received on Sunday, 1 January 2012 21:13:19 UTC