Incubating profiles, and WebID-powered SPARQL server endpoints as trust network "routers"

Permarants aside, I see a rosy future. It's due to you, and also to others staying focused on the social agenda. I see some incubating going on (finally). We are no longer pretending to be a working group writing a spec that magically solves ALL the world's security problems. I see Henry trying to find a lowest common denominator, within ten years of technology development that already carries several (often conflicting) generations of dogma and practice, so that the core concept (the W3C mission) gets adopted. WebID is not the goal for him; it's the forcing function. These days he is clearly prepared to compromise (and at least is working on the LCD). This was my original deal: I'll help semweb get institutionalized (and out of library/scholarship/object theory), and client certs get another chance. I think the work he is doing to define how a modern (AJAX-era) server expects to interact with SSL client authn is great, as browser and CGI behaviours around SSL session IDs are stuck in a rut older than semweb. He is actually going back to SSLv3 traditions, before IETF made TLS (which reconceptualized SSL as a layer-4 channel for kernels/stacks rather than a hypermedia support tool). Several things are on the one hand getting dumbed down (for consumers) so things can go mainstream, and on the other hand updated (ready for the modern age, post smartcards, NSA Fortezza cards, and Rivest digital signatures for legal purchases).

Now, proxy profiles are interesting to me (and you). We are focused on that bit of OWL that is evidently useful for simple synonym management. I feel I'm getting XRI/XDI back (from W3C sources), with minor variations in terminology and delivery. The point is, name-identifier management is being addressed, firmly, so the lifecycle issues SO IMPORTANT to viable large-scale security are addressed.

In the phone world, endpoints with various names register with gatekeepers, which do call routing. Endpoints register in zones (possibly multiple of them), and zone-based routing happens. In some cases there is inter-zone routing, in which routing messages chain off between gatekeepers, much like X.500 servers can chain off a request to a peer, once located. (Hey, one might even start a career securing such chaining protocols, with Fortezza cards.) I see multi-SAN certs acting in THAT kind of world. Opening up a cert store in a browser is like registering that browser endpoint with the gatekeeper, which can now route me into trust networks (vs. phone-call channel networks). I want multiple SANs because I want to register with several trust domains; see the sketch below. Who is the natural gatekeeper here? Kingsley's SPARQL protocol server, already armed with WebID client-cert authn and authz. The analogy is apt.

It was fun playing with ODS (and looking at it from modern goals), and at some point seeing my proxy profile present facts listing me as POTENTIALLY linked to several other Peter Williamses, in gmail+. I started to see trust networking happen (Facebook style). But I have a split personality. With one cert (and multiple SAN URIs), I want trust network A to be in effect for SAN.A, and not B. So I want to register with several gatekeepers (i.e. SPARQL servers' WebID/HTTPS sessions), so my "call" can be routed differently, through different bridging points, ending up with a different trust chain suiting the (security label) requirements expressed in the endpoint FOAF cards.
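To make the multi-SAN point concrete, here is a minimal sketch (my own illustration, with made-up URIs and names, using the Python cryptography package) of a self-signed client cert carrying two WebID URIs in its SAN:

    # Minimal sketch: a self-signed client cert with two WebID URIs in the SAN.
    # Hypothetical URIs; real tooling would reuse a key minted in the browser.
    import datetime
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"Peter Williams (WebID)")])

    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)  # self-signed, which WebID tolerates
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(datetime.datetime.utcnow())
        .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
        .add_extension(
            x509.SubjectAlternativeName([
                # one identity per trust domain I want to register with
                x509.UniformResourceIdentifier(u"https://trust-a.example/people/peter#me"),
                x509.UniformResourceIdentifier(u"https://trust-b.example/profile/card#me"),
            ]),
            critical=False,
        )
        .sign(key, hashes.SHA256())
    )

Each gatekeeper (a SPARQL endpoint doing WebID authn/authz) would presumably dereference whichever SAN URI falls in its own trust domain and ignore the rest.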
Date: Wed, 28 Dec 2011 13:24:38 -0500
From: kidehen@openlinksw.com
To: public-xg-webid@w3.org
Subject: Re: neither FCNS nor FOAFSSL can read a new foaf card (hosted in Azure). RDFa validators at W3C and RDFachecker say its fine...

    On 12/28/11 12:54 AM, Peter Williams wrote:

        > What URIBurner (a Virtuoso instance with its Linked Data Deployment
        > middleware module enabled) does is generate a proxy/wrapper Linked Data
        > URI because of the ambiguity it detects when dealing with the URIs
        > used in your Cert's SAN.

        Aha. When running SPARQL, sponging cleans up (formal) ambiguities. If
        there were no ambiguity, the SPARQL result set would have the original
        URI as the subject name. When forced to clean up, a new profile is
        screen-scraped and given a wrapper URI. In such cases, the SPARQL is
        run against a local RDF store populated with triples from only the
        proxy profile.

        On running the SPARQL queries against my TTL card, it had no
        ambiguities, and no proxy profile page was auto-mounted as a store
        for the query.
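        A rough sketch of that ambiguity test, using rdflib and made-up
        URIs (URIBurner's real sponging pipeline is of course much richer
        than this):

            # Does the claimed profile document actually describe the SAN URI?
            # If yes, SPARQL results can keep the original URI as subject; if
            # not, a sponger would re-host the scraped triples under a
            # proxy/wrapper URI and query that local store instead.
            from rdflib import Graph, URIRef

            san_uri = URIRef("https://trust-a.example/people/peter#me")  # hypothetical cert SAN

            g = Graph()
            g.parse("https://trust-a.example/people/peter", format="turtle")  # claimed profile doc

            if (san_uri, None, None) in g:
                print("unambiguous: queries keep the original URI as subject")
            else:
                print("ambiguous: a proxy profile (wrapper URI + local store) gets generated")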

        In all cases (originally) I used the RDFa from the spec (which had
        relative names). And, normally, my SAN URIs bear the fragment (except
        when I'm stressing others' implementations).

        When posted to Blogger, validating sites could read those RDFa FOAF
        cards (embedded in wider content). When posted to my own site, those
        sites cannot work with the same streams. I will assume that somehow
        Blogger's template cleans up the suggested RDFa, making it palatable.
        Posted directly as a raw stream on a web site endpoint, the spec's
        suggested RDFa causes interoperability issues, even though it
        validates and appears to be sensible RDFa.
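        One hedged guess at why the relative names matter: a relative
        about="#me" only becomes an absolute WebID against whatever base
        URI the consuming parser applies, so the serving location matters.
        A tiny illustration with made-up URLs and Python's urljoin:

            # The same relative "#me" yields different absolute URIs depending
            # on the base of the page serving the RDFa (hypothetical hosts).
            from urllib.parse import urljoin

            rdfa_about = "#me"  # the relative name from the spec's sample RDFa

            blogger_page = "https://someone.blogspot.com/2011/12/foaf-card.html"
            raw_stream = "https://someone-site.example/foaf-card"

            print(urljoin(blogger_page, rdfa_about))  # ...foaf-card.html#me
            print(urljoin(raw_stream, rdfa_about))    # ...foaf-card#me

        If the absolute form ends up differing from what the cert's SAN
        claims, the identity check would silently miss, even though the
        RDFa itself validates.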

    Sorta. The problem is that in the Linked Data realm, saying what you
    mean is important. Once ambiguity is introduced (in this particular
    context) a process can opt to stay confused or attempt to sanitize.
    Ultimately, this boils down to Object Theory and the ability to
    leverage equivalence fidelity, based on the principle that an Object
    has an Identity that is distinct from its Representation (values).

    

    What remains a challenge re. Linked Data and the Web is that the
    narrative for explaining how HTTP URIs have been applied to age-old
    Object Theory remains confusing to many. This is why I have a
    #permarant going re. RDF and the lack of genealogy in its narrative.
    Many treat RDF as the Object Theory progenitor without any cross
    references to Lisp, EAV, Linked Data Structures and the like from
    the past. 

    

    Links:

    

    1. http://lod.openlinksw.com/describe/?url=http%3A%2F%2Fdbpedia.org%2Fresource%2FObject_theory
       -- Object Theory

    2. http://lod.openlinksw.com/describe/?url=http%3A%2F%2Fdbpedia.org%2Fclass%2Fyago%2FTheoriesOfDeduCtion
       -- Theories of Deduction

    3. https://plus.google.com/112399767740508618350/posts/U4u4FBmLnvx
       -- G+ post about Distributed Data Objects

    4. https://plus.google.com/112399767740508618350/posts/WjLcqFjUtWJ
       -- Conceptual Hierarchy of Data Objects.

    -- 

Regards,

Kingsley Idehen	      
Founder & CEO 
OpenLink Software     
Company Web: http://www.openlinksw.com
Personal Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca handle: @kidehen
Google+ Profile: https://plus.google.com/112399767740508618350/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen

Received on Wednesday, 28 December 2011 18:47:48 UTC