RE: describing, crawling, https schemes, https and session guards, OAUTH

The OAuth approach is one way, and it will depend on the maturity of your own protocol engine and of the one in Microsoft's Azure STS (which is giving me OAuth now). I will play with this, as I want to qualify the maturity of each. So far I've stayed away from OAuth, not wanting to be in the first wave of attacks. How I take cert:key and make a business out of it is the next issue, for there the money flow starts (which pumps the semantic web up beyond its R&D stage). But the point is that I'm turning the whole "about" (SPARQL DESCRIBE) concept of the semantic web to suit an enterprise-centered security policy. (This is one Henry doesn't grok at all; he just doesn't want such a world to exist, I suspect.) I'm only interested in the semantic web insofar as it is DIFFERENT in a paradigmatic sense, not (yet) because it's a different bit of middleware for app building. First, it has to show that it presents a new plumber's paradise. It has to have invented stainless steel plating, so the world is no longer populated by rusty old iron faucets. It doesn't need to reinvent the faucet itself, though. I think you did that, because of the consistency of the meta-ness. You didn't get side-tracked into web APIs with REST, or into changing how trust should happen in the world, or into any of the other dogmas (which come later, in my world).

Another option is to use VPNs. Azure comes with "cloud-easy" VPN technology for middleware builders. It's NOT the old Cisco layer-2 dial-up concept; it's about hooking a cloud presence to the enterprise presence, so servers in each location can sit on the same subnet. Since it targets Windows programmers (like me, not the most intelligent members of the species), it has to "just work", at controlled cost (typically cheaper than a $100k US person doing Linux), and work with legacy apparatus of several generations. Specifically, it allows cloud-hosted instances in that highly controlled space to talk to the enterprise directory and to enterprise-hosted SQL servers. This lets me put my cert-minting server in the cloud, for example, yet have it talk to N "domains" (actually, unlinked forests), each one minting certs through that central minting service under enterprise group policy, with the certs auto-populated (with SANs) ready for use, as controlled by the domain manager (not the central minting service). I don't need VeriSign to do this kind of scaling for me; I can do it myself now, courtesy of the advanced deployment modes of Windows and Windows Azure.

I could also drop a (simple) endpoint into your server farm (an install that fashions a virtual network adapter), thereby providing a route to resources that are not exposed on internet-visible endpoints. That could give an AUTHORIZED crawler a private endpoint over which it then crawls such authorities. The VPN method would be doing the security enforcement rather than OAuth, based on standard endpoint control. Let's call that what it is: a good old bilateral agreement for information sharing, enforced using endpoint visibility.

But the point is that the cert:key triples that then get published on the web (by the likes of linkeddata.uriburner.com) would only be made available (in their proxy-URI-named form, and on linked data endpoints served by the OpenLink cloud) to those the original document declares. That is, the linked data endpoint MUST enforce this for the triples whose subject is the proxy URI. Once we do this, we start to have commercial security that does not tamper with the semantic web concept and still exploits its paradigm shift.
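To make the OAuth leg concrete, here is a rough sketch (Python, purely illustrative) of an authorized crawler doing a client-credentials token grab and then fetching a protected profile document with the bearer token. The token endpoint, client id/secret, scope and document URL are placeholders I made up, not real Azure STS values; the point is only the shape of the exchange.

    # Sketch: crawler obtains an OAuth 2.0 bearer token (client-credentials
    # grant), then fetches a protected profile document with it.
    # All URLs, ids and secrets are placeholders, not real Azure STS values.
    import requests

    TOKEN_ENDPOINT = "https://sts.example.net/oauth2/token"       # placeholder
    PROFILE_DOC    = "https://authority.example.com/people/card"  # placeholder

    def get_token():
        resp = requests.post(TOKEN_ENDPOINT, data={
            "grant_type": "client_credentials",
            "client_id": "crawler-client-id",          # placeholder
            "client_secret": "crawler-client-secret",  # placeholder
            "scope": "profile.read",                   # placeholder
        })
        resp.raise_for_status()
        return resp.json()["access_token"]

    def crawl_profile():
        token = get_token()
        resp = requests.get(PROFILE_DOC,
                            headers={"Authorization": "Bearer " + token,
                                     "Accept": "text/turtle"})
        resp.raise_for_status()
        return resp.text  # Turtle containing the cert:key triples

    if __name__ == "__main__":
        print(crawl_profile())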
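And here, roughly, is the shape of what the central minting service hands back: a client cert whose SubjectAlternativeName has been auto-populated with a URI the domain manager controls. The real thing is a Windows enterprise CA driven by group policy; the Python 'cryptography' library and every name and URI below are stand-ins used only to show the artifact.

    # Sketch of the minted artifact: a client cert binding a key to a URI via
    # the SubjectAlternativeName extension. Placeholder names/URIs throughout.
    import datetime
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    # Key pair held by the central minting service (the issuing CA).
    ca_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    ca_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"Central Minting CA")])

    # Key pair for an end user in one of the N unlinked forests.
    user_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    user_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"Alice Example")])

    now = datetime.datetime.utcnow()
    cert = (
        x509.CertificateBuilder()
        .subject_name(user_name)
        .issuer_name(ca_name)
        .public_key(user_key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=365))
        # The SAN is auto-populated with the URI the domain manager controls.
        .add_extension(
            x509.SubjectAlternativeName(
                [x509.UniformResourceIdentifier(u"https://forest1.example.com/people/alice#me")]
            ),
            critical=False,
        )
        .sign(ca_key, hashes.SHA256())
    )

    print(cert.public_bytes(serialization.Encoding.PEM).decode())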
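Finally, a sketch of the enforcement rule itself: triples whose subject is the proxy URI only get described to agents the original document declares. This is rdflib plus an in-memory dict standing in for whatever ACL machinery the real (e.g. OpenLink-hosted) endpoint actually uses; treat every URI in it as a placeholder.

    # Sketch: a describe() that only releases triples about the proxy URI to
    # agents the original document declared. ACL store and URIs are placeholders.
    from rdflib import Graph, Literal, Namespace, URIRef

    CERT = Namespace("http://www.w3.org/ns/auth/cert#")

    # Triples published about the proxy URI (the cert:key material).
    proxy = URIRef("https://linkeddata.example.com/about/alice#this")  # placeholder
    g = Graph()
    g.add((proxy, CERT.key, Literal("...base64 RSA public key...")))

    # Agents the original document declared may see those triples.
    acl = {
        proxy: {
            URIRef("https://forest1.example.com/people/bob#me"),       # placeholder
            URIRef("https://crawler.example.net/service#authorized"),  # placeholder
        }
    }

    def describe(subject, requesting_agent):
        """Return the triples about `subject`, but only to declared agents."""
        if requesting_agent not in acl.get(subject, set()):
            return Graph()  # say nothing: the proxy URI stays opaque to outsiders
        result = Graph()
        for triple in g.triples((subject, None, None)):
            result.add(triple)
        return result

    # An authorized agent gets the description; anyone else gets an empty graph.
    bob = URIRef("https://forest1.example.com/people/bob#me")
    print(describe(proxy, bob).serialize(format="turtle"))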
With that in place, we start to have the conditions necessary for decent uses of cert:key, and we move beyond the wikileaks-style fuck-the-world attitude to cert:key. While that's fine and useful as a bootstrap, it has to move on and address more conservative business needs to keep the ball rolling. Otherwise the giant boulder will come to a(nother) rest (sic).

Received on Saturday, 7 January 2012 15:31:53 UTC