Re: issue of initiating client auth for parallel SSL sessionids

The ws-* world has advanced, deployed systems for much of these proxying and delegating flows. These days, the flows give you a choice of SOAP and SAML, or REST and SWT/WRAP.

The browser is not tied only to the client proxy defined by the outgoing firewall. Common flows have the browser traverse a series of proxies, each transforming the claims in the next signed token (the proxy-cert role).
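
A rough sketch of that rewriting step (illustrative only: an SWT-style HMAC token with made-up claim names and keys, not any particular ws-* product's API):

    import base64, hashlib, hmac
    from urllib.parse import parse_qsl, urlencode

    def sign(claims, key):
        body = urlencode(claims)
        mac = hmac.new(key, body.encode(), hashlib.sha256).digest()
        return body + "&HMACSHA256=" + base64.b64encode(mac).decode()

    def rewrite(token, upstream_key, downstream_key):
        # Verify the inbound token, transform a claim, re-sign for the next hop.
        body, sep, mac = token.rpartition("&HMACSHA256=")
        good = hmac.new(upstream_key, body.encode(), hashlib.sha256).digest()
        if not sep or not hmac.compare_digest(base64.b64decode(mac), good):
            raise ValueError("bad token signature")
        claims = dict(parse_qsl(body))
        claims["Role"] = "delegate"  # the transformed claim (the proxy-cert role analogue)
        return sign(claims, downstream_key)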

Now, what benefits do proxy certs in TLS have over tokens in HTTP headers?

None really, unless one seeks to tie token validity to transport-based assurances.

Now, is there a linked-data-specific case that calls for TLS-based delegation?

I don't find the printing example convincing, since it is just as easily solved using token passing/rewriting, ws-* style. We need a more graph-centric example, where graph or data querying needs a service to support it (and that service needs to invoke the user's ID/powers).

On Feb 28, 2011, at 4:27 AM, Henry Story <henry.story@bblfish.net> wrote:

> Thanks a lot for this very helpful post. I read it quickly as I have a deadline for this evening, but will look at it more in the coming days.
> 
> Proxy Certificates were mentioned a couple of times in ISSUE-28: "How does the WebID protocol interact with TLS proxies & firewalls", but no references were provided. Yours are very helpful.
> 
> <aside>
> Looking at the IETF spec, I just saw something interesting 
> 
>   <quote>
>     3.2 Issuer Alternative Name
>     The issuerAltName extension MUST NOT be present in a Proxy Certificate.
>   </quote>
> I suppose the issuer's alternative name could still be found by walking back up the chain to the issuer's own certificate.
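> 
> A hedged sketch of that walk, assuming pyca/cryptography and a leaf-first chain (the function name is mine):
> 
>     from cryptography import x509
> 
>     def issuer_alt_name(chain):
>         # chain[0] is the proxy cert; chain[1] is the EE cert that signed it.
>         return chain[1].extensions.get_extension_for_class(
>             x509.SubjectAlternativeName).value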
> 
> </aside>
> 
> Though this spec explains what proxy certificates look like, I am not sure I understand how they are used. The idea seems to be that the User Agent signs the proxy certificate and gives it certain rights. I don't think this has yet been implemented in browsers, and since that can take a lot of time, it explains why the main interest has come from browser-less use cases such as grid computing or company firewalls. I am not quite clear, though, how proxy certificates work in company firewall situations without changes to the browsers.
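> 
> To make the signing step concrete, here is a minimal sketch of a User Agent minting an RFC 3820 proxy certificate with pyca/cryptography. Everything here is illustrative: user_cert and user_key stand for the user's own end-entity certificate and key, and the DER blob is the pre-encoded "inherit all rights" ProxyCertInfo policy.
> 
>     import datetime
>     from cryptography import x509
>     from cryptography.hazmat.primitives import hashes
>     from cryptography.hazmat.primitives.asymmetric import rsa
>     from cryptography.x509.oid import NameOID, ObjectIdentifier
> 
>     proxy_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
> 
>     # RFC 3820: the proxy's subject is the issuer's subject plus one more CN.
>     subject = x509.Name(list(user_cert.subject) +
>                         [x509.NameAttribute(NameOID.COMMON_NAME, "proxy-1")])
> 
>     # ProxyCertInfo extension (1.3.6.1.5.5.7.1.14), with the
>     # id-ppl-inheritAll policy pre-encoded as DER.
>     pci = x509.UnrecognizedExtension(
>         ObjectIdentifier("1.3.6.1.5.5.7.1.14"),
>         bytes.fromhex("300c300a06082b06010505071501"))
> 
>     now = datetime.datetime.utcnow()
>     proxy_cert = (x509.CertificateBuilder()
>         .subject_name(subject)
>         .issuer_name(user_cert.subject)      # issued by the EE cert, not a CA
>         .public_key(proxy_key.public_key())
>         .serial_number(x509.random_serial_number())
>         .not_valid_before(now)
>         .not_valid_after(now + datetime.timedelta(hours=12))
>         .add_extension(pci, critical=True)   # and no issuerAltName, per 3.2
>         .sign(user_key, hashes.SHA256()))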
> 
> For the firewall/transparent proxy case I put forward a sketch of a proposal that should allow company-issued WebIDs to work: essentially, the firewall would be tied to the company's WebID web service, where it would be able to list its own public key as an identity of the user. Since it would hold the private key matching that public key, it could create a parallel TLS session for each one requested by the client. Of course this would force users behind the firewall to use only company identities. If proxy certificates were to work in browsers, that would allow one to remove the tie from the firewall to the company WebID server, and also ensure the firewall could not pretend to be the user for other purposes - though a rogue firewall would be a pretty bad thing whatever happens.
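> 
> The firewall side of that might look as follows (a sketch using Python's ssl module; the host name and file paths are made up, and the key pair is the one whose public key the firewall listed at the company WebID service):
> 
>     import socket, ssl
> 
>     ctx = ssl.create_default_context()
>     # The client certificate/key whose public key appears in the user's WebID profile.
>     ctx.load_cert_chain("firewall-id.pem", "firewall-id.key")
> 
>     # One such outbound session would be opened per client-requested session.
>     with socket.create_connection(("photos.example", 443)) as raw:
>         with ctx.wrap_socket(raw, server_hostname="photos.example") as tls:
>             tls.sendall(b"GET /pics/ HTTP/1.1\r\nHost: photos.example\r\n\r\n")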
> 
> It seems that there is the idea that proxy certificates could be used more widely to take on the role of OAuth. You connect to a photo printing service, and it asks you for a proxy certificate so that it can fetch, on your behalf, pictures from servers you have access to.
> 
>    ISSUE-4: Detail Authorization "protocol" using WebID
> 
> lists one way to do this [3] using only WebID and current browser capabilities: here you make the photo printing service - assumed to have its own WebID - a "friend" and give it access rights for a limited time on a number of resources. 
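> 
> For concreteness, a sketch of such a grant in the ACL ontology [1], built with rdflib; the WebIDs and resource URI are invented, and the "limited time" part would need a property beyond the core ontology:
> 
>     from rdflib import Graph, Namespace, URIRef
>     from rdflib.namespace import RDF
> 
>     ACL = Namespace("http://www.w3.org/ns/auth/acl#")
>     g = Graph()
> 
>     auth = URIRef("https://me.example/acl#printer-grant")
>     g.add((auth, RDF.type, ACL.Authorization))
>     g.add((auth, ACL.agent, URIRef("https://printer.example/card#me")))  # the service's WebID
>     g.add((auth, ACL.accessTo, URIRef("https://me.example/photos/")))
>     g.add((auth, ACL.mode, ACL.Read))
> 
>     print(g.serialize(format="turtle"))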
> 
> Placing that functionality into the browser, and being able to send proxy certificates, will require a very good policy language to be developed - perhaps something combining the ACL ontology [1] and POWDER [2]. The user interface for letting users express which resources they want to give the intermediate server access to will also have to be very sophisticated if it is to have any chance of working. Perhaps the most important step is first to get those ACL description languages widely adopted with protocols such as the photo printing service [3]. As the Social Web adopts those and explores the use-case and implementation space, browsers will be able to draw on that experience to build a proxy-certificate solution that requires just a small step for people to adopt.
> 
> Henry
> 
>  
> [1] http://www.w3.org/wiki/WebAccessControl
> [2] http://www.w3.org/TR/powder-dr/
> [3] http://blogs.sun.com/bblfish/entry/sketch_of_a_restful_photo
> 
> On 28 Feb 2011, at 01:37, Ryan Sleevi wrote:
> 
>> See RFC 3820, X.509 Proxy Certificate Profile [1]. It's not an overloaded term, and it's the first result in the big three search engines. The impact of proxy certificates, if used for a MITM SSL proxy, is that they put the onus of validating/understanding proxy certificates onto the relying party (Validation Agent), rather than on the proxy. They only work for sites which are configured to accept them (as part of client-certificate processing). This may or may not be acceptable for the protocol at large, but it shows how one might deal with the problem. It's not a solution I'm necessarily advocating as a good one, but given the concern about MITM proxies and the (scary) idea of storing the WebID private key on the proxy itself, I was wondering if it had been broached yet. A nice further read about them is at [2].
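>> 
>> To show where that onus lands, a sketch of a relying party opting in (OpenSSL rejects chains containing proxy certs by default; Python 3.10+ exposes the relevant flag - paths here are illustrative):
>> 
>>     import ssl
>> 
>>     ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
>>     ctx.load_cert_chain("server.pem", "server.key")
>>     ctx.verify_mode = ssl.CERT_REQUIRED               # request a client cert
>>     ctx.load_verify_locations("accepted-issuers.pem")
>>     # Without this flag, a chain containing an RFC 3820 proxy cert
>>     # fails validation (OpenSSL's X509_V_FLAG_ALLOW_PROXY_CERTS).
>>     ctx.verify_flags |= ssl.VERIFY_ALLOW_PROXY_CERTS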
>>  
>> [snip interesting paras on ephemeral ciphersuites, and on reason of adoption of RSA]
>>  
>> All that said, if the WebID protocol continues to use TLS client authentication, then it must be expected/known that transparent SSL proxies won't work. The advice from vendors of such products (Bluecoat, Microsoft's Forefront TMG, etc.) is: if you need to perform TLS client auth, add the site to the exclusion list of sites that are not transparently filtered [3]. This is because such transparent proxies knowingly "break" the protocol, and client auth is one area where they're especially broken.
>>  
>> If the WebID protocol needs to work through such (malicious) proxies without requiring the proxies to be modified, which seems implied if WebID is meant to be cheaply deployed widely, the options I see are:
>> 1) Don't use TLS client authentication. Use some other means independent of TLS for identification, although presumably still securing the entire request/response with TLS.
>> 2) Work with the vendors to define some new protocol for allowing semi-transparent TLS interception while performing client auth. Good luck with that.
>>  
>> Hope that helps,
>>  
>> [1] http://www.ietf.org/rfc/rfc3820.txt
>> [2] http://security.ncsa.illinois.edu/research/wssec/gsihttps/
>> [3] http://blogs.technet.com/b/isablog/archive/2009/10/19/common-problems-while-implementing-https-inspection-on-forefront-tmg-2010-rc.aspx
>>  
>> From: peter williams [mailto:home_pw@msn.com] 
>> Sent: Sunday, February 27, 2011 6:38 PM
>> To: 'Ryan Sleevi'; public-xg-webid@w3.org
>> Subject: RE: issue of initiating client auth for parallel SSL sessionids
>>  
>> My advice is explain proxy certs.
>>  
>> I’ve tried to introduce ephemeral certs (in SSL ciphersuites whose cipher nature exploits them). But most folks are pretty set in their thinking, doing 1990s-era HTTPS with just classical RSA and RC4 stream ciphering.
>>  
>> And I’ve tried hard to introduce SSL MITM proxies (client-side or reverse) as a threat posed to “just” the secure-communications aspects of the WebID protocol (never mind caching, interference, etc.).
>>  
>> TBH, I don’t know what you mean by proxy certs, since the term “proxy” is so overloaded.
>>  
>> I spent the last hour or two making “proxy certs” in GnuTLS, which seem to be about some old experiments in delegation and computable/composable policy expressions stuffed into a cert extension. This seems to align with your text. If so, no – it has not been a topic of discussion.
>>  
>> We have touched on the topic of having “javascript” in a cert extension (rather than some policy language), and we have touched on dumping X.509/ASN.1/DER/PKIX and just using JSON-signed/encoded datums instead.
>>  
>> But I think there is some receptivity to saying: WebID might leverage signed JSON/JavaScript certs, should they exist (since they are “so webby”). But they don’t really exist yet. The history of the movement over the last 5 years is tied to the goal of working with actual browsers (which ties one to X.509). If signed JavaScript/JSON came fast, I think it might be a different group.
>>  
> 
> Social Web Architect
> http://bblfish.net/
> 
