- From: Jon Ferraiolo <jferrai@us.ibm.com>
- Date: Tue, 18 Dec 2007 08:55:08 -0800
- To: "Doyle, Bill" <wdoyle@mitre.org>
- Cc: "Jonas Sicking" <jonas@sicking.cc>, "WAF WG (public)" <public-appformats@w3.org>, public-appformats-request@w3.org
- Message-ID: <OFB696C6BF.5F30D8AF-ON882573B5.005AC2BB-882573B5.005CF045@us.ibm.com>
Hi Bill,

We have had considerable discussion at OpenAjax Alliance (within the Security Task Force) about the security implications of Access Control. Personally, I share your concerns and opinions, especially about the way the server delegates access control enforcement to the client browser, and the way the data is sent all the way to the client, which is then trusted to discard it if access is not allowed. But so far no one at OpenAjax Alliance has objected to the security aspects of the Access Control specification.

While we believe that Access Control might help malicious sites exploit vulnerabilities on insufficiently protected servers, because it provides a more convenient means for malicious sites to do bad cross-domain things such as CSRF, most of the OpenAjax members have accepted the argument that those servers are already exposed and already need to be fixed. Another reason we haven't objected to the security aspects is that servers have to "opt-in" (by either adding a processing instruction at the top of the XML file or adding a new HTTP header to the response). The bottom line is that if my company has any web services that deliver restricted information, I would implement my server with server-side authentication and include various CSRF protection mechanisms, such as session tokens that don't get saved in cookies, and I wouldn't opt in to Access Control for those scenarios. So we haven't been able to identify any reason that Access Control would harm web security any more than what exists today. Therefore, no big negatives.

However, personally I'm not sure how much Access Control will get adopted, either within browsers (e.g., will IE implement it?) or in conjunction with web services (how many sites will choose to opt in?). As a result, I'm not convinced about how much benefit the industry will get from Access Control. The good news is that it looks like the good folks at Mozilla are implementing Access Control in Firefox 3, so we can learn from real-life experimentation once FF3 hits the streets, on both the negative side (e.g., have any security problems come up?) and the positive side (e.g., is anyone opting in?).

Jon
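For illustration, the opt-in Jon describes would have looked roughly like the following under the Access Control drafts of that period; the exact attribute names and header syntax varied between drafts, so this is a sketch rather than normative text, and the hostname is only a placeholder. An XML resource would opt in with a processing instruction at the top of the document:

    <?access-control allow="requesting-site.example.com"?>

and a server would opt in for other responses with an HTTP response header along the lines of:

    Access-Control: allow <requesting-site.example.com>

A server that emits neither stays outside the mechanism entirely, which is the basis of the "no harm beyond what exists today" argument above.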
From: "Doyle, Bill" <wdoyle@mitre.org>
Sent by: public-appformats-request@w3.org
Date: 12/18/2007 06:37 AM
To: "Jonas Sicking" <jonas@sicking.cc>, "WAF WG (public)" <public-appformats@w3.org>
Subject: RE: comments on access control for cross-site requests - WSC member

Hi Jonas, thank you for the response.

Not sure how the web server protects itself - "site should be protected from any other requests until it grants access". I understand that the 3rd party can restrict access. The requirement is for the web server to have a mechanism (i.e. a configuration setting or other type of control) that allows or disallows access control for cross-site requests, and for the web server to have the ability to restrict 3rd-party access to settings that are controlled by the web server.

The issue is that the web server owner loses Information Assurance (IA) control; this is an issue for my customers. IA control cannot be handed over to a 3rd party. For my customers, the web server owners need to manage the IA settings.

Regards

Bill Doyle
wdoyle@mitre.org

-----Original Message-----
From: Jonas Sicking [mailto:jonas@sicking.cc]
Sent: Tuesday, December 18, 2007 1:34 AM
To: Doyle, Bill; WAF WG (public)
Subject: Re: comments on access control for cross-site requests - WSC member

Doyle, Bill wrote:
> 1. The cross-site scripting protocol must include strong
> cryptographic mechanisms to ensure that the server can restrict use of
> the capabilities to authenticated and authorized clients.

The third party site can require that all communication between the third party server and the browser is done using https by simply denying all access requests done through any other means. The third party site can also require that all communication between the browser and the requesting site is done over https by only white-listing https servers. Does this satisfy the request?

Additionally, it is possible to extend this further in the future by adding additional attributes to require even stronger protection. This is done in a forwards-compatible manner by saying that a current implementation that sees any unrecognized attributes must deny access.

> 2. The protocol must provide the ability for a server to support
> fine grained access control. e.g. a server should be able to limit write
> access to a specific client noted in item 1.

Any type of access, including write access, can be limited according to the rules described above.

> 3. Protocol must be able to restrict inheritance of a client's
> access control rights by other clients.

I don't quite understand this question.

> 4. Resources must be protected until access is granted; the
> security consideration that resources are not revealed is not strong enough.

The only requests that can be made without explicit authorization are GET requests. These requests are already possible today. The site should be protected from any other requests until it grants access.

Best Regards,
Jonas Sicking
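To make the https point above concrete: using the same illustrative opt-in syntax sketched earlier, and assuming the allow rule can name a scheme as well as a host (which is what the white-listing answer implies, though the exact rule grammar depended on the draft in effect), a third-party resource could limit access to a single https requesting site with something like:

    <?access-control allow="https://requesting-site.example.com"?>

A request from any origin not matching the rule, or a rule carrying an attribute the implementation does not recognize, results in access being denied, which is what keeps stricter future extensions forwards-compatible.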
Received on Tuesday, 18 December 2007 17:11:23 UTC