W3C home > Mailing lists > Public > public-appformats@w3.org > January 2008

Re: ISSUE-18: Is JSONRequest an acceptable alternative to the current model? [Access Control]

From: Jon Ferraiolo <jferrai@us.ibm.com>
Date: Tue, 8 Jan 2008 07:37:40 -0800
To: Jonas Sicking <jonas@sicking.cc>
Cc: Anne van Kesteren <annevk@opera.com>, "WAF WG (public)" <public-appformats@w3.org>, public-appformats-request@w3.org, "Close, Tyler J." <tyler.close@hp.com>
Message-ID: <OF4E625218.E7984890-ON882573CA.0053DF40-882573CA.0055D8B0@us.ibm.com>

public-appformats-request@w3.org wrote on 01/07/2008 10:01:17 PM:

snip

> >
> > Please let's not develop two new features that address the same
> > objective. The community needs to unify around one and only one
> > mechanism, be it Access Control or JSONRequest or minor enhancements
> > to the SCRIPT tag or something else. The key objective is
> > community-wide interoperability, which is best served by a single
> > standard, not multiple standards. Besides confusing and dividing the
> > community with multiple features trying to address the same thing,
> > you double the hacker attack surface.
> >
> > If you go along with the single-standard argument, then the question
> > is which one standard (Access Control vs JSONRequest vs other) should
> > be used? It might be time for the WAF WG to take time to query the
> > browser vendors and the rest of the community for feedback on Access
> > Control in general, but in particular how it relates to JSONRequest.
> > It is clear that Opera and Mozilla are engaged in the Access Control
> > spec, but what about the Safari team and (probably more importantly)
> > the IE team, particularly with regard to security? What do the big
> > web service providers think (Yahoo, Amazon, Google, Salesforce,
> > etc.)? Doug Crockford is with Yahoo and he invented JSONRequest, but
> > that was last year and I don't know whether he represents Yahoo's
> > entire position.
>
> I don't actually agree that a single standard is all we need. At least,
> none of the standards I've heard proposed would, I think, cover all the
> needed scenarios.

It would be better to add direct XML support to JSONRequest (wouldn't that
just be one additional parameter for payload filetype?) or define a JSON
equivalent to the PI in Access Control. Let's not burden and fragment the
industry with two overlapping standards, and let's not double the security
worries.
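For concreteness, the payload-type idea could look like the following sketch. The `JSONRequestX` object here is a stand-in stub, not Crockford's actual proposal (which defines `post(url, send, done)` with no type parameter); the extra `type` argument is purely the suggestion made above:

```javascript
// Stand-in stub, not the real JSONRequest API: the "type" parameter
// (payload media type) is the hypothetical extension suggested above.
var JSONRequestX = {
  post: function (url, type, payload, done) {
    var allowed = { "application/json": true, "application/xml": true };
    if (!allowed[type]) {
      throw new Error("unsupported payload type: " + type);
    }
    // A real implementation would serialize `payload` according to
    // `type` and issue the cross-site request; this stub just echoes
    // what it would send, so the shape of the call is visible.
    done({ url: url, type: type, payload: payload });
  }
};

// Hypothetical usage: shipping XML over the same channel as JSON.
JSONRequestX.post("http://example.org/svc", "application/xml",
                  "<pets/>", function (response) {
  // handle response
});
```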

>
> First of all JSONRequest is clearly not well suited for things like XSLT
> or XBL or any other scenario where you want to use an XML resource.
> JSON can be used to envelop XML data, but you couldn't put:
>
> <?xml-stylesheet type="text/xsl"
>     href="http://example.org/jsonrequest/xslt.js">
>
> at the top of an XML document and have it work.

I don't think cross-site XSLT and XBL are the key workflows to address, and
even if the W3C decides to allow cross-site XSLT and XBL, there are other
ways to skin that cat than piggybacking on the cross-site data mechanism.
Let's define coherent cross-site rules that apply to LINK, STYLE, SCRIPT,
IMG, etc., and then fit XSLT and XBL into those rules.


snip

> > But if it *were* widely adopted, here is one attack vector. There will
> > be tons of Web 2.0 community sites using XHR and Ajax, and if Access
> > Control is popular, then a significant percentage of these web sites
> > will opt in to Access Control. It is human nature to take the easy
> > route and develop web sites that choose to Allow every web site
> > (i.e., "*") for GET and/or POST, because many community-site web
> > developers have little imagination about the value of the information
> > at their site. For example, DalmationLovers.org might be
> > advertising-supported (with the advertisements isolated in IFRAMEs)
> > and might feel that the information in its database is harmless. Now,
> > let's suppose that one of the ads comes from evil.com and includes a
> > mouseover event handler that invokes XHR to the DalmationLovers.org
> > site, leveraging the session cookie for that site. Users often post
> > personal information (email, phone number, login, password) within
> > community web sites, and the web site might offer a POST service that
> > allows the email address to be changed and then another POST service
> > that will send your account information, including password, to the
> > designated email address. Using this information, the hacker can try
> > to log in to various banks using the same login and password, because
> > many people use the same logins and passwords for multiple sites.
>
> This does not seem like a new attack vector. If DalmationLovers.org has
> a service that lets you POST an email address to which all your personal
> data is sent, then they are hosed already. Anyone can write a web page
> with a <form action="dalmationlovers.org/personaldata.cgi"> that POSTs
> data to the DalmationLovers.org service. This POST will include the
> user's cookies and auth credentials. That is how the web already works.

But if JSONRequest were available, I wouldn't design my server to accept
FORM posts from other domains.
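The legacy mechanism Jonas describes can be sketched as plain string building; the endpoint and field names below are invented for illustration, and a real attack page would append this markup to the document and call `submit()`:

```javascript
// Sketch of the pre-existing cross-site POST: any page can emit a form
// targeting another domain, and the browser attaches the victim's
// cookies when it is submitted. Endpoint and field names are invented.
function csrfFormHtml(action, fields) {
  var inputs = Object.keys(fields).map(function (name) {
    return '<input type="hidden" name="' + name +
           '" value="' + fields[name] + '">';
  }).join("");
  return '<form method="POST" action="' + action + '">' +
         inputs + '</form>';
}
```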

>
> In fact, by not including cookies and auth credentials you are just
> giving yourself a false sense of security. Just like many sites use
> cookies or http auth to authenticate a user, other sites simply use the
> fact that you can reach the server as authentication.

Yes, there are dumb programmers, and there are also really dumb
programmers, as you point out. I'm not sure there is much hope for really
dumb programmers.

>
> > But I fully agree that there should be some way that *new* web sites
> > could tell browsers to go into a strict security mode that would
> > disable cross-domain access, including disabling existing
> > cross-domain support for SCRIPT, IMG and other things. That way a
> > mashup web site developer could be sure that cross-site requests were
> > made only via approval from the mashup container logic rather than
> > some nasty hidden logic within one of the mashup components.
>
> Something like that would be nice indeed. Though you could never be
> sure that cross-site requests are made only via approved mechanisms,
> as older browsers are always going to be out there.

Yes. Therefore, this new mechanism would have to take a transitional and
incremental approach, because the world will take a long time to upgrade
its browsers. So:

* Web site developers would have to be educated about how to design their
web pages so that they would work reasonably securely with existing legacy
browsers but would be more secure in browsers that supported this new
strict mechanism
* There would have to be some sort of API that allowed a web page to
determine whether strict security mode is available, in case the web page
logic wanted to do something different when strict mode is available and
when it is not
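The second bullet could be as simple as a capability test; the property name `strictCrossSiteMode` below is invented, since no such API exists:

```javascript
// Hypothetical capability test for the strict security mode discussed
// above; `strictCrossSiteMode` is an invented property name, not a
// real browser API. The page would pass in its global object.
function supportsStrictMode(globalObj) {
  return typeof globalObj.strictCrossSiteMode !== "undefined" &&
         globalObj.strictCrossSiteMode !== null;
}

// A page would then branch, e.g.:
//   if (supportsStrictMode(window)) { /* rely on strict mode */ }
//   else { /* fall back to a legacy-safe design */ }
```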


>
> / Jonas
>
Received on Tuesday, 8 January 2008 15:40:46 UTC
