- From: Jonas Sicking <jonas@sicking.cc>
- Date: Tue, 26 Feb 2008 04:44:32 -0800
- To: Anne van Kesteren <annevk@opera.com>
- Cc: Brad Porter <bwporter@yahoo.com>, Daniel Veditz <dveditz@mozilla.com>, "WAF WG (public)" <public-appformats@w3.org>, Window Snyder <window@mozilla.com>, Brandon Sterne <bsterne@mozilla.com>, Jesse Ruderman <jruderman@gmail.com>
Anne van Kesteren wrote:
> I'd like to see an update on this from the Mozilla folks. I think if
> cookies are not part of the request we should simply nuke the whole idea.
So we had another (and hopefully final :) ) security meeting at mozilla
today. I'll post a separate mail about the other issues that came up,
as the only really big thing is the cookie issue.
So the prevailing opinion was that sending cookies without getting the
user's consent is simply too easy to misunderstand. A server that sends
private data based on cookie information and includes a rule like
allow <linkedin.com>
has essentially just sent all the private data served on that URI over
to linkedin, without getting the user's consent. And of course this
becomes many times worse if the rule is allow <*>. At that point
basically any private data served on that URI is exposed for anyone to
read.
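To make the failure mode concrete, here's a rough sketch (TypeScript on
node; the handler, URL and data are all made up, and the Access-Control
header just follows the allow <...> rule syntax quoted above) of a
server that personalizes a response off the session cookie and then
adds a blanket rule, intending to enable mashups:

    import * as http from "http";

    // Hypothetical: look up the logged-in user's private data by cookie.
    function privateDataFor(cookie: string | undefined): string {
      return cookie ? `{"addressBook": "data for session ${cookie}"}`
                    : `{"addressBook": null}`;
    }

    http.createServer((req, res) => {
      // The response is personalized from the user's cookie...
      const body = privateDataFor(req.headers["cookie"]);

      // ...and then the admin adds the blanket rule, i.e. allow <*>.
      // If cross-site requests carry cookies, any page on the web can
      // now read this user's private data with a simple GET.
      res.setHeader("Access-Control", "allow <*>");
      res.setHeader("Content-Type", "application/json");
      res.end(body);
    }).listen(8080);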
While it can definitely be argued that the server should ask the user's
consent first, just as it does before selling personal data to other 3rd
parties, this seems like a much easier mistake to make. Sending all your
users' personal information to a 3rd party like linkedin requires an
active action. Just adding a header to your responses in order to allow
mashups requires much less thinking.
There are three parties involved in this transaction. The user, the
requesting site and the target site. The spec clearly enforces that the
latter two parties are ok with this transaction. But asking the user is
left as a responsibility to the target site.
Unfortunately we are not convinced that all sites will get this right.
Especially given the ease with which this spec can be deployed.
So we have decided that we do not want to include cookies in the request.
So at this point there are a few ways forward:
1. We can leave the spec as is and say that mozilla is intentionally
only implementing a subset of the spec at this point.
I'm not at all excited about this idea. It very much increases the risk
that server administrators will misconfigure their servers such that
private user data is exposed if another UA implements access-control
and does send cookies. This includes both other browsers and later
versions of firefox.
If it comes to this we will likely simply drop access-control support
from firefox 3 in order not to hinder other vendors' deployment of the
full spec.
2. We can change the spec to say that cookies should never be sent.
This would support the very common use case of the data hosted on the
target site not being personal at all. Such as the ability to fetch the
latest ads on craigslist.org, or fetch the directions to a destination
from google maps.
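For that public-data case a cookieless cross-site request is all you
need. A minimal sketch of the requesting page's side (TypeScript; the
URL and resource are hypothetical):

    // Hypothetical page on example.com fetching public, non-personal
    // data cross-site. With cookies never sent, the target site can
    // only ever hand back the generic version of the resource.
    const xhr = new XMLHttpRequest();
    xhr.open("GET", "http://craigslist.org/latest-ads.json"); // made up
    xhr.onload = () => {
      console.log(xhr.responseText); // public data, no cookies involved
    };
    xhr.send();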
But I'm not excited about this idea, as sending cookies and auth headers
does have several security advantages when fetching private data, such
as never having to expose any credentials to the requesting site, and
having built-in protection against distributed dictionary attacks.
Neither of those is possible if the credentials have to be included in
the request body.
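To spell out what option 2 would force for private data: the requesting
site has to collect and forward the user's target-site credentials
itself. A hedged sketch (all names and URLs are made up):

    // Hypothetical: with cookies ruled out, the mashup page ships the
    // user's credentials in the request body. The requesting site now
    // sees the raw password, and the target site has to accept guesses
    // from arbitrary pages, enabling distributed dictionary attacks.
    const username = "alice";    // assumed collected by the page's own UI
    const password = "hunter2";  // ditto
    const xhr = new XMLHttpRequest();
    xhr.open("POST", "http://target.example/private-data"); // made up
    xhr.setRequestHeader("Content-Type",
                         "application/x-www-form-urlencoded");
    xhr.send("user=" + encodeURIComponent(username) +
             "&pass=" + encodeURIComponent(password));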
3. We can change the spec to allow both for requests that include
cookies and for requests that don't. We'd further say that before
the UA makes a request that does include cookies, it should first
get the user's permission to do so.
This would support the very common use case of the data hosted on the
target site not being personal at all. Such as the ability to fetch the
latest ads on craigslist.org, or fetch the directions to a destination
from google maps.
I think this could be a very interesting option, if done right. The "how
to ask the user for permission" part is tricky, but I think doable. And
it's something that the spec wouldn't have to work out in detail, but
can be left up to the UA.
The issue of how to determine if the request should be done with or
without cookies is something we would need to specify though. One
solution would be to say that unless the UA has any prior knowledge
(from, for example, a previous session), it should first make a request
that does not include cookies. If that request is denied the UA should
ask the user and then, if granted permission, do a request that includes
cookies.
This is very similar to how HTTP authentication is usually done.
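A sketch of that retry flow as UA-side logic (TypeScript; the
permission prompt and the decision store are assumptions about UA
internals, not anything the spec would mandate, and fetch's credentials
switch is just shorthand for "with/without cookies"):

    // Stand-ins for UA-provided UI and persistence (assumptions):
    const decisions = new Map<string, boolean>();
    async function askUserPermission(url: string): Promise<boolean> {
      return confirm(`Allow sending your cookies to ${url}?`);
    }

    async function crossSiteFetch(url: string): Promise<Response> {
      // Prior grant from an earlier session: go straight to cookies.
      if (decisions.get(url)) {
        return fetch(url, { credentials: "include" });
      }
      // Otherwise start cookieless, like the first leg of HTTP auth.
      const anonymous = await fetch(url, { credentials: "omit" });
      if (anonymous.ok) return anonymous;

      // Denied without cookies: ask the user, retry with cookies.
      if (await askUserPermission(url)) {
        decisions.set(url, true); // reusable in later sessions
        return fetch(url, { credentials: "include" });
      }
      return anonymous;
    }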
Do note that I'm prepared to go with any of the above three options. If
we really don't want to change the spec, we are perfectly happy with
holding off on this feature until a future firefox release.
If we go with 3, note that for the next firefox release we would then
act as if the user always denies the request to send cookies.
Implementing UI to ask the user is not going to happen for this release.
Nothing would prevent it from happening next release though.
> One thing that might be worth considering is adopting the policy Safari
> and Internet Explorer have for cookies. That is not accepting
> third-party cookies, but always including cookies in the request. Then
> again, there are already tracking methods without cookies that are
> actively being used (Hixie pointed out paypal + doubleclick on IRC) so
> I'm not sure whether complicated cookie processing models are worth it
> at all.
That wouldn't actually change anything at all. The major concern is
sites sending replies that contain the user's private data to GET
requests that include cookies. This will happen even if the reply can't
set additional cookies.
The third-party-cookie blocker thing is mostly there to (poorly) stop
sites from tracking a user across multiple sites.
I realize a lot of people are probably going to be disappointed with
this decision, me included. But I hope we can find a way forward that
minimizes the disappointment, even if that includes removing support for
any of this from the next firefox release.
/ Jonas