
Re: Documenting Timing Attacks in Rendering Engines

From: Adam Barth <w3c@adambarth.com>
Date: Mon, 12 Dec 2011 12:05:39 -0800
Message-ID: <CAJE5ia_942BEAPF_uCOQqLRPJ1yJTP0MeU1VLGi+2sH+5na9Tg@mail.gmail.com>
To: Ralph Thomas <ralpht@gmail.com>
Cc: Vincent Hardy <vhardy@adobe.com>, "public-fx@w3.org" <public-fx@w3.org>

If there were a recognizable subset of GLSL that was constant-time,
that would be a nice solution to this problem.  The GLSL experts I've
asked tell me that such a thing isn't possible, however.  If you have
a proposal for such a subset, we can run it by these folks.

Adam
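
[Editorial sketch, not part of the original message: to make the difficulty concrete, here is the kind of data-dependent branch such a constant-time subset would have to forbid, next to a "branchless" rewrite. The shader strings and the source-level checker below are entirely hypothetical illustrations, not a real validator.]

```javascript
// Hypothetical illustration: the kind of data-dependent control flow a
// "constant-time GLSL" validator would need to reject.  The check is a
// crude source-text heuristic; a real validator would inspect the AST.

const branchyShader = `
  uniform sampler2D content;
  varying vec2 uv;
  void main() {
    vec4 c = texture2D(content, uv);
    if (c.r > 0.5) {            // branch depends on page pixels
      gl_FragColor = expensiveEffect(c);
    } else {
      gl_FragColor = c;
    }
  }`;

const branchlessShader = `
  uniform sampler2D content;
  varying vec2 uv;
  void main() {
    vec4 c = texture2D(content, uv);
    // mix() removes the syntactic branch, yet both paths execute and
    // GPU divergence or early-out behavior can still leak timing.
    gl_FragColor = mix(c, expensiveEffect(c), step(0.5, c.r));
  }`;

// Naive check: does any conditional appear after a texture fetch?
function looksDataDependent(src) {
  const fetchIdx = src.indexOf('texture2D');
  if (fetchIdx === -1) return false;
  return /\bif\s*\(|\?/.test(src.slice(fetchIdx));
}

console.log(looksDataDependent(branchyShader));    // flags the branch
console.log(looksDataDependent(branchlessShader)); // passes -- yet not constant-time
```

A syntactic check like this only catches explicit branches; operations such as mix(), step(), and the texture fetch itself can still take data-dependent time on real hardware, which is one reason a reliably constant-time subset is considered infeasible.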


On Mon, Dec 12, 2011 at 12:00 PM, Ralph Thomas <ralpht@gmail.com> wrote:
> Do you think that GLSL could be changed to prevent the timing from
> changing depending on the content? Could the GLSL validator be changed
> to not allow conditionals dependent on texture samples from the page?
>
> That would eliminate various effects, but at least most shaders
> wouldn't do texture indirections anyway, since they're slow on embedded
> hardware.
>
> Ralph
>
> On Mon, Dec 12, 2011 at 11:26 AM, Adam Barth <w3c@adambarth.com> wrote:
>> Sorry if my previous response came off as abrupt.  Let me try replying
>> to your message again.
>>
>> On Mon, Dec 12, 2011 at 8:38 AM, Vincent Hardy <vhardy@adobe.com> wrote:
>>> I think the problem already exists regardless of shaders.
>>
>> These issues would exist regardless of shaders if we weren't careful
>> about maintaining this security property throughout the platform, but
>> there's been a bunch of careful work done by many folks over a number
>> of years to plug these sorts of leaks.  For example, David Baron
>> considered and experimented with timing attacks when he proposed and
>> implemented his history sniffing defense, and that's what motivated
>> his choices of which CSS properties could differ between visited and
>> non-visited links.  Folks have also analyzed other side channels (e.g.,
>> http://websec.sv.cmu.edu/visited/visited.pdf), so we have a pretty
>> good handle on what is and what is not possible.
>>
>>> We already have filter effects on SVG content (which may include HTML
>>> through foreignObjects). They can impact the rendering time of content. And
>>> regardless of shaders / filters, it is possible to modify content and
>>> compute the rendering time to detect patterns.
>>
>> If that's possible, I'd encourage you to report the vulnerability to
>> the various vendors (e.g.,
>> http://www.chromium.org/Home/chromium-security/reporting-security-bugs)
>> in a responsible manner.  These reports will likely qualify for
>> rewards via Chromium and Mozilla's vulnerability rewards programs.
>>
>> Note you'll need to construct a filter effect that has a different
>> running time based only on the color of the hyperlink.  Other CSS
>> properties are forbidden from varying between visited and non-visited
>> hyperlinks.  That's easy to do with a CSS Shader, but it's unclear to
>> me how to do that without a shader.
>>
>>> I'll take a silly and extreme case. If an attacker finds that the rendering
>>> of a visited link took a lot more time than rendering a non-visited link,
>>> he/she could find out, through timing, whether a URL had been visited by the
>>> victim. No shaders involved. Maybe the attacker just added a very large
>>> number of drop shadows to the style of visited links, or something else that
>>> also impacted rendering.
>>
>> Yes, that would be a vulnerability.  As far as I know, no one knows
>> how to do that without shaders.  We do know how to do this if we could
>> apply different effects (such as drop shadows) to visited and
>> non-visited hyperlinks, but, as James says, that is forbidden.
>>
>>> So since we have leakage of information by timing the rendering, I think we
>>> need to understand how bad the problem/threat is. As discussed before,
>>> shaders and filters may accentuate the issue (because they can slow down
>>> rendering of specific colors, etc.), but the core issue, I think, is that
>>> timing the rendering (in general) leaks information. Maybe we should find a
>>> way to obfuscate timing in requestAnimationFrame so that information leakage
>>> is reduced/removed. I think Dean has ideas there.
>>
>> That approach is unlikely to be successful.  Once the information has
>> entered the timing channel, it's basically impossible to remove.  You
>> can try to hide the information with blocking or noise, but experience
>> in many other domains tells us that these mitigations are surprisingly
>> easy for attackers to circumvent.
>>
>> The only approach I'm aware of that has a chance of success is to
>> prevent the sensitive information from entering the timing channel in
>> the first place.
>>
>>> I like to think that there are solutions to problems, even when they are not
>>> obvious at first :-). We may apply restrictions where needed. For example, I
>>> think CORS addresses some of the issues. Let's try to find solutions to the
>>> other issues.
>>>
>>> Maybe I am just an optimist :-)
>>
>> If you have a specific proposal, I'm happy to review it.  So far I
>> haven't seen a proposal that wasn't trivially broken.  Specifically
>> for the approach you mention above, relying upon CORS does not address
>> the history sniffing risk.
>>
>> Adam
>>
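
[Editorial sketch, not part of the original thread: Adam's point that noise added to timers is easy to circumvent can be illustrated with a small simulation. The timing numbers below are invented; a small constant difference survives large zero-mean jitter once enough samples are averaged.]

```javascript
// Illustrative simulation (hypothetical numbers): a 0.1 ms timing
// difference hidden under +/-5 ms of uniform jitter is still
// recoverable by averaging many measurements.

function sample(baseMs) {
  // one noisy timing measurement: true cost plus jitter in [-5, 5] ms
  return baseMs + (Math.random() * 10 - 5);
}

function mean(baseMs, n) {
  let total = 0;
  for (let i = 0; i < n; i++) total += sample(baseMs);
  return total / n;
}

const N = 200000;                  // the attacker simply takes more samples
const visited = mean(10.1, N);     // e.g. rendering over a visited link
const unvisited = mean(10.0, N);   // vs. an unvisited one

// The 0.1 ms signal survives the noise once averaged.
console.log((visited - unvisited).toFixed(2));
```

Doubling the noise amplitude only forces the attacker to take roughly four times as many samples; it does not close the channel, which is why the argument above is that sensitive information must be kept out of the timing channel in the first place.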
Received on Tuesday, 13 December 2011 04:40:08 GMT
