Re: The javascript runtime, XSS, and javascript crypto...

On 12/13/2011 03:15 AM, Tom Ritter wrote:
> So after my use cases I hit up against the problem of verifying a
> javascript runtime.  The end goal is making javascript secure for
> crypto operations, so let's look at that problem and then whittle it
> down to the runtime problem.
>
> Right now, javascript crypto is no good for a number of reasons:
>   - Third-Party Problem: Any third party supplying code can poison the
> entire runtime
>   - XSS Problem: Any XSS flaw can poison the entire runtime
>   - MITM Problem: Without SSL, the code can be modified easily by a middler
>   - SSL Problem: SSL authentication (the CA trust model) is broken
>   - Modified Problem: Even if you validate an entire runtime today, you
> have to do it again tomorrow to make sure it didn't change
>   - RNG Problem: Javascript doesn't have a secure RNG
>   - Implementation Problem: You shouldn't write your own crypto.
>   - Keystore Problem: Where do you keep your keys?
>   - Side Channel Problem: Timing attacks
>   - Coercible Problem: Basically, the modified problem, but adapted to
> the site operator being forced to trojan you.
> http://www.matasano.com/articles/javascript-cryptography/ is a good
> article on the topic

I think we can assume RNG+Implementation+Keystore are within scope.

XSS is in scope of the work being done by the WebAppSec WG.

I cannot help with the CA and TLS issues at this moment, but I suggest 
key-pinning.
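
For illustration, a pinning declaration could be as simple as an HTTP 
header carrying a hash of the public key the site commits to. The header 
name and syntax below are a hypothetical sketch, not an existing spec:

  Public-Key-Pins: pin-sha256="<base64 sha-256 of the site's public key>";
                   max-age=2592000

The browser would remember the pin and refuse any later connection whose 
certificate chain doesn't include a matching key.
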
> Let's start out by knocking off MITM and SSL: let's use SSL with Key
> Pinning so you can trust the SSL connection.  Next let's kill the RNG
> and Implementation problems: DOMCrypt is the proof-of-concept of a
> subset of what we want to build: native methods exposed via javascript
> that have a good RNG and good crypto implementations.  And, let's
> knock off the Keystore for this thread so we can focus on the runtime.
>   And finally, to keep things simple for now I'm going to ignore Side
> Channels. Let's focus on what's left:
>
>   - Third-Party Problem: Any third party supplying code can poison the
> entire runtime
>   - XSS Problem: Any XSS flaw can poison the entire runtime
>   - Modified Problem: Even if you validate an entire runtime today, you
> have to do it again tomorrow to make sure it didn't change
>   - Coercible Problem: Basically, the modified problem, but adapted to
> the site operator being forced to trojan you.
>
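To make the DOMCrypt reference above concrete: the idea is native 
primitives exposed to page script, along the lines of the sketch below. 
window.crypto.getRandomValues() already exists in at least one browser; 
the pk.* methods are illustrative stand-ins, not the actual DOMCrypt API.

  // native CSPRNG - fills the typed array with random bytes
  var nonce = window.crypto.getRandomValues(new Uint8Array(16));

  // hypothetical native public-key encrypt; key material stays on
  // the native side and is never exposed to script
  window.crypto.pk.encrypt(plainText, recipientKeyID, function (ct) {
    // ct is the ciphertext, ready to send
  });
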
> How would you change HTML, Javascript, the DOM, and browsers to solve
> these problems, so a simply-stated-but-complicated example like "PGP
> in gmail" is actually secure?  You can make any changes you want but
> the more complicated it is the less implementable it is, and that's
> points subtracted.  And anything that requires a user to make a trust
> decision is also points off, because users always make the wrong trust
> decision and don't change defaults.
>
> Here's my idea.  There's a lot of hand-waving in it, and several
> things that probably just wouldn't work well, but I think the ways
> it's broken can help illuminate the problem of the javascript runtime
> of a page being too malleable for crypto operations as it exists now.
> So something in that runtime has to change.
>
> First we knock out the 3rd Party and Modified problems by
> code-reviewing and signing javascript libraries.  Imagine if
> BouncyCastle wrote an OpenPGP javascript library that called the
> native crypto methods and just handled that annoying OpenPGP stuff for
> you.  You trust BouncyCastle to write good, correct code - but you
> don't trust a CDN to serve it unmodified.
>
> <script type="text/javascript"
> src="https://cdn.google.com/libraries/bouncycastle-openpgp-1.3.4.js"
> signature="SADfskdjahflkjh32q239oyhfd89awydflihq3e3o92==" />
>
> That signature attribute is the hash of the library, signed with the
> private key of the SSL connection.  If that file changes, the
> signature doesn't validate, and the file contents are never executed.
> The file can't be modified without invalidating the signature or finding a
> collision in SHA256 (or whatever).  You sign the javascript files
> once, and hardcode signatures - you don't need or want to do online
> signing.
>
> But you've still got the huge problem that the entire javascript runtime
> (which includes this signed third-party library) can be poisoned by a
> stray XSS flaw.  My idea is two execution environments: "Crypto" and
> "Everything Else".  [1]
>

Again, I'd point to the work being done by the WebAppSec WG - see [1]. 
However, the signature attribute is a good idea, and I know quite a few 
people who are interested in it.
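
For concreteness, the publisher-side signing could be a one-time offline 
step, e.g. with openssl (file names illustrative):

  openssl dgst -sha256 -sign server-key.pem \
      bouncycastle-openpgp-1.3.4.js | openssl base64 -A

The resulting base64 blob is what gets hardcoded into the signature 
attribute, and the browser checks it against the public key from the 
site's TLS certificate before executing the file.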

> <script type="text/javascript"
> src="https://cdn.google.com/libraries/bouncycastle-openpgp-1.3.4.js"
> signature="SADfskdjahflkjh32q239oyhfd89awydflihq3e3o92=="
> runtime="crypto" />
>
> <textarea id="email" runtime="crypto">Type your message here</textarea>
>
> Javascript code in the crypto runtime can see the DOM, hook it and
> interact with it just like javascript today.  And so can the
> everything-else runtime.  But variables and functions from one runtime
> cannot interact with the other.  What's more - if an HTML element
> specifies a runtime, its 'private data' (that's a little hand-wavy,
> but for now just say its contents) is not accessible to any other
> runtime, nor can it be hooked by any other runtime.  So the email textarea
> above could only be read (.value) or hooked (onkeydown) by the
> crypto runtime.
>
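If I read the isolation rules right, they'd shake out like this in 
practice (my reading, illustrative only):

  // in the crypto runtime:
  document.getElementById("email").value;          // readable
  document.getElementById("email").onkeydown = f;  // hookable

  // in the everything-else runtime:
  document.getElementById("email").value;          // blocked: the element
                                                   // is tagged runtime="crypto"
  window.cryptoRuntimeFunction;                    // undefined: not visible
                                                   // across the boundary
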
> The bare minimum of javascript code is in the crypto runtime - no UI
> stuff, no user input.  An XSS flaw in it would bring the security
> model crashing down, yes.  But if the crypto-runtime javascript code
> is small (say 5% of all js code) and carefully written - your risk is
> minimized. It's no different from setuid binaries or privilege-dropping daemons
> - you're extra careful about code you write that runs as root.  A
> normal XSS flaw wouldn't be able to rewrite crypto functions, see
> their state, read the contents of sensitive HTML elements, or hook
> them.
>
> Pros:
>   - No changes to javascript language
>   - Minimum changes to HTML spec
>   - [1] You can have N execution environments; I just simplified it
> Cons:
>   - Big performance problems for javascript/DOM access
>   - Side Channels
>   - Lots of corner cases about what access can and can't be allowed
>
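To gauge how small that 5% could be: a minimal crypto-runtime script 
might do nothing but read the protected textarea, call the native 
crypto, and hand ciphertext to the untrusted side. The pk.encrypt and 
postToOtherRuntime names below are hypothetical stand-ins:

  // runs with runtime="crypto"; no UI logic lives here
  function encryptAndPost() {
    var plain = document.getElementById("email").value;
    window.crypto.pk.encrypt(plain, "recipient-key-id", function (ct) {
      // only ciphertext ever crosses the runtime boundary
      postToOtherRuntime(ct);
    });
  }
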
> Now the Coercible Problem.  You want to be a good netizen and only
> serve trustworthy code, but you have the misfortune to operate in a
> country where the government coerces you with guns or national
> security letters.  This one's tricky, but the basic idea is something
> in the browser that watches signed javascript libraries and alerts you
> to changes.  That's UI stuff that's out of scope; I just included it for
> completeness.
>

That could obviously be programmed though!
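
A trust-on-first-use pin store would cover the mechanical part. Roughly, 
in browser-internal pseudo-code (loadPinStore and warnUser are 
hypothetical stand-ins):

  var pins = loadPinStore();  // maps script URL -> first-seen signature
  function checkSignedScript(url, signature) {
    if (!(url in pins)) {     // first sighting: pin it
      pins[url] = signature;
      return true;
    }
    if (pins[url] !== signature) {
      warnUser(url);          // signed library changed - alert the user
      return false;
    }
    return true;
  }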

> I think the javascript runtime environment is a very tough problem to
> solve for javascript crypto to be feasible.  There's probably going to
> need to be some huge specification document about how just the runtime
> will act.  This was a 'first-thought' idea - the places where it fails
> should help illustrate what the requirements for that spec doc would
> be.  By all means, point out its specific flaws in addition to
> discussing the 'runtime problem', but please also include the sentence
> "A proper implementation would X".
>
Again, look at the CSP spec [1] and tell us if it fulfills your use case.

[1] https://dvcs.w3.org/hg/content-security-policy/raw-file/tip/csp-specification.dev.html
> -tom
>
