
Re: Verified Javascript: Proposal

From: Jeffrey Yasskin <jyasskin@google.com>
Date: Tue, 25 Apr 2017 07:53:25 -0700
Message-ID: <CANh-dXnVt1ioPAu_rvW9LrvmQG9N31h+SfqWxkyrN0cC37dJTg@mail.gmail.com>
To: Daniel Huigens <d.huigens@gmail.com>
Cc: public-webappsec <public-webappsec@w3.org>
The goal of binary transparency for web applications makes sense, but
implementing it on top of the Certificate Transparency logs seems to
introduce too many problems to be workable.

Have you looked into a dedicated transparency log for applications, using
the system in https://github.com/google/trillian#readme? Then we'd need to
establish that only files logged to a particular set of log servers could
be loaded. A certificate extension might be the right way to do that, since
the certificate would only need to be re-issued in order to add log
servers, not to change the contents of the site.
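
For context, inclusion in such a log is checked with a Merkle audit path. Here is a minimal sketch of RFC 6962-style verification, simplified in one respect: the proof carries explicit sibling sides, whereas real logs (Trillian included) derive the sides from the leaf index and tree size:

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf_hash(entry: bytes) -> bytes:
    # RFC 6962 domain-separates leaf hashes with a 0x00 prefix
    return sha256(b"\x00" + entry)

def node_hash(left: bytes, right: bytes) -> bytes:
    # Interior nodes are hashed with a 0x01 prefix
    return sha256(b"\x01" + left + right)

def verify_inclusion(entry: bytes, proof, root: bytes) -> bool:
    """proof is a list of (sibling_hash, side) pairs, leaf to root;
    side says which side of the current hash the sibling sits on."""
    h = leaf_hash(entry)
    for sibling, side in proof:
        h = node_hash(sibling, h) if side == "left" else node_hash(h, sibling)
    return h == root
```

A browser holding the log's signed tree head could run a check like this before executing a fetched script; the certificate extension would then only need to pin which logs' tree heads are acceptable.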

Putting every JavaScript resource from a large application into the log
might also introduce too much overhead. We're working on a packaging
format at https://github.com/dimich-g/webpackage/, which could reduce
the number of files that need to be logged by a couple of orders of
magnitude.

Jeffrey


On Mon, Apr 24, 2017 at 3:25 AM, Daniel Huigens <d.huigens@gmail.com> wrote:

> Hi webappsec,
>
> A long while ago, there was some talk on public-webappsec and
> public-web-security about verified JavaScript [2]. Basically, the idea
> was to have a Certificate Transparency-like mechanism for JavaScript
> code, to verify that everyone is running the same, intended code, and
> to give the public a mechanism to monitor the code that a web app is
> sending out.
>
> We (Airborn OS) had the same idea a while ago, and thought it would be
> a good idea to piggyback on Certificate Transparency. Mozilla has
> recently done the same for their Firefox builds, by generating a
> certificate for a domain name with a hash in it [3]. For the web,
> where there already is a certificate, it seems more straightforward to
> include a certificate extension with the needed hashes. Of course, we
> would need some cooperation from a Certificate Authority for that (in
> some cases that cooperation might be as simple, technically speaking,
> as adding an extension ID to a whitelist, but not always).
>
> So, I wrote a draft specification for including, in the certificate,
> hashes of the expected response bodies for specific paths (e.g. /,
> /index.js, /index.css), and a Firefox XUL extension to check those
> hashes (we also included some hardcoded hashes to get us started).
> However, as you probably know, XUL extensions are now being phased
> out, so I would like to finally get something like this into a spec,
> and then start convincing browsers, CAs, and web apps to support it.
> That said, I'm not really sure what the process for creating a
> specification is, and I'm also not experienced at writing specs.
>
> Anyway, please have a look at the first draft [1]. There's also some
> more information there about what/why/how. All feedback welcome. The
> working name is "HTTPS Content Signing", but it may make more sense to
> name it something analogous to Subresource Integrity... HTTPS Resource
> Integrity? Although that could also cause confusion.
>
> -- Daniel Huigens
>
>
> [1]: https://github.com/twiss/hcs
> [2]: https://lists.w3.org/Archives/Public/public-web-security/
> 2014Sep/0006.html
> [3]: https://wiki.mozilla.org/Security/Binary_Transparency
>
>
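The client-side check Daniel describes — comparing a response body against a hash pinned for its path — is cheap to sketch. Everything below is hypothetical (the thread doesn't spell out the draft's extension encoding); the path-to-hash map stands in for whatever the certificate extension would actually carry:

```python
import hashlib

# Hypothetical: a path -> SHA-256 hex digest map, as it might look once
# decoded from the proposed certificate extension. The real encoding is
# up to the draft spec at https://github.com/twiss/hcs.
PINNED_HASHES = {
    "/index.js": hashlib.sha256(b"console.log('hello');\n").hexdigest(),
}

def check_response(path: str, body: bytes) -> bool:
    """Refuse a response unless its body matches the pinned digest.
    Paths absent from the map fall back to ordinary TLS trust here;
    a stricter policy could reject them instead."""
    expected = PINNED_HASHES.get(path)
    if expected is None:
        return True
    return hashlib.sha256(body).hexdigest() == expected
```

A browser (or, as in Airborn OS's prototype, an extension) would run this on each response before handing it to the parser, so a server that starts sending different code from what was logged is caught immediately.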
Received on Tuesday, 25 April 2017 14:54:20 UTC
