W3C home > Mailing lists > Public > public-webappsec@w3.org > April 2017

Re: Verified Javascript: Proposal

From: Daniel Huigens <d.huigens@gmail.com>
Date: Tue, 25 Apr 2017 15:12:44 +0200
Message-ID: <CAL14OeHaM3NRW63=Evm+qrhRg13dfthJB8MH9C=A8PD0ZyRd5w@mail.gmail.com>
Cc: Jochen Eisinger <eisinger@google.com>, public-webappsec@w3.org
Hi Ben,

As far as I know, even if you pin your public key, you can still generate
new certificates for that public key. Also, I think most websites pin the
public key of their certificate authority, not their own public key.
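[The point above can be sketched directly: an HPKP pin is a hash over the SubjectPublicKeyInfo, not over the certificate, so any new certificate issued for the same key pair matches the existing pin. A rough sketch assuming openssl is available; file and subject names are illustrative:]

```shell
# Sketch, assuming openssl is available; names are illustrative.
# An HPKP pin is the SHA-256 of the SubjectPublicKeyInfo, so any
# certificate reusing the same key pair yields the same pin.
openssl genrsa -out key.pem 2048 2>/dev/null

# Two different self-signed certificates issued for the same key pair:
openssl req -x509 -new -key key.pem -subj "/CN=a.example" -days 1 -out cert1.pem
openssl req -x509 -new -key key.pem -subj "/CN=b.example" -days 1 -out cert2.pem

# Compute the pin-sha256 value for a certificate's public key:
spki_pin() {
  openssl x509 -in "$1" -pubkey -noout |
    openssl pkey -pubin -outform der 2>/dev/null |
    openssl dgst -sha256 -binary | base64
}

pin1=$(spki_pin cert1.pem)
pin2=$(spki_pin cert2.pem)
# Both pins are identical even though the certificates differ.
echo "$pin1"
echo "$pin2"
```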

-- Daniel Huigens

Op 25 apr. 2017 12:06 schreef "Ben Gidley" <ben@gidley.co.uk>:

> Slight brain freeze in that note: I meant public key pinning combined
> with HSTS, not just HSTS. The pinned cert is typically set to have
> quite a long expiry window, due to the operational challenges of making
> sure pinning works properly (e.g. that you don't accidentally block all
> your users).
>
> On Tue, 25 Apr 2017 at 05:56 Ben Gidley <ben@gidley.co.uk> wrote:
>
>> This would be very tricky mixed with HSTS and the desired long expiry
>> of certificates: Google requires an 18-week expiry window
>> (https://hstspreload.org/) if you want your cert to be pre-loaded. This
>> would imply I could only change the protected resources every 18 weeks,
>> or hit HSTS mismatches.
>>
>> Given how slowly certificates change, it would be very hard to make
>> this workable; most sites need to update more often than that.
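[For reference, the 18-week figure above corresponds to the minimum HSTS max-age that hstspreload.org required at the time. A quick sketch of the arithmetic and the resulting header; the directive combination shown is illustrative:]

```python
# The 18-week window mentioned above, expressed as the HSTS max-age
# value (in seconds) that hstspreload.org required at the time.
weeks_18 = 18 * 7 * 24 * 60 * 60
print(weeks_18)  # 10886400

# Illustrative header a preloadable site would send:
header = f"Strict-Transport-Security: max-age={weeks_18}; includeSubDomains; preload"
print(header)
```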
>>
>> Ben Gidley
>>
>>
>> On Tue, 25 Apr 2017 at 05:06 Daniel Huigens <d.huigens@gmail.com> wrote:
>>
>>> Hi Jochen,
>>>
>>> Thanks for the feedback. I think the kind of web apps that this is
>>> meant for will also have a GitHub account with some or all of their
>>> frontend code on there. (An example for which I think this would be a
>>> good fit: [1].) Then, with the public log, you can start to think
>>> about verifying that the code on the server matches the code on
>>> GitHub. Then, you can start to monitor patches on GitHub, and inspect
>>> the code more deeply (including e.g. a paid security audit).
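[The verification step described above reduces to a hash comparison; a minimal sketch, with both inputs as illustrative stand-ins:]

```python
import hashlib

# Minimal sketch of "code on the server matches the code on GitHub":
# compare a hash of the served response body against a hash of the
# committed repository file. Both byte strings here are stand-ins.
repo_file = b"console.log('app');\n"    # e.g. index.js as committed
served_body = b"console.log('app');\n"  # body observed from the server

matches = (hashlib.sha256(repo_file).digest() ==
           hashlib.sha256(served_body).digest())
print(matches)
```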
>>>
>>> -- Daniel Huigens
>>>
>>> [1]: https://github.com/meganz/webclient
>>>
>>> 2017-04-25 10:33 GMT+02:00 Jochen Eisinger <eisinger@google.com>:
>>> > Hey,
>>> >
>>> > I wonder how the logged certificates would be used. I would expect
>>> > web apps to update several times a day, or even per hour. How would
>>> > a user tell the difference between a bug fix / feature release on
>>> > the one hand, and something malicious (from their PoV) on the other
>>> > hand?
>>> >
>>> > best
>>> > -jochen
>>> >
>>> > On Mon, Apr 24, 2017 at 12:27 PM Daniel Huigens <d.huigens@gmail.com>
>>> wrote:
>>> >>
>>> >> Hi webappsec,
>>> >>
>>> >> A long while ago, there was some talk on public-webappsec and
>>> >> public-web-security about verified JavaScript [2]. Basically, the
>>> >> idea was to have a Certificate Transparency-like mechanism for
>>> >> JavaScript code, to verify that everyone is running the same and
>>> >> intended code, and to give the public a mechanism to monitor the
>>> >> code that a web app is sending out.
>>> >>
>>> >> We (Airborn OS) had the same idea a while ago, and thought it
>>> >> would be a good idea to piggy-back on CertTrans. Mozilla has
>>> >> recently also done that for their Firefox builds, by generating a
>>> >> certificate for a domain name with a hash in it [3]. For the web,
>>> >> where there already is a certificate, it seems more straightforward
>>> >> to include a certificate extension with the needed hashes in the
>>> >> certificate. Of course, we would need some cooperation from a
>>> >> Certificate Authority for that (in some cases, that cooperation
>>> >> might be as simple, technically speaking, as adding an extension ID
>>> >> to a whitelist, but not always).
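[What such an extension might carry can be sketched as a map from request paths to expected-body hashes. The OID and the JSON encoding below are placeholders, not the draft's actual wire format:]

```python
import hashlib
import json

# Hypothetical sketch of the extension payload: a map from request
# paths to SHA-256 hashes of the expected response bodies. The OID and
# the JSON encoding are placeholders, not the draft's actual format.
bodies = {
    "/": b"<!doctype html>...",
    "/index.js": b"console.log('app');",
}
payload = {path: hashlib.sha256(body).hexdigest() for path, body in bodies.items()}

extension = {
    "oid": "1.3.6.1.4.1.99999.1",  # placeholder private-enterprise OID
    "critical": False,
    "value": json.dumps(payload, sort_keys=True).encode(),
}
print(extension["value"].decode())
```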
>>> >>
>>> >> So, I wrote a draft specification to include hashes of the
>>> >> expected response bodies for requests to specific paths in the
>>> >> certificate (e.g. /, /index.js, /index.css), and a Firefox XUL
>>> >> extension to support checking the hashes (and we also included some
>>> >> hardcoded hashes to get us started). However, as you probably know,
>>> >> XUL extensions are now being phased out, so I would like to finally
>>> >> get something like this into a spec, and then start convincing
>>> >> browsers, CAs, and web apps to support it. However, I'm not really
>>> >> sure what the process for creating a specification is, and I'm also
>>> >> not experienced at writing specs.
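[The check such a browser extension performs can be sketched as follows; the function name, the hex encoding, and the pinned contents are illustrative assumptions, not the draft's actual API:]

```python
import hashlib

# Sketch of the client-side check described above: hash the received
# response body and compare it to the hash pinned for that path.
# Names and the hex encoding are illustrative assumptions.
pinned_hashes = {
    "/index.js": hashlib.sha256(b"console.log('app');").hexdigest(),
}

def body_matches(path: str, body: bytes) -> bool:
    expected = pinned_hashes.get(path)
    return expected is not None and hashlib.sha256(body).hexdigest() == expected

print(body_matches("/index.js", b"console.log('app');"))  # True
print(body_matches("/index.js", b"alert('tampered');"))   # False
```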
>>> >>
>>> >> Anyway, please have a look at the first draft [1]. There's also
>>> >> some more information there about what/why/how. All feedback
>>> >> welcome. The working name is "HTTPS Content Signing", but it may
>>> >> make more sense to name it something analogous to Subresource
>>> >> Integrity... HTTPS Resource Integrity? Although that could also
>>> >> cause confusion.
>>> >>
>>> >> -- Daniel Huigens
>>> >>
>>> >>
>>> >> [1]: https://github.com/twiss/hcs
>>> >> [2]: https://lists.w3.org/Archives/Public/public-web-security/2014Sep/0006.html
>>> >> [3]: https://wiki.mozilla.org/Security/Binary_Transparency
>>> >>
>>> >
>>>
>> --
>> Ben Gidley
>> ben@gidley.co.uk
>>
> --
> Ben Gidley
> ben@gidley.co.uk
>
Received on Tuesday, 25 April 2017 13:13:20 UTC

This archive was generated by hypermail 2.3.1 : Monday, 23 October 2017 14:54:22 UTC