Re: HTTPS at W3C.

On Mon, Nov 17, 2014 at 7:13 PM, Ted Guild <ted@w3.org> wrote:

> * Presently we force requests that require authentication or are deemed
> sensitive by ACLs to go through HTTPS URIs.  I've seen some in past
> www-tag threads argue for using SSL where warranted instead of a blanket
> policy on all traffic.  This is an example of such a practice.
>

That's a really great start, thanks!

> * We have millions (literally) of static resources with absolute URIs
> that would need to be modified.  Besides the sheer amount of work, some
> but not all of it doable automatically, there are also policies against
> modifying specifications, for instance.
>

Have you looked into using CSP as a monitoring system? Setting a
report-only policy along the lines of `default-src https:` could give you
a reasonable estimate of where you might want to spend your cleanup effort.
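
For concreteness, I'm thinking of something along these lines (the
reporting endpoint is just a placeholder, of course):

    Content-Security-Policy-Report-Only: default-src https:; report-uri /csp-reports

Report-only mode means nothing is blocked and users see no change; the
violation reports just pile up at that endpoint, giving you a passive
survey of exactly which pages load insecure subresources.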


> * HSTS seemed promising for transitioning those like us with large bodies
> of content that would be extremely onerous to modify.  It is not
> deployed consistently, nor does it behave consistently, in all major
> browsers (as we last checked a few months ago).
>

HSTS is enabled and running in Chrome/Opera and Firefox. IE is looking into
it. Safari, as far as I know, isn't.
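
Deploying it is a single response header; something like this (the
max-age value here is arbitrary, roughly 18 weeks, and includeSubDomains
is worth testing carefully given how many subdomains you have):

    Strict-Transport-Security: max-age=10886400; includeSubDomains

Browsers that don't understand the header simply ignore it, which limits
the downside for legacy clients.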

It seems here that you're letting perfect be the enemy of good. For
example, I'd be a little bit happier if I could choose to point people to
https://www.w3.org/TR/mixed-content/ without being redirected to HTTP.
That's more or less what `tools.ietf.org` seems to be doing, and it's
certainly better than nothing.

> * Mixed content warning algorithms are based on the page as it is
> retrieved and not as it is served.


I'm sure you're aware of this, but that is intentional behavior.

> So even with HSTS and us redirecting
> all HTTP to the corresponding HTTPS, our users will get inundated with
> mixed content warnings.


Until you fix the underlying resources. :)
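
(The fix is generally mechanical. For a hypothetical page, the difference
is just the scheme on the subresource reference:

    <!-- Flagged as mixed content when the page is served over TLS: -->
    <script src="http://www.w3.org/example.js"></script>

    <!-- Fine: -->
    <script src="https://www.w3.org/example.js"></script>

Scheme-relative URLs like `//www.w3.org/example.js` are another option
for content that has to be served over both schemes. The script path is
made up, obviously.)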


> These are typically interpreted by users as
> glaring issues, will deter them from using the W3C site, and will plague
> us with issue reports.


I really, really wish this matched my experience with user perception of
security indicators in browsers. :)


> Several major vendors have bugs reported on precisely
> this situation.  Even after these bugs are fixed, those using legacy
> browsers will still be subject to it.
>

If you've filed any bugs against Chrome, please point me to them. I will
fix them.

> * W3C Specifications have advertised HTTP URIs for things intended to be
> machine readable, e.g. DTDs, other schemata, namespaces, and also things
> like RSS feeds.  We have a rather significant amount of machine traffic
> for these resources, and a question for the TAG is whether W3C should
> change the protocol out from under them.  This will undoubtedly break
> quite a few services, deployed libraries, and software from quite a few
> organizations.
>
> http://www.w3.org/blog/systeam/2008/02/08/w3c_s_excessive_dtd_traffic/


I imagine this is less of a problem today than it was in 2008, but the
point is certainly a reasonable one in general. Naively, I'd suggest
optimizing for readers, and leaving a reasonable impression of
security/authenticity with the site's non-mechanical users.

Short term, I could imagine, for instance, asking browsers to add w3.org to
their internal HSTS preload lists, but not actually sending the header.
That would ensure that users of modern browsers were given a pristine HTTPS
experience (once you clean up the mixed content issues, of course), while
not affecting the legacy traffic that might or might not be up to the task
of parsing encrypted content.
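
In Chromium, that would amount to a one-line entry in
`net/http/transport_security_state_static.json` (sketching from memory,
so treat the exact fields as approximate):

    { "name": "w3.org", "include_subdomains": true, "mode": "force-https" }

The `include_subdomains` bit would obviously need the same caution as the
header variant.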

Long term, I'd hope we could find other options. :)

--
Mike West <mkwst@google.com>
Google+: https://mkw.st/+, Twitter: @mikewest, Cell: +49 162 10 255 91

Google Germany GmbH, Dienerstrasse 12, 80331 München, Germany
Registergericht und -nummer: Hamburg, HRB 86891
Sitz der Gesellschaft: Hamburg
Geschäftsführer: Graham Law, Christine Elizabeth Flores
(Sorry; I'm legally required to add this exciting detail to emails. Bleh.)
