
Re: Rough minutes

From: Ilari Liusvaara <ilari.liusvaara@elisanet.fi>
Date: Mon, 11 Nov 2013 04:47:12 +0200
To: Bjoern Hoehrmann <derhoermi@gmx.net>
Cc: 'HTTP Working Group' <ietf-http-wg@w3.org>
Message-ID: <20131111024712.GA8085@LK-Perkele-VII>
On Mon, Nov 11, 2013 at 03:05:24AM +0100, Bjoern Hoehrmann wrote:
> * Ilari Liusvaara wrote:
> >=> Major issue.
> >
> >5) URL schemes
> >
> >- Site might have http:// links to itself in the database
> >(major issue for some types of sites).
> >- Main blocker on at least one site I know.
> >
> >=> Might be significant issue, depending on type of site.
> 
> Could you elaborate on what the problem is? How, for example, does HSTS
> not help to mitigate the problem? Julian mentioned having to run on :80;
> is that a notable problem?

Basically, the problem arises if the site code constructs pages by pulling
in text fragments from the database, and those fragments contain direct
http:// links/references to the site itself.

Sometimes those links might point to other subdomains, so relative paths
will not work (and of course sometimes users do things incorrectly).

If those fragments are internal markup, one could rewrite the links when
converting the text to HTML, but if those fragments are already HTML,
that's considerably harder.
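As a rough sketch of what that rewrite step could look like (the host names
and function name here are hypothetical, not from any actual site): during
markup-to-HTML conversion, upgrade http:// references that point at the
site's own hosts and leave third-party links alone.

```javascript
// Hypothetical list of the site's own hosts -- adjust for the real deployment.
const OWN_HOSTS = ['example.com', 'www.example.com', 'static.example.com'];

// Upgrade http:// references to our own hosts to https://, leaving
// links to third-party sites untouched.
function upgradeOwnLinks(text) {
  return text.replace(/http:\/\/([^\/\s"'<>]+)/g, (match, host) =>
    OWN_HOSTS.includes(host.toLowerCase()) ? 'https://' + host : match);
}
```

This only works cleanly when the fragments are internal markup that passes
through a conversion step anyway; for stored HTML you'd need a real parser
rather than a regex.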

And rewriting the stuff in the DB might be a big effort.

HSTS doesn't fully solve this (aside from any possible deployment issues):
if A refers to B over http:// and B's host has HSTS active, the reference
is still treated as insecure, even though the request to B will actually go
over HTTPS because of the HSTS. At least that's the behavior in Firefox 24.


On the site I mentioned, I think the plan was to use client-side JS to
rewrite the links/references to use https:// (none should be active
content).
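A minimal sketch of that client-side approach (host names and function
names are my own invention, not the site's actual code): a pure helper
that upgrades a single URL if it points at the site's own hosts, plus a
DOM pass that applies it to href/src attributes after the page is built.

```javascript
// Hypothetical list of this site's hosts; only these get upgraded.
const OWN_HOSTS = ['example.com', 'www.example.com'];

// Upgrade a single URL string from http:// to https:// if it points at us.
function upgradeUrl(url) {
  const m = /^http:\/\/([^\/]+)([\s\S]*)$/.exec(url);
  if (m && OWN_HOSTS.includes(m[1].toLowerCase())) {
    return 'https://' + m[1] + m[2];
  }
  return url;
}

// Walk the document and rewrite href/src attributes in place; run this
// after the database fragments have been inserted into the page.
function upgradeDocumentLinks(root) {
  for (const el of root.querySelectorAll('[href], [src]')) {
    for (const attr of ['href', 'src']) {
      if (el.hasAttribute(attr)) {
        el.setAttribute(attr, upgradeUrl(el.getAttribute(attr)));
      }
    }
  }
}
```

Since none of the rewritten references should be active content, this
avoids the worst mixed-content breakage, but links followed before the
script runs would still start out as http://.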


-Ilari
Received on Monday, 11 November 2013 02:47:39 UTC
