Re: Upgrade mixed content URLs through HTTP header

Ignore previous message. I was confused as to why the upgrading couldn't be
done by Javascript on the client, but now I realize that that wouldn't help
because the page would still show up as mixed content regardless of any
rewriting that JS code could do. Sorry for the noise!

On Tue, Feb 3, 2015 at 8:20 AM, Emily Stark <estark@google.com> wrote:

> On Mon, Feb 2, 2015 at 9:13 AM, Mike West <mkwst@google.com> wrote:
>
>> On Feb 2, 2015 4:58 PM, "Jim Manico" <jim.manico@owasp.org> wrote:
>>
>>> > The only way to support clients that don't support the thing we
>>> haven't implemented yet would be to alter the links at the source.
>>>
>>> You can always have JavaScript do this for you... Take Clickjacking
>>> defense: Just like X-Frame-Options issues with legacy clients, there
>>> are pure Js framebusting solutions that are rameasonable.
>>>
>>
>> If adjusting the source was an option, we wouldn't need this header.
>>
>> Sites with large amounts of legacy content (W3C, NYT, etc) have a hard
>> time ensuring that all the pages on their sites are updated with new URLs.
>> I think that's the problem Anne is aiming at mitigating.
>>
>>
> Hi Mike! Sorry to jump in in the middle here, but this argument isn't
> clear to me. For the sake of my understanding, can you tell me if the
> following is an accurate summary of why it should be a header rather than
> some Javascript code?
>
> 1. It would be difficult to write Javascript that actually does this
> rewriting correctly and reliably. For example, it's not clear to me that a
> Javascript library would be able to rewrite URLs of images that are
> inserted directly into the DOM dynamically.
>
> 2. Even if such a magical Javascript library existed, it would be
> difficult to deploy on huge sites with thousands of pages: the goal is to
> avoid having to make any changes to the source of such sites. [I'm
> skeptical of this point -- I see that actually transforming tons of URLs on
> all the thousands of pages could be cumbersome, but adding a Javascript
> snippet to the head of every page would probably just be a matter of
> modifying a small number of templates, right?]
>
> 3. Even if 1 and 2 were easy and possible, the goal is to load resources
> over HTTPS *without* actually rewriting URLs, because a script on the page
> might assume that a URL is http:// and would break if the URL was
> actually rewritten to https://. [I'm either skeptical of or ignorant on
> this point -- anyone have examples of why such breakage would come up in
> real life?]
>
> The major advantage that I see of rewriting in Javascript is flexibility:
> for example, nytimes.com could rewrite all images on nytimes-owned
> domains to https://, while only reporting for images on other domains. On
> the other hand, having the header doesn't stop nytimes from implementing
> this policy; it could do whatever rewriting it wants in JS, and use the
> header just to report insecure requests.
>
> Emily
>
>
>
>

Received on Wednesday, 4 February 2015 14:58:05 UTC