Re: [whatwg/fetch] Acknowledge interaction with draft-ietf-dnsop-svcb-https scheme redirect (#1325)

@davidben commented on this pull request.



> @@ -3873,11 +3884,16 @@ steps:
    <li>Matching <var>request</var>'s <a for=request>current URL</a>'s <a for=url>host</a> per
    <a href=https://datatracker.ietf.org/doc/html/rfc6797#section-8.2>Known HSTS Host Domain Name Matching</a>
    results in either a superdomain match with an asserted <code>includeSubDomains</code> directive
-   or a congruent match (with or without an asserted <code>includeSubDomains</code> directive).
-   [[!HSTS]]
+   or a congruent match (with or without an asserted <code>includeSubDomains</code> directive) or
+   DNS resolution for the request finds a matching HTTPS RR per
+   <a href=https://datatracker.ietf.org/doc/html/draft-ietf-dnsop-svcb-https#section-8.5>HTTP Strict Transport Security</a>.
+   [[!HSTS]][[!SVCB]]

Why isn't this also desirable for HSTS? Chrome's used a synthesized redirect for HSTS since the beginning. As far as I'm aware, we haven't special-cased it here. I think that keeps things straightforward and predictable:

Consider a site which references an HSTS domain with http instead of https. The first time the user visits that site, they will not have the HSTS bit and will just directly hit the http URL. That request (presumably) serves a real HTTP redirect, with all that entails. Now they pick up the HSTS bit. On a repeat visit, HSTS applies instead. If that upgrade does not behave the same as the first visit, the site now behaves differently for first and repeat visits.
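To make the model concrete, here's a rough sketch (all names hypothetical, not Chrome's actual implementation) of treating the HSTS upgrade as a synthesized 307 redirect, so repeat visits flow through the same redirect-handling path as the server's real http->https redirect did on the first visit:

```python
# Hypothetical sketch: an HSTS upgrade modeled as a synthesized redirect
# rather than an in-place URL rewrite. `hsts_hosts` stands in for whatever
# store of dynamic/preloaded HSTS entries the client keeps.
from urllib.parse import urlsplit, urlunsplit

def maybe_hsts_redirect(url: str, hsts_hosts: set[str]):
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in hsts_hosts:
        # 307 preserves the request method and body, matching what a real
        # server-sent redirect would have done on the first (non-HSTS) visit.
        return (307, urlunsplit(parts._replace(scheme="https")))
    # No HSTS match: the request proceeds unchanged.
    return (None, url)
```

The key property is that the caller sees the same kind of object (a redirect) in both the first-visit and repeat-visit cases, so everything downstream (redirect limits, tainting, etc.) applies uniformly.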

The developer of the site has probably visited the site many times and thus always tests with the HSTS bit. If HSTS and true redirects behave differently, developers will not notice if their page depends on the repeat-visit HSTS behavior. Applying uniform behavior means they test with the redirect and, if it causes a problem, can fix it by referencing https directly.

This doesn't apply to preloaded HSTS, but keeping preloaded and dynamic HSTS aligned also seems good for platform predictability. (Otherwise, getting on the preload list would suddenly change behavior.)

There are, *separately*, non-HSTS features like [Upgrade-Insecure-Requests](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Upgrade-Insecure-Requests) and now [general mixed-content upgrade](https://www.chromestatus.com/feature/4926989725073408). I'm not as familiar with how those work, but I believe they hook in much earlier in our implementation, near the renderer. Since those only use information from the issuer, not the site, the predictability issues don't apply. (First and repeat visits do the same thing.) Also, those features specifically want to apply before mixed content checks.

> Are there tests to ensure images end up being tainted and such?

Not aware of any tests here. That seems worthwhile, though I don't know if WPT is able to specify DNS records like this.

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/fetch/pull/1325#discussion_r733882286

Received on Thursday, 21 October 2021 17:06:32 UTC