Re: Semantics of multiple 103s in Early Hints

2017-08-08 17:16 GMT+09:00 Stefan Eissing <>:
> Ok, time to clarify the semantics, then words will come more easily.

Yeah, I agree with the approach.

> In my mind, any header that comes with a 103 is not a header of the
> resource that is requested. Same as with 100 and 101.
> For a client/cache/intermediate any 1xx header is neutral to the
> set of resource headers in the final response.

I agree with the above.

> RFC 7231 says (ch. 6.2) "If the request did not contain an Expect
> header field containing the 100-continue expectation, the client
> can simply discard this interim response."
> And "101 Switching Protocols" is obviously hop-by-hop anyway, so
> it may never reach the consumer of the real response.
> For 103, this is also true. Any HTTP/2 intermediate should discard
> unknown 1xx responses, so the server also cannot assume that its
> headers reach all recipients.

I do not agree. RFC 7231 section 6.2 states:

   A proxy MUST forward 1xx responses unless the proxy itself requested
   the generation of the 1xx response.

So it would be natural to assume that proxies that do not understand
103 will forward the hints to the client.
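
To illustrate what that means for the client side, here is a minimal sketch (all names are illustrative, not from the draft) of a client taking the union of Link header fields across every 103 response it receives, whether injected by an intermediary or sent by the origin:

```python
class EarlyHintsCollector:
    """Accumulates hints from all 103 responses seen for one request."""

    def __init__(self):
        self.links = []  # keep arrival order, drop exact duplicates

    def on_interim_response(self, status, headers):
        if status != 103:
            return  # 100/101 carry no hints about the resource
        for name, value in headers:
            if name.lower() == "link" and value not in self.links:
                self.links.append(value)


c = EarlyHintsCollector()
# First 103, e.g. added by an intermediary from its own policy:
c.on_interim_response(103, [("Link", "</style.css>; rel=preload; as=style")])
# Second 103 from the origin, overlapping with the first:
c.on_interim_response(103, [("Link", "</style.css>; rel=preload; as=style"),
                            ("Link", "</app.js>; rel=preload; as=script")])
# The client then acts on the union of the two responses.
```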

> That also relieves intermediates from any obligation of "folding"
> 103s into the final response (in case they do not want or can not
> send on the 103s).
> Which also requires any origin that *wants* these headers in the
> final response about the resource to duplicate them in its
> response.
> Personally, I think of 1xx as a signalling channel from server
> to client that can be used before the response is sent.
> For 100 and 101, this is pretty clear. For 103 there is some
> temptation to see Link: headers as part of the final response,
> but they really are not.
> My interpretation.
> -Stefan
>> Am 08.08.2017 um 09:38 schrieb Willy Tarreau <>:
>> On Tue, Aug 08, 2017 at 03:57:16PM +0900, Kazuho Oku wrote:
>>> 2017-08-08 14:29 GMT+09:00 Willy Tarreau <>:
>>>> What about something like this instead :
>>>>  A client must be prepared to receive multiple 103 (Early Hints) responses
>>>>  in any order coming from multiple intermediaries as well as the origin
>>>>  server along the path between the client and the server. Given that such
>>>>  agents will often rely on different but overlapping policies to emit these
>>>>  responses, it is likely that some header fields may be repeated. The client
>>>>  is expected to simply consider the union of all these header fields as if
>>>>  they were received in a single response.
>>> Thank you for your comments and the suggestion.
>>> I am worried about adding statements on how an intermediary could
>>> generate 103 responses and using that as the reasoning for why the
>>> client should consider the union of the header fields as the
>>> server-provided expectation.
>> But isn't it expected to be the reality? In the other example I gave
>> it makes a lot of sense and it becomes obvious:
>>  - for clients that any such response may be partial;
>>  - for clients that any such response may overlap with others;
>>  - for clients that even the union of these responses does not
>>    provide an exhaustive list;
>>  - for clients that the final response will probably not contain the
>>    whole list of links given that some might have speculatively been
>>    added in anticipation;
>>  - for servers that they don't know whether upfront intermediaries
>>    have already sent equivalent links;
>>  - for servers that, given the uncertainty of what is done upfront,
>>    they have to provide the final list with the final response.
>>> This is because it is technically
>>> possible to require each intermediary to build (i.e. calculate the
>>> union of) and emit a complete set of header fields for every 103
>>> response that it sends.
>> In fact not exactly. Most of these will be built while processing the
>> request, long before the response arrives. When the response arrives,
>> conditions may have changed. For example in haproxy when processing
>> the response we don't have access to the request elements anymore. A
>> rule based on the Host field or on the URI prefix would be matched
>> only during the request and not during the response (we have the
>> ability to artificially copy them into variables for such explicit
>> processing but it's not natural). I tend to think that we should keep
>> in mind that what intermediaries add there is approximate but helps
>> fill dead time by speculatively preloading contents that will likely
>> be needed, and that the final word belongs to the server's final
>> response. We could possibly even suggest that elements that were
>> learned from 103 and not yet prefetched could be aborted if they do
>> not appear in the final response.
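
To make that last suggestion concrete, here is a rough set-based sketch (the function name is hypothetical) of how a client could decide which speculative fetches to abandon once the final response arrives:

```python
def prefetches_to_abort(hinted_links, final_links, not_yet_fetched):
    """Return links learned from 103 responses that the final response
    no longer lists, restricted to fetches still pending."""
    return (set(hinted_links) - set(final_links)) & set(not_yet_fetched)


hinted = {"/style.css", "/app.js", "/maybe.js"}   # union of all 103 hints
final = {"/style.css", "/app.js"}                 # links in final response
pending = {"/maybe.js", "/app.js"}                # prefetches not done yet

# "/maybe.js" was hinted speculatively but is absent from the final
# response, so its pending prefetch can be aborted.
print(prefetches_to_abort(hinted, final, pending))  # {'/maybe.js'}
```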
>>> Therefore, my preference goes to either (re)stating the general rule
>>> (i.e. nonexistence in 103 is not an indication of absence in the final
>>> response), or to state the expected behavior of the endpoints without
>>> any reasoning. I also think that we should keep the "a server can
>>> omit" statement, since it would be a direct answer for people
>>> wondering how a server should adjust the expectation that it has
>>> already sent.
>> I see your point but I tend to think that explaining the workflow like
>> above makes all these responses much more obvious in fact. If you're
>> willing to go down that route, I can try to help provide a paragraph
>> to (try to) make this more natural.
>>> Considering these points, how about using something like the below
>>> for the last paragraph:
>>>   While emitting a series of 103 (Early Hints) responses, a server can
>>>   omit a header field that was included in one response in the
>>>   following responses, even when it is anticipated that the header
>>>   field will be part of the final response.
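
One way to read that paragraph is as an incremental emission scheme on the server side; a hedged sketch of that reading (the send_103 hook is hypothetical):

```python
def emit_early_hints(send_103, discovered_batches):
    """Send each newly discovered hint exactly once across a series of
    103 responses; omitting a hint later does not retract it."""
    sent = []
    for batch in discovered_batches:
        new = [link for link in batch if link not in sent]
        if new:
            send_103(new)  # hypothetical hook writing one 103 response
            sent.extend(new)
    return sent


responses = []
emit_early_hints(responses.append,
                 [["/style.css"], ["/style.css", "/app.js"]])
# Two 103 responses are sent; the second omits /style.css, but that
# does not mean the earlier hint was withdrawn.
print(responses)  # [['/style.css'], ['/app.js']]
```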
>> To be honest I'm having difficulties parsing it, which is not a good
>> sign for a standard ;-)
>> It would be nice if some native English speakers could bring some
>> help here; some of our sentences are not always the most natural we
>> can think of.
>> Cheers,
>> Willy

Kazuho Oku

Received on Tuesday, 8 August 2017 08:55:46 UTC