- From: Mark Nottingham <mnot@mnot.net>
- Date: Sun, 24 Aug 2014 17:42:39 +1000
- To: Martin Thomson <martin.thomson@gmail.com>
- Cc: Mike Bishop <Michael.Bishop@microsoft.com>, Patrick McManus <mcmanus@ducksong.com>, William Chow <wchow@mobolize.com>, HTTP Working Group <ietf-http-wg@w3.org>
On 23 Aug 2014, at 3:57 am, Martin Thomson <martin.thomson@gmail.com> wrote:

> On 22 August 2014 10:30, Mike Bishop <Michael.Bishop@microsoft.com> wrote:
>> "While the stream identified by the promised stream ID is still open" - meaning that as long as the client has asked for it before the server has finished sending it? That's a fairly small amount of time, particularly if the resource is very small, but sounds like a good starting point.
>
> I'm sure that clients can fail to notice the END_STREAM flag for as long as they need to in order to ensure that the various races resolve in the right way...
>
> Trying to determine how long the window is after END_STREAM arrives in which clients can consider the response validated is nasty. I don't know how to finesse this other than turning a blind eye to small violations, the likes of which you (and Firefox too) are committing in this regard.

Yeah, this was by far the trickiest bit; the caching model is really only built for client-driven traffic, and server push inverts all of its assumptions and creates some nasty corner cases.

The best we can do is wave our hands about a bit and try to make sure there isn't too much implementation divergence... I'm OK with that, mostly because if you push something with CC: no-cache and it isn't considered valid five seconds later, well, you got what you asked for, didn't you?

Cheers,

--
Mark Nottingham   https://www.mnot.net/
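[Editor's note: a minimal sketch, not from the thread, of the client-side rule being debated above: a pushed response may be used without revalidation while the promised stream is still open, and once END_STREAM has been seen, ordinary cache semantics apply (so Cache-Control: no-cache forces revalidation). The names `PushedResponse` and `may_use_without_revalidation` are hypothetical and only illustrate the idea.]

```python
# Hypothetical model of the rule discussed above; not from the HTTP/2 spec text.
from dataclasses import dataclass, field


@dataclass
class PushedResponse:
    promised_stream_open: bool          # END_STREAM not yet seen on the pushed stream
    headers: dict = field(default_factory=dict)


def may_use_without_revalidation(resp: PushedResponse) -> bool:
    """Return True if the client may serve this pushed response from cache
    without contacting the origin, under the rule sketched above."""
    if resp.promised_stream_open:
        # Push is still "in flight": treat it as implicitly validated by the server.
        return True
    # Stream closed: fall back to normal HTTP caching semantics.
    cache_control = resp.headers.get("cache-control", "").lower()
    return "no-cache" not in cache_control


if __name__ == "__main__":
    in_flight = PushedResponse(promised_stream_open=True,
                               headers={"cache-control": "no-cache"})
    closed = PushedResponse(promised_stream_open=False,
                            headers={"cache-control": "no-cache"})
    print(may_use_without_revalidation(in_flight))  # True: stream still open
    print(may_use_without_revalidation(closed))     # False: "you got what you asked for"
```

The awkward part the thread identifies is the boundary: how long after END_STREAM a client may keep treating the push as validated is not something this simple check captures, which is why the discussion settles for tolerating small implementation differences.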
Received on Sunday, 24 August 2014 07:43:12 UTC