- From: Max Bruce <max.bruce12@gmail.com>
- Date: Sun, 5 Apr 2015 17:35:44 -0700
- To: Willy Tarreau <w@1wt.eu>
- Cc: "ietf-http-wg@w3.org" <ietf-http-wg@w3.org>
- Message-ID: <CABb0SYTV6J_dcS2BanOL6VzCFUgWd0=FmuC83hh-1xtWsaGiZA@mail.gmail.com>
Sorry, I formatted it as a request; this would be more proper:

HTTP/1.1 206 Partial Content (or some custom status code if you want to be original)
X-Req-ID: blah
(other headers MAY be necessary, but probably not; the original response will likely have everything)

FFFF
65535 bytes of data

On Sun, Apr 5, 2015 at 5:28 PM, Max Bruce <max.bruce12@gmail.com> wrote:

> I get your point, but using something like chunked transfer encoding, you
> could use the same idea of ID-assigned packets to have intermittent
> requests.
> However, generally you're not performing requests while streaming, but
> support would be for the best, and I'll definitely set out to do this. I'm
> thinking we have response-like chunked encoding for each chunk, e.g.:
>
> request for video
>
> chunk 1 =
> CHK * HTTP/1.1
> X-Req-ID: -theid/orpush-
>
> FFFF
> 65535 bytes of stream.
>
>
> On Sun, Apr 5, 2015 at 5:16 PM, Willy Tarreau <w@1wt.eu> wrote:
>
>> On Sun, Apr 05, 2015 at 04:55:04PM -0700, Max Bruce wrote:
>> > Each request adds an X-Req-ID to the headers, as such:
>> >
>> > GET / HTTP/1.1
>> > X-Req-ID: random-unique-per-connection-id
>> > other headers...
>> >
>> > and the server responds...
>> >
>> > HTTP/1.1 200 OK
>> > X-Req-ID: request-id-here
>> > X-Req-Target: /
>> > other headers...
>> >
>> > BUT, it can also respond twice or more for server pushing:
>> >
>> > HTTP/1.1 200 OK
>> > X-Req-ID: -PUSH-
>> > X-Req-Target: /logo.png
>> > other headers...
>>
>> But that doesn't work. You don't have multiplexed streams this way, just
>> full requests then full responses. Try to download two videos in parallel
>> and you'll get the first video in full, and only once it has completely
>> downloaded will you get the second one. What you are proposing here is
>> just a very minor add-on to what we were already doing with HTTP/1:
>> basically you allow the server to respond out of order, that's all.
>>
>> H2 supports multiple streams over the same TCP connection to limit the
>> bufferbloat issue, to reduce the connection count, and to try to
>> preserve the TCP congestion windows. And it supports these streams
>> in *parallel*; yours are serialized.
>>
>> Regards,
>> Willy
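For illustration, a minimal sketch of how a receiver might demultiplex the ID-tagged chunk framing described above (a mini header block carrying X-Req-ID, a blank line, a hex chunk length, then that many bytes of payload). The exact frame layout, the demux_frames helper, and the vid1/vid2 IDs are assumptions made up for this example, not part of any specification:

```python
# Illustrative only: toy demultiplexer for ID-tagged frames as sketched in the
# thread above. The framing is an assumption: header block ending with a blank
# line, then a hex length line, then that many payload bytes.
import io
from collections import defaultdict

def demux_frames(stream):
    """Group payload bytes per X-Req-ID from a stream of ID-tagged frames."""
    buffers = defaultdict(bytearray)
    while True:
        line = stream.readline()                  # status/CHK line of the next frame
        if not line:
            break                                 # end of stream
        req_id = None
        while line.strip():                       # header block ends at a blank line
            if line.lower().startswith(b"x-req-id:"):
                req_id = line.split(b":", 1)[1].strip().decode()
            line = stream.readline()
        size = int(stream.readline().strip(), 16)  # hex chunk length, e.g. FFFF = 65535
        payload = stream.read(size)
        if req_id is not None:
            buffers[req_id].extend(payload)
    return buffers

# Two interleaved frames for two different request IDs on one connection
# (small 5-byte payloads instead of FFFF-sized ones, to keep the example short).
raw = (b"HTTP/1.1 206 Partial Content\r\nX-Req-ID: vid1\r\n\r\n5\r\nhello"
       b"HTTP/1.1 206 Partial Content\r\nX-Req-ID: vid2\r\n\r\n5\r\nworld")
print(dict(demux_frames(io.BytesIO(raw))))
# -> {'vid1': bytearray(b'hello'), 'vid2': bytearray(b'world')}
```

Whether such framing achieves HTTP/2-style parallelism depends on the sender actually interleaving chunks from different responses on the wire; if whole responses are still sent back to back, it degenerates into the out-of-order-but-serialized behaviour Willy describes.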
Received on Monday, 6 April 2015 00:36:12 UTC