RE: Large content size value

Larry Masinter said:
> > if a client can't download (or a server can't serve) a
> > file bigger than the FS can handle
> It is possible for a client to use range retrieval
> to get parts of a large file, even if the client couldn't
> store the whole thing because of file size limitations.
> (This can happen with JPEG2000 image files, for example.)
> It's quite possible for a server to serve a dynamically
> generated resource that is bigger than can fit into a
> single file on the file system.
> So I don't think the protocol limits and the underlying
> operating system file size limits should be linked
> in any way.

Excellent point! I completely ignored dynamic content (though I would think that would almost always be served with chunked transfer encoding in practice). You've also brought up a great solution.

Since it's possible for the client to detect when a Content-Length or a chunk size is too large, SHOULD the client then attempt a series of byte-range requests instead? This would solve all the problems I mentioned earlier, assuming the server implements that part of the protocol (anyone know which servers do, in practice?).
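For concreteness, here's a rough sketch of what I mean by "a series of byte-range requests" — everything here (the helper name, the 2 GiB piece size) is my own invention for illustration, not anything the spec prescribes:

```python
def plan_ranges(content_length: int, piece: int) -> list:
    """Given an advertised Content-Length too large to handle in one GET,
    plan Range header values covering [0, content_length) in pieces the
    client *can* handle. (Hypothetical helper; piece size is arbitrary.)"""
    ranges = []
    start = 0
    while start < content_length:
        # Range uses inclusive byte positions: bytes=first-last
        end = min(start + piece, content_length) - 1
        ranges.append("bytes=%d-%d" % (start, end))
        start = end + 1
    return ranges

# A 5 GiB resource fetched in 2 GiB pieces needs three range requests:
plan_ranges(5 * 2**30, 2 * 2**30)
```

The client would then issue one GET per Range value and, on a 206 Partial Content, append the body to wherever it is reassembling the resource (which, per Larry's JPEG2000 example, it may not even need to store in full).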

Also, in regard to connection handling: as far as I can tell, the client is going to have to close the connection if an oversized Content-Length shows up, since it won't be able to read through to the start of the next response reliably. If this is the case, is it specified? It might make for a nice suggestion (not a requirement).
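To make the connection-handling case concrete, a small sketch of the decision a client would face (the limit of 2^31-1 here is purely an assumed local constraint, standing in for whatever the client's platform can actually count to):

```python
# Assumed local limit, e.g. a signed 32-bit byte counter. Purely illustrative.
MAX_LOCAL_LENGTH = 2**31 - 1

def must_close(content_length_header: str, limit: int = MAX_LOCAL_LENGTH) -> bool:
    """Return True if the client has to drop this persistent connection.

    If the client cannot consume Content-Length bytes, it cannot find the
    boundary where the next response begins, so the only safe recovery is
    to close and reconnect.
    """
    try:
        n = int(content_length_header)
    except ValueError:
        return True  # malformed length: framing is unknowable, so close
    return n > limit
```

A value the client can handle lets it keep the connection alive as usual; anything beyond its limit (or unparseable) forces the close-and-reconnect path.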


-- Travis

Received on Thursday, 4 January 2007 23:50:34 UTC