Re: large file transfers, was: Please admit defeat

WebDAV is not base HTTP. It's an extension, and it might as well have been an application running over HTTP.

I agree that support for FTP is waning because its use is waning (although the RFC Editor only lets you download the whole RFC repository over either FTP or rsync, and even the index is only available via FTP).

Anyway, my point is not that FTP is all that great. It's that if the claims that pages load twice as fast with HTTP/2 are true, we should not hold it back until we see a benefit for every other use of HTTP.

Yoav

On May 26, 2014, at 1:01 PM, Julian Reschke <julian.reschke@gmx.de> wrote:

> On 2014-05-26 11:48, Yoav Nir wrote:
>> Translation of text encoding (ASCII/EBCDIC, CR/LF/CRLF)
> 
> I seriously believe that's not needed anymore.
> 
>> Listing directories
> 
> RFC 4918.
> 
>> Getting or putting multiple files based on wildcard with or without user prompt
> 
> RFC 4918, plus a client that uses it properly.
> 
>> Slightly less verbose (very slightly…)
> 
> How is this relevant for *large* files?
> 
>> With old-style FTP, each resource got its own TCP connection, and so its own flow control. That one we are getting with HTTP/2.
> 
> You can do that in HTTP/1.1 as well, if you want.
> 
>> We used to move files using FTP. We moved to HTTP not because it was better, and not because of a lack of client support (all OSes have an FTP client, and all browsers support ftp:// URIs), but because everybody installed a firewall and half of those blocked FTP, but they all let HTTP through.
> 
> I hear that browser makers are considering removing "ftp:" support.
> 
> The last time I used FTP was a few years ago, when I had to fix a bug in essentially unmaintained Mozilla code that I had previously broken by improving the compliance of their URI parsing code.
> 
> Best regards, Julian
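For context, the RFC 4918 mechanism Julian points to for directory listings is the PROPFIND method with a Depth: 1 header, which enumerates a collection's immediate members much like FTP's LIST. A minimal sketch using Python's standard library follows; the server URL is hypothetical, and the request is only constructed, not sent:

```python
# Sketch of a WebDAV (RFC 4918) directory listing via PROPFIND.
# "https://example.com/dav/files/" is a hypothetical collection URL.
import urllib.request

body = (b'<?xml version="1.0" encoding="utf-8"?>'
        b'<D:propfind xmlns:D="DAV:"><D:allprop/></D:propfind>')

req = urllib.request.Request(
    "https://example.com/dav/files/",  # hypothetical WebDAV collection
    data=body,
    method="PROPFIND",                 # RFC 4918 method, not base HTTP
    headers={
        "Depth": "1",                  # immediate members only, like FTP LIST
        "Content-Type": 'application/xml; charset="utf-8"',
    },
)

# Sending req with urllib.request.urlopen(req) against a WebDAV server
# would return a 207 Multi-Status response enumerating the members.
```

As the thread notes, the protocol side is covered by RFC 4918; what FTP-style usability actually needs is "a client that uses it properly."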

Received on Monday, 26 May 2014 10:28:30 UTC