- From: Terje Bless <link@tss.no>
- Date: Wed, 18 Jul 2001 12:25:00 +0200
- To: Martin Duerst <duerst@w3.org>
- cc: www-validator@w3.org
On 18.07.01 at 19:04, Martin Duerst <duerst@w3.org> wrote:

>But I'm not sure we should leave it as it is. It would probably make sense
>to put some limit on overall file length, to avoid denial of service
>attacks.

Yes. Especially as a sufficiently large file will fill up the disk, and the
service will not recover without manual intervention. I've been meaning to
look into this for a while, but never got around to it.

However, this shouldn't necessarily mean that we limit the size of what we
can theoretically validate. It just means that we should handle large files
gracefully (i.e. in chunks) and allow local installations to place arbitrary
limits on the file sizes they will accept (with a configuration parameter or
some such).
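As a rough illustration of the "chunks plus a configurable cap" idea (a
sketch only; in Python rather than the validator's Perl, with a made-up
MAX_FILE_SIZE parameter standing in for whatever name the real
configuration option would have):

    import urllib.request

    # Hypothetical configuration parameter a local installation could set:
    # maximum document size in bytes.
    MAX_FILE_SIZE = 1_000_000  # 1 MB
    CHUNK_SIZE = 8192          # read the document in 8 KiB pieces

    class DocumentTooLarge(Exception):
        """Raised when a fetched document exceeds the configured limit."""

    def fetch_limited(url, max_bytes=MAX_FILE_SIZE):
        """Fetch a document in chunks, aborting once max_bytes is exceeded.

        Because we read incrementally, we never hold (or write out) more
        than the configured limit, so an oversized document cannot fill
        the disk.
        """
        data = bytearray()
        with urllib.request.urlopen(url) as response:
            while True:
                chunk = response.read(CHUNK_SIZE)
                if not chunk:
                    break
                data.extend(chunk)
                if len(data) > max_bytes:
                    raise DocumentTooLarge(
                        "%s exceeds the %d-byte limit" % (url, max_bytes))
        return bytes(data)

The point is just that the cap is enforced while reading, not after the
whole document has already landed on disk.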
Received on Wednesday, 18 July 2001 07:30:39 UTC