Re: Potential HTTP Security Risk

According to BearHeart / Bill Weinman:
>    I just noticed in the WWW Security FAQ a notation that some 
> servers, including NCSA, allow the file ".htaccess" to be retrieved. 
> I tried it with my Apache 1.0 server and I got the file. 
>    Perhaps the following modification of the proposed section 12.5 
> would help: (change marks in the left column are relative to Paul 
> Hoffman's message that began this thread)
>  | 12.5  Attacks Based On URL Contents
>    Implementations of the HTTP servers should be careful to restrict the
>    documents returned by HTTP requests to be only those that were intended
>    by the administrators. If an HTTP server translates HTTP URIs directly
>    into file system calls, the server must take special care not to serve
>    files outside the desired directory tree. For example, Unix, Microsoft
>    Windows, and other operating systems use ".." as a path component to
>    indicate a directory level above the current one. A URL with such
>    constructs can be constructed to potentially allow access to files
>    outside the desired directory structure, and should thus be disallowed.
>  + Many servers implement a system of access-control files within the 
>  + document directory tree that may contain sensitive security- or 
>  + implementation-related information. A URL which references a filename 
>  + which is used for access-control files, or a filename pattern 
>  + commonly used for system files (e.g. "/." for Unix systems, or ".PWL" 
>  + for Microsoft Windows systems), should be disallowed. A server should 
>  + make a configuration option available to the system administrator to 
>  + ensure that this protection is made sufficiently flexible for 
>  + site-specific security considerations. 
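The check the proposed section describes can be sketched in a few lines. This is purely illustrative, not code from any actual server; the forbidden-name list is an assumption standing in for a site-configurable one:

```python
import posixpath

# Filenames and patterns commonly used for access-control or system
# files. Illustrative only -- a real server would make this list a
# configuration option, as the proposed text suggests.
FORBIDDEN_NAMES = {".htaccess", ".htpasswd"}
FORBIDDEN_SUFFIXES = (".pwl",)

def is_request_allowed(url_path):
    """Reject URL paths that climb out of the document tree via ".."
    or that name an access-control or system file."""
    depth = 0
    for part in url_path.split("/"):
        if part in ("", "."):
            continue
        if part == "..":
            depth -= 1
            if depth < 0:          # climbed above the document root
                return False
        else:
            if part.lower() in FORBIDDEN_NAMES:
                return False
            if part.lower().endswith(FORBIDDEN_SUFFIXES):
                return False
            depth += 1
    return True
```

Note that the ".." test tracks depth rather than merely searching for the substring, so "/a/../b.html" (which stays inside the tree) passes while "/a/../../etc/passwd" does not.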

These are just a couple of instances of a whole class of potential
problems associated with a design decision for some server
implementations.  Most (but not all) server implementations treat the
path part of a requested URL as a file system path and make the
default action to serve that file unless it is somehow explicitly
forbidden.  This is the reason that /../ is problematic and that so
much attention must be paid to keeping the server restricted to its
data hierarchy.  It is also the reason that minor bugs so often turn
into security problems (e.g. the NCSA bug that caused a "cgi-bin//" in
a URL to be treated as an ordinary file path rather than a reference
to the cgi-bin directory, and hence allowed script sources to be
served).  It is also the reason that .htaccess files get served.  They
seem especially problematic since you can't set permissions on
.htaccess files to make them unreadable by the server.

There are still more examples, like viewing CGI sources by requesting
foo.cgi~ in a directory where foo.cgi lives.  And there are almost
surely other problems we aren't yet aware of.  In my view these all
stem from taking a (potentially hostile) request from a user and by
default serving it unless it is somehow explicitly forbidden.  I am
all for warnings in the HTTP specification, but it is not very
realistic to think that any collection of warnings will really remedy
the situation.  You simply can't warn against all the possible risks
associated with this design.
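The alternative design is default-deny: serve a file only when it is explicitly allowed, rather than serving anything not explicitly forbidden. A minimal sketch, under the assumption of a simple allowed-extension list (the names here are hypothetical, not any server's actual configuration):

```python
import posixpath

# Extensions the administrator has explicitly chosen to serve;
# anything else is refused.  Illustrative assumption only.
SERVABLE_EXTENSIONS = {".html", ".txt", ".gif", ".jpg"}

def resolve_request(doc_root, url_path):
    """Default-deny resolution: map a URL path to a file only if it
    carries an explicitly allowed extension."""
    # Normalize away "." and ".." before touching the file system;
    # the leading "/" makes normpath discard any ".." that would
    # climb above the root.
    clean = posixpath.normpath("/" + url_path.lstrip("/"))
    ext = posixpath.splitext(clean)[1].lower()
    if ext not in SERVABLE_EXTENSIONS:
        return None        # refuse: not explicitly allowed
    return doc_root + clean
```

Under this scheme foo.cgi~, .htaccess, and whatever the next surprise turns out to be are all refused automatically, because nothing ever made them servable in the first place.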

John Franks

Received on Thursday, 28 December 1995 08:11:19 UTC