Okay, at first I was stumped, but I seem to have a working solution.
I have added the following to my <VirtualHost> directive for my
website, and it has the desired effect:
<Directory /home/www/docs/dylan.wibble.net/ga-test/>
    Order Deny,Allow
    Deny from All
    Allow from None
    <FilesMatch "\.(php|jpg|png)$">
        Allow from All
    </FilesMatch>
</Directory>

<LocationMatch "/ga-test/$">
    Order Deny,Allow
    Allow from All
</LocationMatch>
The directory has the following files:
index.php index.php~ test.jpg test.png test.txt
The following URLs are allowed:
http://dylan.wibble.net/ga-test/
http://dylan.wibble.net/ga-test/index.php
http://dylan.wibble.net/ga-test/test.jpg
http://dylan.wibble.net/ga-test/test.png
The following URLs are forbidden:
http://dylan.wibble.net/ga-test/test.txt
http://dylan.wibble.net/ga-test/index.php~
As the <LocationMatch> directive is not allowed in .htaccess files,
this will only work in httpd.conf or inside a <VirtualHost> block.
You wouldn't believe some of the ways I tried first, then this
suddenly dawned upon me :)
This was tested on Apache 1.3.27 running on a Debian GNU/Linux server.
Information Sources:
http://httpd.apache.org/docs/mod/directives.html
Regards,
sycophant-ga

Clarification of Answer by sycophant-ga on 21 Jan 2003 02:03 PST
The posted answer does not work for the URL without the trailing
slash.
This is easy to fix, either by adding another LocationMatch for the
location with no slash, or by modifying the included one to make the
slash optional.
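For example, making the slash optional is a one-character change to
the pattern, keeping the same rules as the block above:

<LocationMatch "/ga-test/?$">
    Order Deny,Allow
    Allow from All
</LocationMatch>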
It's always the little things...
Regards,
sycophant-ga

Request for Answer Clarification by tcordes-ga on 21 Jan 2003 03:19 PST
Got it! You got me thinking about that LocationMatch idea, which I
had investigated a bit before but for some reason gave up on. I guess
I was miffed I couldn't do LocationMatch in a <Directory> stanza,
which means that now the opening of that "hole" will apply to every
directory on my server (I'm not really using virthosts here). I
solved the missing trailing slash problem also and tweaked it so that
the rule would also work on the root of the domain (www.foo.com) as
well as all subdirs (www.foo.com/bar/sub).
I'm trying to figure out the security implications though. I'm
wondering if the new rule takes precedence over standard
out-of-documentroot and <Directory> rules? It doesn't seem to, but a
little assurance would be nice!
Here's what I'm using:
<Directory /work/foo/web>
    AllowOverride All
    Options Indexes Includes FollowSymLinks ExecCGI
    Order allow,deny
    <FilesMatch "\.(phtml|css|gif|jpg|png)$">
        Allow from 127.0.0.1
        Allow from 192.168.100
    </FilesMatch>
</Directory>

<LocationMatch "/([^.]+)?$">
    Order allow,deny
    Allow from 127.0.0.1
    Allow from 192.168.100
</LocationMatch>
Feel free to comment. I knew there had to be a solution to this.
Someone add it to the apache directive cookbook!

Clarification of Answer by sycophant-ga on 21 Jan 2003 14:13 PST
Yeah, I was wondering whether you would need this to apply to
subdirectories as well; however, your question made it seem as if you
had a specific directory you wanted to protect.
As far as I have been able to determine, from the documentation and
from experimentation, access rules can be overridden at any point in
the configuration; however, the config is read top down. Therefore I
don't believe the config you have posted poses any risk, and it should
work, as long as all files have an extension on them and no
directories have a period in their names.
Another option I began to investigate was rewrite rules, where it is
actually possible to test for directories and so on. It seems there
could be some promise there, but I was unable to come up with anything
workable.
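For what it's worth, the rough shape of what I was trying looked
something like this (an untested sketch only, borrowing the extension
list from your FilesMatch; I never got a variant of it working to my
satisfaction):

RewriteEngine On
# Let requests that map to a real directory (e.g. directory indexes) through.
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} -d
RewriteRule .* - [L]
# Forbid anything that does not end in one of the permitted extensions.
RewriteCond %{REQUEST_URI} !\.(phtml|css|gif|jpg|png)$
RewriteRule .* - [F]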
The one thing I do notice in your posted rules is the lack of a "Deny
from All", which means that requests will be accepted regardless of
the match, I believe. I have been caught out by that before.
Also, you can limit the directories you allow it to apply to by
hardcoding them, but obviously the practicality will vary depending on
how many directories that may be:
<LocationMatch "/(work|images|stuff)([^.]*)$">, for example, would
apply to the three directories listed and all their subdirectories.
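Spelled out in full, with the same allow rules as your existing block
(the directory names are just placeholders):

<LocationMatch "/(work|images|stuff)([^.]*)$">
    Order allow,deny
    Allow from 127.0.0.1
    Allow from 192.168.100
</LocationMatch>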
I hope that clears up your concerns about the security, and I will
keep looking for other options.
Regards,
sycophant-ga

Request for Answer Clarification by tcordes-ga on 21 Jan 2003 21:15 PST
Re: top-down processing & security: Yes, it appears to work that way.
I can live with not being able to have extensionless files, and with
keeping dots out of directory names.
I tried tons of rewrite rules and couldn't get anything to even come
close to working! Those rewrites are pure voodoo and appear to be
applied AFTER the basic security settings, not before, which would not
help us.
The "Order allow,deny" according to the docs will deny everything not
explicitly allowed. So you don't need a deny from all. Try it, it
works.
Re: hard-coding directories: good idea, but not practical with the
large number of dirs I will have.
Bottom line: problem solved. Thanks for your help!

Clarification of Answer by sycophant-ga on 21 Jan 2003 23:06 PST
I am glad I could help you find a working solution anyway.
Another idea that has occurred to me that may be practical, although
outside the scope of your original question, is to do the protection
in a server-side script, for example a PHP script, which can easily be
preloaded before every page request. I am not sure how that might work
for images and non-parsed files, though. I am sure there is a way...
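Just to sketch the wiring (assuming mod_php, and a made-up path for
the guard script), the preloading itself could be done from the Apache
config:

<Directory /work/foo/web>
    # Hypothetical guard script; it would have to check REMOTE_ADDR itself
    # and refuse disallowed clients.
    php_value auto_prepend_file /work/foo/lib/access-check.php
</Directory>

That still leaves images and other non-parsed files uncovered, as
noted above.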
Feel free to drop me an email about that if you think there may be
something worth pursuing there; I may toy with the ideas anyway. My
email address is anything@my virtual host.
Regards,
sycophant-ga