PerlMonks
Hi, a while ago I asked the wisdom of the Monks about a simple site-mapping tool (see Web Site Mapping Tool). I got a few useful replies (many thanks) and went ahead and built it. It works well, and I'm happy with it, except that I know it has at least one big security hole!
I'm working within a secured intranet, so I don't expect many hacking attempts, but you can never tell. Besides, I can't be sure that someone else in the company won't use the script in a less secure environment; who ever reads warnings and comments when in a hurry? The problem lies in the way the script works out which directory tree to map. Here is a summary of how it works:
The obvious problem with this is that if you pass a path such as /allowed/../../notallowed by hand, where the approved path is /allowed/, it gets through, and File::Find then traverses a tree it's not supposed to (directory permissions aside). I've done the obvious thing of stripping any "..", but I know that there are many more ways to bypass that. Tainting and untainting won't help directly either. chroot isn't an option on my NT box, and annoyingly NT permissions, as currently set, don't prevent much either. On a Linux box (the next phase) I should be able to use chroot and file permissions to confine the script a bit better, but I'd like the script to be more robust by default.

QUESTION: How do I take in a path from the outside and verify that it's safe to pass to File::Find?

As ever, humble thanks in advance.

In reply to Controlling Inputted Paths in a CGI Script by ajt
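A minimal sketch of one common defence (my own suggestion, not from the original post): rather than trying to strip out ".." sequences, resolve both the approved root and the candidate path to canonical absolute paths with Cwd::abs_path, then check that the candidate lies at or below the root. The safe_path helper name is hypothetical; abs_path also follows symlinks, which closes that bypass too. Note that abs_path returns undef (or dies on some platforms) for paths that don't exist, which is a reasonable failure mode for a mapping tool.

```perl
use strict;
use warnings;
use Cwd qw(abs_path);

# Hypothetical helper: return the canonical form of $candidate if it
# lies inside $root (or is $root itself), otherwise return undef.
sub safe_path {
    my ($root, $candidate) = @_;

    # Canonicalise both paths; this collapses "..", ".", and symlinks.
    my $abs_root = abs_path($root)      or return undef;
    my $abs_cand = abs_path($candidate) or return undef;

    # Accept only the root itself or something strictly beneath it.
    return undef
        unless $abs_cand eq $abs_root
            or index($abs_cand, $abs_root . '/') == 0;

    return $abs_cand;
}
```

A path like "$root/sub/../../etc" canonicalises to something outside $root and is rejected before File::Find ever sees it. The prefix test uses the canonical separator '/' that abs_path emits (including on Win32, where it normalises backslashes), so "/allowed-evil" is not mistaken for a child of "/allowed".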