WARNING: The following text is much like the ramblings of an old man, etc ...
Ahhh ... the ol' "remove old files" script. Had to do a few of these
over the years, at numerous companies. It was usually a result of the
file system filling up with "old" log files (eg log001, log002, etc).
The easiest thing to do back then was to run the one-liner Unix "find" command
with an "-mtime" test followed by an "-exec" or "-delete" action
(see http://unixhelp.ed.ac.uk/CGI/man-cgi?find). Just an option to consider ...
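A minimal sketch of that one-liner (GNU find/touch assumed; the scratch directory and the 7-day cutoff are just for illustration):

```shell
logdir=$(mktemp -d)                       # stand-in for a real log directory
touch -d '10 days ago' "$logdir/log001"   # an "old" file
touch "$logdir/log002"                    # a fresh file

# Preview first: list files last modified more than 7 days ago
find "$logdir" -type f -mtime +7 -print

# Then delete them (only once the preview looks right)
find "$logdir" -type f -mtime +7 -delete

# Portable alternative where -delete is unavailable:
# find "$logdir" -type f -mtime +7 -exec rm -f {} \;
```

Note the "-print" pass before the "-delete" pass - same habit as the TEST-first lesson below.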
... Anyway, the important thing here is that in the early days I made a complete
mess of things when I didn't TEST the script first.
Nowadays I have a scheduled task that archives/zips the old (1-week) files
to another directory (much like a trashcan). Another task deletes these
files from the archive directory sometime later (once they are 2 or
more weeks old). Of course, you can manually delete the files at any time,
knowing that they have been backed up on tape by the system administrators - right?
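A rough sketch of that two-stage scheme (the scratch directories and the 1-week/2-week cutoffs are my own stand-ins, not the actual setup; mv and gzip both preserve modification times, which is what makes the second stage work):

```shell
logdir=$(mktemp -d)      # stand-in for the live log directory
archive=$(mktemp -d)     # stand-in for the trashcan/archive directory
touch -d '8 days ago'  "$logdir/log001"       # week-old live file
touch -d '20 days ago' "$archive/log000.gz"   # long-archived file

# Task 1 (e.g. run daily from cron): move week-old files to the archive, then gzip
find "$logdir" -type f -mtime +7 -exec mv {} "$archive/" \;
find "$archive" -type f ! -name '*.gz' -exec gzip {} \;

# Task 2: purge archived files once they are 2 or more weeks old
find "$archive" -type f -mtime +14 -delete
```

Each task is a one-liner, so either can be pasted straight into a crontab entry.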
Some tips (hopefully it's not too late ...)
In your script (perl or otherwise):
* Have an option to only display the files to be deleted (a dry run), or to display them before deletion with a confirmation prompt
* Have an option to archive rather than delete old files (eg move to another directory and gzip)
* Have an option to "restore" files from the archive (ie an "undelete")
* As you become more confident, allow handling of files via "regular expressions" - handy for file names containing unusual characters or spaces, etc.
* Perhaps you can also consider searching for files based on attributes such as file size, (modified) date ranges, and file type
* Log what has been archived (or restored) or deleted, the time and the *user id*
* If this is your first script - ever - and you are basically performing a "rm *.* www.*" then for goodness sake do not put your name on the script!
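A toy sketch of the first few tips (dry run, archive, restore, logging). The function name, the mode names, and the 7-day cutoff are my own assumptions, not code from anyone's actual script:

```shell
cleanup() {
    mode=$1 logdir=$2 archive=$3
    case $mode in
        # Dry run: display the files that WOULD be archived, touch nothing
        list)    find "$logdir" -type f -mtime +7 -print ;;
        # Archive rather than delete: move old files to the trashcan directory
        archive) find "$logdir" -type f -mtime +7 -exec mv {} "$archive/" \; ;;
        # "Undelete": move everything (except our own log) back again
        restore) find "$archive" -type f ! -name cleanup.log -exec mv {} "$logdir/" \; ;;
    esac
    # Log what was done, when, and by which user id
    echo "$(date '+%F %T') $(id -un) $mode" >> "$archive/cleanup.log"
}
```

Usage would be along the lines of `cleanup list /some/logdir /some/archive` (hypothetical paths) to preview, then `cleanup archive ...` once happy with the list.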
- Laz.