Of course it's eating all of your system cache: you're trawling through TWO MILLION files. The comments suggesting reading the directories instead of the files are excellent; I only have a couple of minor tidbits to add:
- If you have a way to hook into the process that creates the PDFs, you could write to a one-table database, or even a text file, listing each file and when you want it to expire. Then the cleanup program only has to scan the list for files past their expiration date and delete them. (You could just about do that in a batch file.)

- If you can change the structure of where the files are stored, put them in directories named for when the files should go away, like '2004-01-20', and delete any directories older than today.

-- Spring: Forces, Coiled Again!

In reply to Re: Some File::Find woes. by paulbort
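The first suggestion might look something like this. It's just a sketch, assuming a hypothetical tab-separated list file with one "path<TAB>expiration-epoch" entry per line; your list format and storage will differ:

```perl
use strict;
use warnings;

# Delete every file whose expiration time has passed, then rewrite
# the list so only the still-live entries remain.
sub purge_expired {
    my ($list_file) = @_;
    my $now = time;

    open my $fh, '<', $list_file or die "Can't open $list_file: $!";
    my (@keep, $deleted);
    while (my $line = <$fh>) {
        chomp $line;
        my ($path, $expires) = split /\t/, $line, 2;
        if ($expires <= $now) {
            unlink $path and $deleted++;
        }
        else {
            push @keep, $line;    # not yet expired; keep it listed
        }
    }
    close $fh;

    # Rewrite the list with only the surviving entries.
    open my $out, '>', $list_file or die "Can't rewrite $list_file: $!";
    print $out "$_\n" for @keep;
    close $out;

    return $deleted;
}
```

Note the program never touches the two million files themselves: it reads one small list, so the filesystem cache is left alone.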
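And the date-named-directory suggestion can be sketched like so, assuming a hypothetical base directory whose subdirectories are named YYYY-MM-DD for the day their contents should go away:

```perl
use strict;
use warnings;
use File::Path qw(rmtree);

# Remove every date-named subdirectory of $base whose date is before
# today. ISO dates sort lexically, so a plain string compare suffices.
sub purge_dated_dirs {
    my ($base) = @_;
    my ($d, $m, $y) = (localtime)[3, 4, 5];
    my $today = sprintf '%04d-%02d-%02d', $y + 1900, $m + 1, $d;

    opendir my $dh, $base or die "Can't read $base: $!";
    my @purged;
    for my $name (readdir $dh) {
        next unless $name =~ /^\d{4}-\d{2}-\d{2}\z/;   # only date-named dirs
        next unless -d "$base/$name";
        if ($name lt $today) {
            rmtree("$base/$name");
            push @purged, $name;
        }
    }
    closedir $dh;
    return sort @purged;
}
```

One rmtree() per expired day replaces millions of per-file stat calls, which is the whole point of the directory layout.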