damian has asked for the wisdom of the Perl Monks concerning the following question:
hi guys,
I have a problem with directory walking. The code below searches all directories for .htm and .html files. The problem is that when I run the program, the search takes very long, because some of my directories contain hundreds of HTML files. My idea is to specify in the program which directories should be skipped, but I don't know how to do that here.
Anyway, here is the code; suggestions are welcome.
Oh, and by the way, &search_tags is the subroutine that takes the user's input and searches an HTML file for a given pattern. Thanks in advance.

    sub dirwalk {
        my $dir = $_[0];
        opendir my $dh, $dir or return;
        my @d_info = readdir $dh;
        closedir $dh;
        foreach my $n (@d_info) {
            next if $n =~ /^\.\.?$/;            # skip . and ..
            if (-d "$dir/$n") {
                dirwalk("$dir/$n");             # recurse into subdirectory
            } else {
                next unless $n =~ /\.(htm|html)$/;
                my ($r, $line) = &search_tags("$dir/$n");
                push @pages, "$dir/$n" if $r;
            }
        }
    }
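For what it's worth, one way to skip directories is to keep their names in a hash and test it before recursing. This is only a sketch of that idea, not the poster's actual code: the `%skip` list and the `dirwalk` signature below are made up for illustration, and the pattern-matching step (&search_tags) is left out so the example stays self-contained.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical skip list -- replace with the directory names you want excluded
my %skip = map { $_ => 1 } qw(images backup tmp);

sub dirwalk {
    my ($dir, $pages) = @_;
    opendir my $dh, $dir or return;
    my @entries = readdir $dh;
    closedir $dh;
    foreach my $n (@entries) {
        next if $n =~ /^\.\.?$/;        # skip . and ..
        my $path = "$dir/$n";
        if (-d $path) {
            next if $skip{$n};          # skip unwanted directories by name
            dirwalk($path, $pages);     # recurse into the rest
        } elsif ($n =~ /\.(htm|html)$/i) {
            push @$pages, $path;        # collect HTML files found
        }
    }
}
```

Passing the result array by reference instead of using a global `@pages` also makes the subroutine easier to reuse; a call would look like `dirwalk('/some/root', \my @pages)`.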