I wrote a Perl script that does some basic file operations, in this case moving files around based on their creation date.
I have a large folder with 60,000+ files inside, and when I run this particular Perl script, it exits without any output and no errors.
Is there a memory limit or something I could be running into?
Here's a snippet:
use File::Basename qw(fileparse);
use File::Copy qw(move);
use File::stat;                # stat() now returns an object with ->mtime
use POSIX qw(strftime);

for my $file (<*.txt>) {
    $count++;
    # Anchor the suffix pattern so only a trailing ".txt" is stripped
    my ($name, $path, $suffix) = fileparse($file, qr/\.txt$/);
    my $info = stat($file);
    my $datestamp  = strftime(".%Y%m%d", localtime($info->mtime));
    my $datestamp2 = strftime("%Y%m%d",  localtime($info->mtime));

    # print "\nMaking directory if it doesn't exist: $dst2\\$datestamp2";
    mkdir "$dst2\\$datestamp2"
        or warn "Error making directory: $!\n"
        unless -d "$dst2\\$datestamp2";
    print "\n Moving \"$file\" to >> $dst2\\$datestamp2\n";
    move $file, "$dst2\\$datestamp2\\$name$suffix"
        or warn "Cannot move $file: $!\n";

    exit if $count > $limit;
}
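One thing I considered: a glob in list context builds the entire 60,000-element file list in memory before the loop starts. A sketch of the same move-by-date loop using opendir/readdir, which reads one directory entry at a time (the sub name `sort_txt_by_mtime` and the example paths are my own, not from the script above):

```perl
use strict;
use warnings;
use File::Copy qw(move);
use File::Path qw(make_path);
use File::Spec;
use POSIX qw(strftime);

# Walk $src with readdir instead of a list-context glob, so all
# 60,000 names are never held in memory at once.
sub sort_txt_by_mtime {
    my ($src, $dst) = @_;
    opendir my $dh, $src or die "Cannot open $src: $!";
    while (defined(my $entry = readdir $dh)) {
        next unless $entry =~ /\.txt\z/;              # only .txt files
        my $path = File::Spec->catfile($src, $entry);
        next unless -f $path;

        # (stat)[9] is the mtime; name the subdirectory after it
        my $stamp  = strftime("%Y%m%d", localtime((stat $path)[9]));
        my $subdir = File::Spec->catdir($dst, $stamp);
        make_path($subdir) unless -d $subdir;         # creates parents too

        move($path, File::Spec->catfile($subdir, $entry))
            or warn "Cannot move $path: $!\n";
    }
    closedir $dh;
}

# Example (hypothetical paths):
# sort_txt_by_mtime('C:/incoming', 'C:/sorted');
```

File::Spec keeps the path separators portable, which avoids hand-building `\\`-separated Windows paths in interpolated strings.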