Windows OS
NTFS filesystem
If you literally just want to count the files on the entire disk, this is by far the fastest simple method I know of.
It counts the 1.2 million files on my cold-cache, 640GB (400GB used) drive in a little under 7 minutes:
my $t = time;
my $n = `attrib /s c:\\* | wc -l`;   # list every file on the drive; count the lines
chomp $n;                            # strip the trailing newline from wc's output
printf "%d : %d\n", $n, time() - $t;
1233597 : 394
Try it and see how you fare. I vaguely remember finding a faster method years ago; I'll see if I can recall enough to dig it up.
Note: don't do my @files = `attrib /s c:\\*`; my $n = @files; instead. Allocating memory for 1.2 million list elements slows things down horribly. (A streaming variant that avoids both the array and wc is sketched below.)
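For reference, here is a minimal streaming sketch of the same idea, assuming the same attrib /s invocation as above: read attrib's output line by line and count as it arrives, so nothing accumulates in memory and no external wc is needed.

my $t = time;
open my $fh, '-|', 'attrib /s c:\\*' or die "attrib: $!";  # pipe directly from attrib
my $n = 0;
$n++ while <$fh>;   # count lines as they arrive; nothing is retained
close $fh;
printf "%d : %d\n", $n, time() - $t;

Whether this beats piping through wc will depend on your Perl build and disk; the I/O is the dominant cost either way.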
find /current_dir/ -type f | parallel -k -j150% -n 1000 -m stat -c %s > sizeslist && awk '{sum+=$1}END{print "Total Bytes:",sum}' sizeslist
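A portable Perl equivalent of that pipeline (a sketch; '/current_dir/' stands in for whatever root you care about) does the same size sum with File::Find and no external tools:

use File::Find;
my $sum = 0;
# add up the size of every plain file under the tree;
# -s _ reuses the stat buffer filled by the -f test
find( sub { $sum += -s _ if -f $_ }, '/current_dir/' );
print "Total Bytes: $sum\n";

This does one stat per file, like stat -c %s above, but skips the per-batch process-spawning overhead; it won't parallelize across cores, though.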
If your only concern is speed, I think you should rewrite it in C.
(The first place I'd start looking is the source of du or df.)
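To make that concrete, here is roughly the walk du performs, sketched in Perl with a raw opendir/readdir recursion (a C rewrite would do the same with opendir(3)/readdir(3) and stat(2); '/current_dir/' is again just a placeholder root):

# Recurse the tree directly and add up plain-file sizes.
sub du_bytes {
    my ($dir) = @_;
    my $sum = 0;
    opendir my $dh, $dir or return 0;    # skip unreadable directories
    for my $e (readdir $dh) {
        next if $e eq '.' || $e eq '..';
        my $path = "$dir/$e";
        if (-d $path && !-l $path) { $sum += du_bytes($path); }  # descend, but not into symlinks
        elsif (-f _)               { $sum += -s _; }             # reuse the cached stat buffer
    }
    closedir $dh;
    return $sum;
}
print "Total Bytes: ", du_bytes('/current_dir/'), "\n";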