PerlMonks
Re: (OT) should i limit number of files in a directory
by BrowserUk (Patriarch) on Sep 11, 2008 at 19:00 UTC ( [id://710714] )
Storing large lumps of binary data in an RDBMS makes no sense. You cannot use relational logic on it; it just takes more space on the filesystem; and it takes longer to access.

From the experience of a project a few years ago, using a 3-level-deep filesystem hierarchy keyed on MD5 checksums distributes the files very evenly; that is almost guaranteed by the very nature of MD5. For your project, you end up with 4096 directories with 750 files in each. We had tens of millions of files in a 4-deep hierarchy (on a Linux system) and lookup was fast and reliable. Loading the data was the same. And the time it takes to produce the path from a given MD5 is negligible.

Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
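A minimal sketch of the idea, assuming the common variant that splits the hex digest into two-character directory levels (the post does not give its exact splitting scheme, and the `md5_path` helper name is mine):

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

# Hypothetical helper: derive a 3-level-deep storage path from a blob's
# MD5 digest. The first three hex-digit pairs become directory levels;
# the full digest is the filename. Because MD5 output is effectively
# uniform, files spread evenly across the 256^3 possible directories.
sub md5_path {
    my ($data) = @_;
    my $hex = md5_hex($data);
    my ($l1, $l2, $l3) = $hex =~ /^(..)(..)(..)/;
    return "$l1/$l2/$l3/$hex";
}

print md5_path("hello"), "\n";
```

Lookup is the same computation in reverse: given the stored MD5, the path is recomputed directly, so no index or database round-trip is needed.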
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
In Section: Seekers of Perl Wisdom