PerlMonks

Re: Combining Ultra-Dynamic Files to Avoid Clustering (Ideas?)
by matija (Priest) on Jul 24, 2004 at 06:33 UTC ( [id://377095] )
Let's keep things in perspective: a million 4KB files is 4GB of space. Today, that costs about $4. Tomorrow, it will cost less. So unless you're talking about a legacy system, space is not that much of a concern. Speed and convenience of access should be, though.

Managing a million files is going to be a major pain in the neck. Putting them into a directory tree about three levels deep will reduce the server load, but it will still be a major complication in your program.

I agree with the other people who said you should use a database. Yes, in a way a database is doing what you planned to do. However, the people who programmed that database spent a lot of time and used many sophisticated algorithms to get the database to do this stuff efficiently.

Realistically speaking, if you use a database, this is maybe a couple of hours of work (mostly spent learning the basics of SQL :-). If you decide to program the whole thing yourself, you will spend days to weeks getting it right. If the cost of 4GB of space was a concern, what is the cost of several weeks of your work? What is the cost of the work of whoever has to maintain that system after you're done implementing it?

Use a database.
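For the record, here is roughly what the three-level directory tree would look like if you did go the roll-your-own route. This is only a sketch, and the sub names (`path_for`, `store_record`) are mine, not anything from the original poster's system: hash the record key, then use two hex digits per level so each directory holds at most 256 subdirectories.

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);
use File::Path  qw(make_path);
use File::Spec;

# Map a record key to a path three directory levels deep, e.g.
# $root/91/2e/c8/912ec803b2ce49e4a541068d495ab570.dat
sub path_for {
    my ($root, $key) = @_;
    my $h = md5_hex($key);
    return File::Spec->catfile(
        $root,
        substr($h, 0, 2), substr($h, 2, 2), substr($h, 4, 2),
        "$h.dat",
    );
}

# Write one record, creating the intermediate directories on demand.
sub store_record {
    my ($root, $key, $data) = @_;
    my $file = path_for($root, $key);
    my (undef, $dir, undef) = File::Spec->splitpath($file);
    make_path($dir);
    open my $fh, '>', $file or die "open $file: $!";
    binmode $fh;
    print {$fh} $data;
    close $fh or die "close $file: $!";
    return $file;
}
```

Reading a record back is the same `path_for` lookup plus an open and read. Which rather proves the point above: with DBI, those two subs collapse into one `INSERT` and one `SELECT`, and the database takes care of the clustering problem for you.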
In Section: Seekers of Perl Wisdom