PerlMonks
Re: Large chunks of text - database or filesystem?
by cyocum (Curate) on Mar 19, 2005 at 10:09 UTC
I would use a database for your problem. The reason? Ease of transferring the information from one machine to another: everything is stored in a few files and can be moved to a new database much more easily than millions of individual files. Hard drive space is relatively cheap these days, and having millions of small files scattered across the filesystem is inefficient in my opinion (as always, I could be terribly wrong).

For efficient searching, I would use Plucene. Internally, MyISAM tables do support compression, but only for read-only tables. I would run Plucene from cron or a scheduled task, or have it triggered by an insertion. Don't worry about speed: Plucene (at least as far as I know from Lucene, from which Plucene came) buffers partial indexes in memory before flushing to disk, so additions to the index should be pretty quick.

For the storage requirements of the TEXT types, check the MySQL documentation here. That should tell you how much disk space each post actually costs. I hope this helps.
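To make the Plucene suggestion concrete, here is a minimal sketch using the Plucene::Simple wrapper from CPAN. The index path, the post id, and the `body` field name are all made-up examples, and the row data would really come from your database (e.g. via DBI) rather than a literal string:

```perl
#!/usr/bin/perl
use strict;
use warnings;

use Plucene::Simple;    # CPAN module; wraps the lower-level Plucene API

# Open (or create) an index directory; path is a placeholder.
my $index = Plucene::Simple->open('/var/index/posts');

# Index one post by its database id. In practice you would loop over
# new rows from cron, or call this right after each INSERT.
$index->add(
    42 => { body => 'Large chunks of text stored in the database' },
);
$index->optimize;

# Later, search the index instead of scanning every row:
my @ids = $index->search('chunks');
print "matching post ids: @ids\n";
```

The ids returned by `search` can then be used in a plain `SELECT ... WHERE id IN (...)` to pull the matching posts back out of MySQL, so the database stays the single source of truth and the index is just a disposable accelerator.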
In Section: Seekers of Perl Wisdom