Re: Large chunks of text - database or filesystem?

by cyocum (Curate)
on Mar 19, 2005 at 10:09 UTC [id://440901]


in reply to Large chunks of text - database or filesystem?

I would use a database for your problem. The reason? Ease of transferring the information from one machine to another: everything lives in a handful of files and can be moved to a new database much more easily. Hard drive space is relatively cheap these days, and having millions of files scattered across the filesystem is inefficient in my opinion (as always, I could be terribly wrong).
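For illustration, here is a minimal sketch of what storing posts in MySQL via DBI could look like; the table layout, database name, and credentials are my own assumptions, not something from the original question.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details -- adjust database, user, and password.
    my $dbh = DBI->connect(
        "dbi:mysql:database=forum;host=localhost",
        "user", "password",
        { RaiseError => 1, AutoCommit => 1 },
    );

    # One table holds every post; TEXT comfortably covers large chunks of prose.
    $dbh->do(<<'SQL');
    CREATE TABLE IF NOT EXISTS posts (
        id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        title VARCHAR(255) NOT NULL,
        body  TEXT         NOT NULL
    ) ENGINE=MyISAM
    SQL

    # Insert a post; placeholders handle quoting and escaping for you.
    my $sth = $dbh->prepare("INSERT INTO posts (title, body) VALUES (?, ?)");
    $sth->execute("An example post", "The large chunk of text goes here...");

Backing everything up then amounts to dumping one database rather than walking a directory tree of millions of files.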

For efficient searching, I would use Plucene. Internally, MyISAM tables do have the ability to compress, but only for read-only tables. I would run the Plucene indexer as a cron or scheduled task, or have it triggered by an insertion. Don't worry about indexing speed: Plucene (at least as far as I know from Lucene, from which Plucene came) should buffer partial indexes in memory before flushing to disk, so it should be pretty quick about additions to the index.
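As a rough sketch of the indexing step, here is what it could look like with the Plucene::Simple convenience interface. The index path, field names, and the idea of feeding it rows from a posts table are assumptions for illustration; double-check the method names against the module's documentation.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Plucene::Simple;

    # Open (or create) an index directory -- the path is an assumption.
    my $index = Plucene::Simple->open("/var/lib/forum/plucene-index");

    # Index a post under its database id so a hit maps back to a row.
    # A cron job would loop over rows added since the last run; one
    # hard-coded post is shown here for brevity.
    $index->add(
        "post-42" => {
            title => "An example post",
            body  => "The large chunk of text goes here...",
        },
    );

    # Later, a search returns the ids of the matching documents.
    my @ids = $index->search("example");
    print "Matched: @ids\n";

Running this from cron (or right after the INSERT) keeps the index only a little behind the table, which is usually acceptable for full-text search.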

For the storage requirements of text columns, check the MySQL documentation here. That should tell you how much disk space each post costs (for example, a TEXT column stores the data itself plus a 2-byte length prefix).

I hope this helps.

