PerlMonks |
Re: Writing data in chunks to a DB handle
by samtregar (Abbot) on Aug 16, 2008 at 16:42 UTC ( [id://704703] )
Dealing with file uploads in a web environment is a real pain. I've certainly been there. Everything else the user sends you is so well behaved - just a bunch of text you can stick in a table and be done with it! It's very tempting to think that you can get away with treating file uploads the same way, but I think it's a mistake to go that way.
MySQL is not a distributed file-system, and it's not tuned to handle huge files like this. The problem of inserting the file without loading it into memory is really just the start of your woes. How will you get the file out again? How will you do backups of your DB now that mysqldump is producing dumps that contain every file any user ever uploaded? Will you re-invent directories at some point? Permissions too? What will you do when you want to process the uploaded files with an external tool (resizing images, transcoding video, etc.)?

It's not too late to set up an NFS server! It's not as hard as you probably think, and once you have one set up you'll find other uses for it (configuration, shared code installs, etc.) and stop leaning on your DB so hard.

-sam
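To make the alternative concrete, here's a minimal sketch of the "filesystem for bytes, database for metadata" approach: stream the upload to disk in fixed-size chunks so it never sits in memory, then record only the path and size in MySQL. The `uploads` table, its columns, and the sub names are my own assumptions for illustration, not anything from the original post; the caller is assumed to have a DBI handle already connected.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Spec;
use Digest::MD5 qw(md5_hex);

# Hypothetical helper: copy an upload filehandle (e.g. the one CGI.pm
# hands you) to a file under $storage_root, 64KB at a time, so the
# whole upload is never held in memory.
sub save_upload_to_disk {
    my ($upload_fh, $storage_root, $orig_name) = @_;

    # Never trust the client-supplied filename for the on-disk name.
    my $safe_name = md5_hex($orig_name . time . $$) . '.dat';
    my $path = File::Spec->catfile($storage_root, $safe_name);

    open my $out, '>', $path or die "open $path: $!";
    binmode $out;
    binmode $upload_fh;

    my $buf;
    while (read($upload_fh, $buf, 64 * 1024)) {
        print {$out} $buf;
    }
    close $out or die "close $path: $!";
    return $path;
}

# Hypothetical helper: only the path and metadata go into MySQL,
# not the file bytes. The `uploads` table is an assumed schema.
sub record_upload {
    my ($dbh, $orig_name, $path) = @_;
    $dbh->do(
        'INSERT INTO uploads (original_name, path, bytes) VALUES (?, ?, ?)',
        undef, $orig_name, $path, -s $path,
    );
}
```

With this split, mysqldump backups stay small, and external tools (image resizers, transcoders) can work on the files directly by path.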
In Section: Seekers of Perl Wisdom