Short answer: too many variables to tell, but you should be OK. I've got a CMS that sometimes has to read two megabytes of XML and manages it without becoming unusable. It's not the right way for me to be doing it (it just grew that way), but it still works fine. I suspect it will be issues of data integrity or structure that force you to switch, not capacity.
Anyway, if you use DBI with something like DBD::CSV to manage access to your flatfiles, you'll have a ready-made layer of abstraction between your scripts and the data. In that case the ceiling hardly matters: if you seem to be approaching it, you can switch to something with more headroom just by changing a line or two at the top of the script (and getting the IT people to provide a bit of access to the databases; not in that order).
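To make that concrete, here's a minimal sketch of the DBI/DBD::CSV approach. It assumes DBD::CSV is installed and that there's a `./data` directory holding your CSV flatfiles; the `users` table (i.e. `users.csv`) and its columns are made-up names for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Connect to a directory of CSV files as if it were a database.
# f_dir points at the directory containing the flatfiles.
my $dbh = DBI->connect("dbi:CSV:", undef, undef, {
    f_dir      => "./data",
    RaiseError => 1,
}) or die $DBI::errstr;

# Query the flatfile users.csv exactly as you would a real table.
my $sth = $dbh->prepare("SELECT name, email FROM users WHERE name = ?");
$sth->execute("alice");
while ( my ($name, $email) = $sth->fetchrow_array ) {
    print "$name <$email>\n";
}

$dbh->disconnect;
```

The point of the abstraction: moving to a real RDBMS later means changing essentially just the connect line (say, to `"dbi:mysql:dbname"` plus credentials); the prepare/execute/fetch code stays the same.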
There are useful discussions of DBI and DBM around, if you want to look into that further.