PerlMonks
Re: Sorting Gigabytes of Strings Without Storing Them
by tilly (Archbishop)
on Dec 22, 2008 at 16:26 UTC
I have to disagree with everyone. Standard "lists are too large" approaches such as Sort::External are coded to handle the case of too many lines, not lines that are too large. I would expect this to be true of the Unix sort utility as well, though it makes sense to try it and verify.

Assuming that doesn't work, what you need to do is use read to scan the file and find the starts of all the lines (each is 1 byte past a newline). Then you sort this array of offsets with a comparison function that does a string comparison between what is in the file at the two locations. You of course don't want to pull in the full line to do that; instead use seek and read to move to the right spots, pull a block from each, and compare them, repeating as needed. Then you take the sorted array and use it to write out a sorted file, taking each offset, extracting the string in pieces, and writing it to the output.

This is kind of complex, and I'd code you an explanatory example if I were not currently typing one-handed due to a hand injury. :-(

Update: As BrowserUk notes, your question is unclear. I read it as lines up to 2 GB. If that is wrong, then this answer is inappropriate.
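A rough sketch of what I mean, in the spirit of the description above. The block size, file name, and sub names are all illustrative (nothing here is from an existing module), and it is demonstrated on a toy file rather than gigabyte lines, but the shape is the same: no line is ever held in memory whole, only one block per side of a comparison.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $BLOCK = 4096;    # illustrative; tune for real workloads

# Read up to one block of the line starting at byte $pos.  Returns the
# piece (without any trailing newline) and a flag saying whether the
# line ended inside this block (newline seen, or EOF reached).
sub read_piece {
    my ($fh, $pos) = @_;
    seek $fh, $pos, 0 or die "seek: $!";
    my $buf = '';
    my $n = read($fh, $buf, $BLOCK) // 0;
    my $nl = index $buf, "\n";
    return (substr($buf, 0, $nl), 1) if $nl >= 0;
    return ($buf, $n < $BLOCK ? 1 : 0);
}

# Compare the lines starting at offsets $x and $y, one block at a time.
sub cmp_at {
    my ($fh, $x, $y) = @_;
    while (1) {
        my ($bx, $dx) = read_piece($fh, $x);
        my ($by, $dy) = read_piece($fh, $y);
        my $r = $bx cmp $by;
        return $r if $r;
        return $dy <=> $dx if $dx || $dy;  # equal so far; shorter line first
        $x += $BLOCK;
        $y += $BLOCK;
    }
}

# Scan the whole file once, recording the offset just past each newline.
sub line_offsets {
    my ($fh) = @_;
    my @off = (0);
    my $pos = 0;
    seek $fh, 0, 0;
    my $buf;
    while (my $n = read($fh, $buf, $BLOCK)) {
        my $i = -1;
        push @off, $pos + $i + 1 while ($i = index($buf, "\n", $i + 1)) >= 0;
        $pos += $n;
    }
    pop @off while @off and $off[-1] >= $pos;  # trailing newline starts no line
    return @off;
}

# Copy the line starting at $pos to $out in block-sized pieces.
sub copy_line {
    my ($in, $out, $pos) = @_;
    while (1) {
        my ($buf, $done) = read_piece($in, $pos);
        print {$out} $buf;
        if ($done) { print {$out} "\n"; return }
        $pos += $BLOCK;
    }
}

# Tiny demonstration on a throwaway file; in practice each line would be
# far larger than $BLOCK, which is the whole point of the exercise.
my $file = 'demo_lines.txt';
open my $w, '>:raw', $file or die "open: $!";
print {$w} "pear\napple\nbanana\n";
close $w;

open my $in, '<:raw', $file or die "open: $!";
my @sorted = sort { cmp_at($in, $a, $b) } line_offsets($in);
my $result = '';
open my $out, '>', \$result or die "open: $!";
copy_line($in, $out, $_) for @sorted;
close $out;
close $in;
unlink $file;
print $result;
```

Note that the sort itself still needs the offset array in memory, which is fine: even a 2 GB file of long lines has few offsets. It is the lines, not the line count, that this avoids storing.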
In Section: Seekers of Perl Wisdom