Re: Sorting Gigabytes of Strings Without Storing Them

by tilly (Archbishop)
on Dec 22, 2008 at 16:26 UTC ( [id://732133] )


in reply to Sorting Gigabytes of Strings Without Storing Them

I have to disagree with everyone. Standard "lists are too large" approaches such as Sort::External are coded to handle the case of too many lines, not lines that are too large. I would expect this to be true of the Unix sort utility as well, though it makes sense to try it to verify that.

Assuming that doesn't work, what you need to do is use read to scan the file and record the offsets of all of the line starts (each one is the byte right after a newline). Then you sort this array of offsets with a comparison function that does a string comparison between what is in the file at the two locations. You of course don't want to pull in the full lines to do that; instead use seek and read to move to the right spots, pull a block from each, and compare them, repeating as needed. Then you take this sorted array and use it to write out a sorted file by taking each offset, extracting the string in pieces, and writing it to the output.

This is kind of complex, and I'd write out a full explanatory example if I weren't currently typing one-handed due to a hand injury. :-(
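In outline, though, it would look something like this (an untested sketch; the file names and block size are placeholders, and it assumes the offset list itself fits in memory even though the lines don't):

    use strict;
    use warnings;

    my $in_file    = 'strings.txt';   # placeholder input name
    my $out_file   = 'sorted.txt';    # placeholder output name
    my $block_size = 64 * 1024;       # bytes compared/copied at a time

    open my $fh, '<', $in_file or die "Can't open $in_file: $!";
    binmode $fh;

    # Pass 1: record the start offset of every line.  The number of lines
    # is assumed to be modest even though each line may be huge, so this
    # array fits in memory.
    my @offsets = (0);
    my $pos = 0;
    while (my $n = read $fh, my $buf, $block_size) {
        while ($buf =~ /\n/g) {
            push @offsets, $pos + pos($buf);
        }
        $pos += $n;
    }
    # Drop the offset past the final newline if the file ends with one.
    pop @offsets if @offsets > 1 and $offsets[-1] >= (-s $in_file);

    # Read one block of the line starting at $off; return the block
    # (truncated at its newline, if any) and a flag saying whether the
    # line ends inside this block.
    sub read_chunk {
        my ($off) = @_;
        seek $fh, $off, 0;
        my $n = read $fh, my $buf, $block_size;
        my $done = (!$n or $buf =~ s/\n.*//s) ? 1 : 0;
        return ($buf, $done);
    }

    # Compare the lines starting at two offsets, one block at a time, so
    # no full line is ever held in memory.
    sub cmp_lines {
        my ($x, $y) = @_;
        while (1) {
            my ($x_buf, $x_done) = read_chunk($x);
            my ($y_buf, $y_done) = read_chunk($y);
            my $cmp = $x_buf cmp $y_buf;
            return $cmp if $cmp;
            # Identical so far: the shorter (finished) line sorts first.
            return $y_done - $x_done if $x_done or $y_done;
            $x += length $x_buf;
            $y += length $y_buf;
        }
    }

    my @sorted = sort { cmp_lines($a, $b) } @offsets;

    # Pass 2: copy each line to the output in blocks, in sorted order.
    open my $out, '>', $out_file or die "Can't open $out_file: $!";
    binmode $out;
    for my $off (@sorted) {
        my $p = $off;
        while (1) {
            my ($buf, $done) = read_chunk($p);
            print $out $buf;
            if ($done) { print $out "\n"; last }
            $p += length $buf;
        }
    }
    close $out or die "Can't close $out_file: $!";

Be warned that every comparison costs a couple of seeks and reads, so the I/O bill adds up quickly; a larger block size helps when lines share long common prefixes.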

Update: As BrowserUk notes, your question is unclear. I read it as lines up to 2 GB. If that is wrong, then this answer is inappropriate.
