PerlMonks
It is very unlikely that blastn takes a GB-size sequence as input on the command line! Most likely the command expects you to provide the name of a file which holds that huge data. So, in all likelihood, you must 1) write your hash to a file, if it is indeed in the memory of the Perl script (and not in a file already!), and then 2) make the "system call", providing the filename as part of the command's arguments.

If the expected output is also huge, make sure to instruct blastn to write its output to a file. Do not read it back from the output of the command (stdout)! Perhaps use the -o outfile option, or simply redirect your command's output to a file, although redirection is not an elegant solution if you are doing it via Perl's system command.

The above procedure is fine if you create/calculate/transform that hash inside the Perl script. But just to make sure: if you merely read the hash from a file, do not change it in any way, and then run blastn on it (which implies writing it back to a file, as I recommend above), then you are doing something wrong: pass the original file to blastn directly.

Since you have a lot of RAM available, it is also worth investigating a RAM-disk, which you have to create first; in fact all your data could go there, including temporary files. OR, use memory-mapped files, perhaps read on File::Map.

bliako

In reply to Re^3: System call doesn't work when there is a large amount of data in a hash by bliako
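A minimal sketch of the flow described above: dump an in-memory hash of sequences to a FASTA file, then hand blastn a filename instead of raw data, and let blastn write its own output file so nothing huge flows through stdout. The names here (%seq, 'mydb', 'result.txt') are illustrative, and the exact blastn flags depend on your BLAST version, so check its documentation.

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my %seq = (seq1 => 'ACGT' x 10, seq2 => 'GGCC' x 10);   # toy stand-in for the real hash

# 1) write the hash to a file in FASTA format
my ($fh, $query_file) = tempfile(SUFFIX => '.fa', TMPDIR => 1);
for my $id (sort keys %seq) {
    print {$fh} ">$id\n$seq{$id}\n";
}
close $fh or die "close failed: $!";

# 2) build the command, passing the filename; -out (hypothetical database
#    name 'mydb') tells blastn to write its own output file
my @cmd = ('blastn', '-query', $query_file, '-db', 'mydb', '-out', 'result.txt');

# list form of system() avoids the shell; run only if blastn is installed
if (grep { -x "$_/blastn" } split /:/, ($ENV{PATH} // '')) {
    system(@cmd) == 0 or warn "blastn exited with $?";
}
print "query written to $query_file\n";
```

The list form of system() also sidesteps shell-quoting problems with the filename.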
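And a sketch of the memory-mapped alternative mentioned above, using File::Map (a CPAN module, not core Perl). A tiny demo file stands in for a GB-size FASTA; the mapped scalar is backed by the file, so the data is not copied into Perl's heap.

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);
use File::Map qw(map_file);

# create a small demo FASTA file
my ($fh, $file) = tempfile(SUFFIX => '.fa', TMPDIR => 1);
print {$fh} ">a\nACGT\n>b\nGGCC\n";
close $fh or die "close failed: $!";

map_file my $data, $file;            # read-only mapping by default
my $records = () = $data =~ /^>/mg;  # scan the file without slurping a copy
print "records: $records\n";
```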