files larger than 2 gig

by SomeGuy (Initiate)
on May 09, 2002 at 19:07 UTC

SomeGuy has asked for the wisdom of the Perl Monks concerning the following question:

Hey guys, I'm currently running Perl 5.6.1 on Sun Solaris 8 and I'm having trouble with files greater than 2 GB in size. I've checked to make sure large file support is enabled: perl -V:uselargefiles reports uselargefiles='define'. I have many files greater than 2 GB on the Unix file system, and Unix utilities run fine against them. However, Perl crashes whenever it tries to open a file greater than 2 GB or write more than 2 GB of data to a file. I'm aware of the pipe workaround for reading files greater than 2 GB, but I don't know of any workaround for writing them, and I'd rather just get the large file support working. Is there anything special I need to do in the script? Ideas? Thanks, SomeGuy
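
For anyone debugging the same symptom, here is a minimal diagnostic sketch (not from the original post; the file name huge.dat is a placeholder, and it only assumes the standard Config module) that reports the build-time large-file settings and then tries to open a file and seek past the 2 GB boundary, so a failure shows up as a readable error rather than a crash:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Config;

    # Report the build-time large-file settings. For files over 2 GB,
    # uselargefiles should be 'define' and lseeksize should be 8
    # (64-bit seek offsets).
    print "uselargefiles: $Config{uselargefiles}\n";
    print "lseeksize:     $Config{lseeksize}\n";
    print "fpossize:      $Config{fpossize}\n";

    # 'huge.dat' is a placeholder; point this at a real >2 GB file.
    my $file = 'huge.dat';

    open my $fh, '<', $file or die "Cannot open $file: $!\n";
    printf "size via -s: %.0f bytes\n", -s $fh;

    # Seek just past the 2 GB boundary; with 32-bit offsets this is
    # where things typically break.
    seek $fh, 2**31 + 100, 0 or die "seek past 2 GB failed: $!\n";
    close $fh or die "close failed: $!\n";

If the open itself fails with "Value too large for defined data type" (EOVERFLOW), that usually means the perl binary was built without 64-bit file offsets despite uselargefiles being defined, and the perl -V:lseeksize output is worth comparing against the system's off_t size.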

Replies are listed 'Best First'.
Re: files larger than 2 gig
by Moonie (Friar) on May 09, 2002 at 20:56 UTC
Take a look at this node - it may help!
Re: files larger than 2 gig
by tachyon (Chancellor) on May 10, 2002 at 05:21 UTC

    You may find Re: Performance Question useful as well

    cheers

    tachyon

    s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print
