What the flock!? Concurrency issues in file writing.
by suaveant (Parson)
on Oct 01, 2008 at 15:48 UTC
suaveant has asked for the wisdom of the Perl Monks concerning the following question:
I have a report writer that was designed to fork for better speed: it takes chunks of the report and runs them in parallel, writing the results to a file which is later sorted. When we were on Solaris we had no known issues with this, but when we moved to Linux we suddenly started getting inconsistencies in the data. Basically, lines were getting cut off, lost, etc. It would seem the children were stomping on each other's toes.
I have been hacking at this for half a day and nothing is working. I build chunks of data (all the lines for my child's output), flock the output file, and write the chunk to it. Originally I used open and print with flock, then I tried manually flushing the file handle (which apparently flock already does anyway), then I tried converting everything to sysopen and syswrite. I made sure LOCK_EX and LOCK_UN were defined... I just can't figure out what is going on. I thought flock was fine as long as everything was using it....
Here is the code snippet I am currently using to no avail:
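(The original snippet did not survive; here is a minimal sketch of the pattern described above, with assumed filename report.out and placeholder chunk data: build the whole chunk first, take an exclusive lock, write once, then unlock.)

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock O_WRONLY O_APPEND O_CREAT);

# Placeholder for the chunk a child would build before writing.
my $chunk = join '', map { "row $_\n" } 1 .. 3;

# O_APPEND so each write lands at end-of-file even across processes.
sysopen(my $fh, 'report.out', O_WRONLY | O_APPEND | O_CREAT)
    or die "sysopen: $!";

flock($fh, LOCK_EX) or die "flock: $!";           # exclusive lock
syswrite($fh, $chunk) // die "syswrite: $!";      # unbuffered write
flock($fh, LOCK_UN) or die "unlock: $!";          # release lock
close($fh) or die "close: $!";
```

Note that flock is advisory: it only protects writers that all take the lock, and mixing buffered print with unbuffered syswrite on the same file can still interleave output.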
Any ideas? Or, even better, ways to write data to the file that are fork-safe?