PerlMonks
need help debugging perl script killed by SIGKILL
by expo1967 (Sexton) on Mar 01, 2021 at 19:57 UTC
expo1967 has asked for the wisdom of the Perl Monks concerning the following question:

At the office I am working on a threaded Perl script on a Linux system. At first, as a test, I had the script work on a subset of a large data file. I gradually increased the amount of data to be processed and everything seemed fine. Now I seem to have reached a limit of some kind: the script dies with a "Killed" message and an exit status of 137, which means it was killed by a SIGKILL signal.

The main script creates a threaded queue and loads it with all of the data records from the large data file, then starts all the threads (currently 35). Each thread first sets an element in a shared hash to indicate that it has started, then loops over elements from the queue using a non-blocking fetch until nothing is returned, and finally sets the shared hash to indicate that it is done. The script detaches the threads after starting them.

After all of the threads are started, the main script goes into a loop waiting for all the threads to mark their status flags as done. If after a certain period of time not all the flags indicate DONE, the main script prints a message and exits.

Both the main script and the threads use signal handling, and no signals were caught, so I am guessing the script was likely killed because it used too much memory. Does a Perl threaded queue store its elements in memory? Any suggestions on what to check for memory issues, or any other likely suspects?
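For reference, the design described above can be sketched roughly as follows. This is a minimal reconstruction, assuming the queue is `Thread::Queue` and the status hash uses `threads::shared`; the thread count, record count, and timeout are stand-in values (the post uses 35 threads). One detail worth double-checking in the real script: `dequeue_nb` returns `undef` on an empty queue, so the loop condition should use `defined()`, or a record that evaluates false (`0`, `""`) would end a worker early.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use threads::shared;
use Thread::Queue;

my $NUM_THREADS = 4;        # stand-in; the post uses 35
my %status :shared;         # thread number => 'STARTED' / 'DONE'

# Main script loads every record into the queue up front.
my $queue = Thread::Queue->new;
$queue->enqueue(1 .. 1000); # stand-in for the large data file's records

for my $tid (1 .. $NUM_THREADS) {
    threads->create(sub {
        { lock %status; $status{$tid} = 'STARTED'; }
        # dequeue_nb returns undef when the queue is empty; test with
        # defined() so false-but-valid records don't stop the loop.
        while (defined(my $rec = $queue->dequeue_nb)) {
            # ... process $rec ...
        }
        { lock %status; $status{$tid} = 'DONE'; }
    })->detach;
}

# Main loop: wait, with a deadline, for every worker to flag DONE.
my $deadline = time + 60;
my $done     = 0;
while (time < $deadline) {
    { lock %status; $done = grep { ($status{$_} // '') eq 'DONE' } 1 .. $NUM_THREADS; }
    last if $done == $NUM_THREADS;
    select undef, undef, undef, 0.1;   # sub-second sleep between polls
}
print $done == $NUM_THREADS ? "all threads DONE\n" : "timed out waiting\n";
sleep 1;  # give detached threads a moment to finish exiting
```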
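On the memory question: `Thread::Queue` holds its pending items in a shared in-memory array, so enqueueing every record of a large file up front keeps the whole file (plus shared-variable overhead, multiplied across 35 threads each with its own interpreter copy) in RAM. An exit status of 137 with no signal caught by your handlers is consistent with the Linux OOM killer, which is worth confirming in the kernel log (`dmesg`, look for "Killed process"). One sketch of a mitigation, assuming `Thread::Queue` 3.01+ for the `limit`/`end` API: have a producer thread feed a size-limited queue instead of preloading it, so only a bounded number of records are in memory at once.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Thread::Queue;

my $q = Thread::Queue->new;
$q->limit = 100;   # enqueue() blocks while 100 items are pending (3.01+)

# Producer: stand-in for reading the large data file record by record.
my $producer = threads->create(sub {
    for my $rec (1 .. 10_000) {
        $q->enqueue($rec);   # blocks whenever the queue is full
    }
    $q->end;                 # after this, dequeue() returns undef when empty
});

# Workers use a blocking dequeue; end() above makes it return undef
# once the queue drains, so no busy non-blocking loop is needed.
my @workers = map {
    threads->create(sub {
        my $n = 0;
        while (defined(my $rec = $q->dequeue)) {
            $n++;            # ... process $rec ...
        }
        return $n;
    });
} 1 .. 4;

$producer->join;
my $total = 0;
$total += $_->join for @workers;
print "processed $total records\n";
```

With this shape the workers can also be joined rather than detached, which removes the need for the shared status hash and the polling loop.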