PerlMonks  

Re: Multi threading

by BrowserUk (Patriarch)
on Apr 06, 2009 at 10:13 UTC ( [id://755677] )


in reply to Multi threading

If you want to serialise writes to a single log file concurrently from multiple threads of execution, then locking a simple shared variable will achieve that (using threads & threads::shared):

#! perl -slw
use strict;
use threads;
use threads::shared;

our $N ||= 100;

sub worker {
    my $tid = threads->tid;
    my( $log, $semRef, $from, $to ) = @_;

    for my $file ( $from .. $to ) {
        ## Simulate doing some processing
        sleep 1+rand( 2 );

        ## Lock the log file semaphore before writing
        lock $$semRef;

        ## And write to the log
        printf $log "[%2d] Processesing file%3d\n", $tid, $file;

        ## The lock is released automatically at the end of the block
    }
}

## A shared variable used as a semaphore for the log file resource
my $logSem :shared;

## Open the log file in the main thread
open my $log, '>', 'myLog' or die $!;

## create the workers passing the log file handle and semaphore
my @threads = map{
    threads->create( \&worker, $log, \$logSem, $_*$N, $_*$N + $N -1 );
} 0 .. 4; ## 5 threads each processing 100 "files"

## Wait till they are done
$_->join for @threads;

## close the log
close $log;

That's just a simplistic demo of the technique. If it is of interest, and you need help adapting it to your needs, please describe those needs more clearly.


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.

Replies are listed 'Best First'.
Re^2: Multi threading
by sandy1028 (Sexton) on Apr 07, 2009 at 05:10 UTC
    Hi, thanks for the reply. When I run this script initialising
    $from = 0; $to = 1000; only 500 entries end up in the file, and it prints:

        [ 3] Processesing file200
        [ 3] Processesing file201
        [ 3] Processesing file202
        ....
        [ 3] Processesing file220
        [ 3] Processesing file299
        [ 5] Processesing file400
        [ 5] Processesing file401
        [ 5] Processesing file498
        [ 5] Processesing file499
        [ 1] Processesing file 0
        [ 1] Processesing file 1
        [ 1] Processesing file 2
        ..
        [ 1] Processesing file 99
        [ 2] Processesing file100
        [ 2] Processesing file101
        [ 2] Processesing file102
        [ 2] Processesing file103
        [ 2] Processesing file104
        [ 2] Processesing file105
        ..
        [ 2] Processesing file198
        [ 2] Processesing file199
        [ 4] Processesing file300
        [ 4] Processesing file301
        ......
        [ 4] Processesing file397
        [ 4] Processesing file398
        [ 4] Processesing file399
    Only 500 entries are in the file, and the files are not processed in sequential order. Can you please help me with this? How do I use file locks with the code I mentioned in the previous thread, i.e. forking processes that all write to a single log file?
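    For the fork-based variant asked about here, a minimal sketch (not from the original thread) of serialising writes from forked children with flock; the filename 'myLog', the 5-worker layout, and the 100-files-per-worker ranges are assumptions mirroring the threaded demo above:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw( :flock );
use IO::Handle;

my $logfile = 'myLog';

my @pids;
for my $worker ( 0 .. 4 ) {                 ## 5 child processes
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;

    if ( $pid == 0 ) {                      ## child process
        ## Each child opens its own handle in append mode; autoflush
        ## so a line is on disk before the lock is released.
        open my $log, '>>', $logfile or die $!;
        $log->autoflush( 1 );

        for my $file ( $worker * 100 .. $worker * 100 + 99 ) {
            flock $log, LOCK_EX or die $!;  ## block until we own the lock
            printf $log "[%2d] Processing file%4d\n", $worker, $file;
            flock $log, LOCK_UN;            ## release for the next writer
        }
        close $log;
        exit 0;
    }
    push @pids, $pid;                       ## parent keeps the child's pid
}

waitpid $_, 0 for @pids;                    ## wait for all children to finish
```

    As with the threaded demo, lines from different processes never interleave, but the overall ordering across workers is still whatever order the processes happen to reach the lock.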