PerlMonks
Re: Multi threading

by Anonymous Monk
on Apr 07, 2009 at 07:29 UTC ( [id://755943] )


in reply to Multi threading

I urge you to reconsider your design. It would be much simpler if you had a designated process whose sole job is to write to the log file - a logger in the style of syslogd.
When one of the worker threads/procs wants to write to the log file, it sends a request to the logger. You have a number of mechanisms available for inter-process communication: sockets, message queues, or even a named pipe, provided the record is not too large.
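A minimal sketch of that pattern using a plain pipe: the logger child is the only process that ever touches the file, so the workers never contend for it. The file name `app.log` and the messages are invented for illustration.

```perl
use strict;
use warnings;
use IO::Handle;

# A dedicated logger child owns the log file; workers send lines
# down a shared pipe.
pipe(my $reader, my $writer) or die "pipe: $!";

my $logger_pid = fork() // die "fork: $!";
if ($logger_pid == 0) {              # logger child: the ONLY writer of the file
    close $writer;
    open my $log, '>>', 'app.log' or die "open app.log: $!";
    $log->autoflush(1);
    print {$log} $_ while <$reader>; # one place serializes all log writes
    exit 0;
}

close $reader;                       # parent/workers keep only the write end
$writer->autoflush(1);

for my $n (1 .. 3) {                 # any worker just prints to the pipe
    print {$writer} "worker message $n\n";
}
close $writer;                       # EOF tells the logger to finish
waitpid($logger_pid, 0);
```

This also explains the "provided the record is not too large" caveat: POSIX guarantees that writes of at most PIPE_BUF bytes (at least 512, typically 4096) to a pipe are atomic, so short records from many writers cannot interleave.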

Replies are listed 'Best First'.
Re^2: Multi threading
by sandy1028 (Sexton) on Apr 07, 2009 at 11:47 UTC
    Hi, suppose I am forking 5 processes. All five processes should write to 5 different log files.
    Can you please tell me how I can do that?
      There are more than 10000 articles in a directory. Using this code I am creating the 5 processes. Each process should read 100 articles at a time and write to the log file.
      While writing to the log file, some of the files are missed.
      How do I use a lock on these processes?
      How can I create different log files for all 5 processes?
      Can anyone please help me?
      my $pm = new Parallel::ForkManager(5);
      $pm->run_on_finish(
          sub { my ($pid, $exit_code, $ident) = @_; $tmpFiles[$ident] = undef; }
      );
      foreach my $i (0..$#tmpFiles) {
          # Forks and returns the pid for the child:
          my $pid = $pm->start($i) and next;
          $SIG{INT} = 'DEFAULT';
          my $filename = $tmpFiles[$i]->filename();
          my $file = IO::File->new("<$filename") or die "Can't open $filename\n";
          while (defined(my $line = $file->getline())) {
              chomp $line;
              my ($dir, $file) = split(/\t/, $line);
              $processor->($dir, $file, $config, $log);
          }
          $pm->finish; # Terminates the child process
      }
      $pm->wait_all_children;
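      One way to answer the per-process log question above is to have each child open its own file, named after the slot it was forked for. This sketch uses plain fork() so it stands alone; with Parallel::ForkManager you would build the same per-slot filename from the index you pass to $pm->start($i). The worker_*.log names are invented for illustration.

```perl
use strict;
use warnings;
use POSIX ();

# Each of the 5 children opens its OWN log file, named after its
# slot number, so no two processes ever share a file handle.
my @pids;
for my $slot (0 .. 4) {
    my $pid = fork() // die "fork: $!";
    if ($pid == 0) {                                   # child
        open my $log, '>', "worker_$slot.log" or die "open worker_$slot.log: $!";
        print {$log} "child $$ handling slot $slot\n";
        close $log;
        POSIX::_exit(0);                               # skip parent's cleanup code
    }
    push @pids, $pid;
}
waitpid($_, 0) for @pids;
```

      Because each process writes to a different file, no locking is needed at all for this layout.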
        sub processdir {
            my $pm = new Parallel::ForkManager(5);
            $pm->run_on_finish(
                sub { my ($pid, $exit_code, $ident) = @_; $tmpFiles[$ident] = undef; }
            );
            foreach my $i (0..$#Files) {
                # Forks and returns the pid for the child:
                my $pid = $pm->start($i) and next;
                $SIG{INT} = 'DEFAULT';
                my $filename = $Files[$i]->filename();
                my $file = IO::File->new("<$filename") or die "Can't open $filename\n";
                while (defined(my $line = $file->getline())) {
                    chomp $line;
                    my ($dir, $file) = split(/\t/, $line);
                    $processor->($dir, $file, $config, $log);
                }
                $pm->finish; # Terminates the child process
            }
            $pm->wait_all_children;
            return;
        }

        processdir($input_dir, $logfile, \&clean_value);

        sub clean_value {
            $logfile->print("-- Reading '$input_file' file\n");
        }
        Here all the processes execute and write together, and the input file names overlap in $logfile.
        How do I avoid the file names overlapping in $logfile?
        In which portion should I use locks, or how else can I avoid the overlapping file names?
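        To answer "in which portion should I use locks": if all the children must share one $logfile, the lock belongs around each individual write, so one process's record is fully written before another's can start. A sketch using flock; the file name shared.log and the article name are invented for illustration.

```perl
use strict;
use warnings;
use Fcntl qw(:flock SEEK_END);

# Take an exclusive lock around each append to the shared log so
# records from different children cannot interleave.
sub log_line {
    my ($path, $msg) = @_;
    open my $fh, '>>', $path or die "open $path: $!";
    flock($fh, LOCK_EX)      or die "flock $path: $!";
    seek($fh, 0, SEEK_END);          # another process may have appended meanwhile
    print {$fh} "$msg\n";
    close $fh;                       # closing the handle releases the lock
}

log_line('shared.log', "-- Reading 'article42.xml' file");
```

        In the code above, that means calling something like log_line() inside clean_value() instead of printing to a handle the children share, or - per the earlier reply - sending the line to a single logger process and not locking at all.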