reinaldo.gomes has asked for the wisdom of the Perl Monks concerning the following question:
I have a multi-threaded script which does the following:
1) One boss thread searches through a folder structure on an external server. For each file it finds, it adds the file's path/name to a thread queue. If the path is already in the queue, or is being processed by a worker thread, the enqueue is skipped.
2) A dozen worker threads dequeue paths from that queue, process the files, and delete them from disk.
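The boss/worker setup above can be sketched with Thread::Queue. This is a minimal, self-contained illustration, not the poster's actual code: the file names, the worker count, and the shared @done array (used here only to show what got processed) are all stand-ins.

```perl
use strict;
use warnings;
use threads;
use threads::shared;
use Thread::Queue;

my $queue = Thread::Queue->new();
my @done :shared;    # record of processed paths, for illustration only

# Worker: block on dequeue until the queue is ended, then exit.
sub worker {
    while (defined(my $path = $queue->dequeue())) {
        # Real code would process the file and then remove it:
        #   process_file($path);
        #   unlink $path or warn "unlink $path: $!";
        { lock(@done); push @done, $path; }
    }
}

my @workers = map { threads->create(\&worker) } 1 .. 12;

# Boss: enqueue each discovered path once; %in_flight skips duplicates.
my %in_flight;
for my $path (qw(a.dat b.dat a.dat c.dat)) {   # stand-in for the folder scan
    next if $in_flight{$path}++;
    $queue->enqueue($path);
}

$queue->end();                 # no more items; dequeue() now returns undef
$_->join() for @workers;

printf "processed %d unique files\n", scalar @done;
```

Note that %in_flight only prevents duplicate enqueues within one process; it does nothing across servers, which is exactly the gap described below.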
It runs on a single physical server, and everything works fine.
Now I want to add a second server that works concurrently with the first, searching the same folder structure for files to enqueue and process. I need a way to make the two servers aware of each other's work, so that they don't process the same files. The queue is small, ranging from 20 to 100 items, but very dynamic: it changes many times per second.
Should I simply write to/read from a regular file to keep them synced on the current item list? Any ideas?
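One common alternative to a shared list file (and the approach hinted at by the "(rename)" reply title below) is to have each server *claim* a file by renaming it before processing: rename() within one filesystem is atomic, so if both servers race for the same file, exactly one rename succeeds and the loser simply moves on. A minimal sketch, assuming a shared filesystem; claim_file and the directory names are hypothetical:

```perl
use strict;
use warnings;
use File::Basename qw(basename);
use File::Path qw(make_path);

# Try to claim a file by atomically renaming it into this server's
# private work directory. Returns the new path on success, or undef
# if another server already claimed (renamed away) the file.
sub claim_file {
    my ($path, $work_dir) = @_;     # $work_dir is per-server, e.g. "claimed.serverA"
    make_path($work_dir) unless -d $work_dir;
    my $claimed = "$work_dir/" . basename($path);
    return rename($path, $claimed) ? $claimed : undef;
}
```

With this scheme the boss thread on each server renames first and enqueues only the files it wins, so no cross-server item list needs to be kept in sync at all. The caveat: the work directory must live on the same filesystem as the scanned folders, since rename() is not atomic (and usually fails) across filesystem boundaries.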
Replies are listed 'Best First'.
Re: Sync item list between perl scripts, across servers
by Corion (Patriarch) on Nov 14, 2016 at 08:53 UTC
Re: Sync item list between perl scripts, across servers
by GrandFather (Saint) on Nov 14, 2016 at 09:06 UTC
Re: Sync item list between perl scripts, across servers (rename)
by tye (Sage) on Nov 14, 2016 at 19:58 UTC
Re: Sync item list between perl scripts, across servers
by reinaldo.gomes (Beadle) on Nov 14, 2016 at 17:19 UTC
by reinaldo.gomes (Beadle) on Sep 20, 2018 at 14:36 UTC