http://qs321.pair.com?node_id=758064


in reply to monitor text files

The various examples in the synopsis of File::Monitor are not all consistent with each other, and you have copied some of the inconsistencies into your program. In particular, you create a $monitor object but then try to call the scan method on $object, which doesn't exist.

Here is a working example of a monitor with a callback.

    use strict;
    use warnings;
    use File::Monitor;

    my $monitor = File::Monitor->new();

    # Register a per-file callback; it fires when scan() sees a change.
    $monitor->watch( '/tmp/otherfile.txt', sub {
        my ( $name, $event, $change ) = @_;
        print "file has been changed\n";
    } );

    # The first scan records a baseline; later scans report deltas.
    while (1) {
        for my $delta ( $monitor->scan ) {
            print $delta->name, " has changed\n";
        }
        sleep 10;
    }

But this depends on polling for changes, which is not ideal.

Can you have the application that creates the data initiate the further processing itself? That would minimize delay and eliminate the waste of polling.
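If the files live on a Linux box, another option is to let the kernel deliver change events instead of polling. This is only a sketch, assuming the Linux::Inotify2 CPAN module is installed and the watched path exists:

```perl
use strict;
use warnings;
use Linux::Inotify2;    # CPAN module; Linux only (assumption)

my $inotify = Linux::Inotify2->new
    or die "unable to create inotify object: $!";

# Ask the kernel to notify us when the file is modified -- no sleep loop.
$inotify->watch( '/tmp/otherfile.txt', IN_MODIFY, sub {
    my $event = shift;
    print $event->fullname, " has changed\n";
} );

# poll() blocks until an event arrives, then runs the callbacks.
1 while $inotify->poll;
```

The trade-off is portability: File::Monitor works anywhere, while inotify is Linux-specific.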

Re^2: monitor text files
by grashoper (Monk) on Apr 16, 2009 at 23:13 UTC
    Actually, I do want to poll for changes; primarily I want to scan periodically to ensure that mtime is still changing. I can have the scan run once the files are copied, but then it would only happen every few hours. It should not cause too much extra processor or disk usage, right? All that is being written to each file currently is a single line of data, but there are 50+ files, and the data consists of timing tests run against an application which depends in turn on IE and ActiveX, so I can't easily code my own testing app for it. Eventually I will probably do something with .NET and Fiddler to create a more custom solution, but for now I just need to make sure it's running when it should. I will need yet another script to validate the data files, but I plan on doing that after they have been copied over to the destination directories. Thank you for the example.
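    Since the goal is just to confirm that each file's mtime keeps advancing, a plain stat-based check may be all that's needed. This is a sketch using only core Perl; the directory, file glob, and staleness threshold are placeholders to adapt to your setup:

```perl
use strict;
use warnings;

# Flag any data file whose mtime has stopped advancing.
# $dir and $max_age are assumptions -- adjust for your environment.
my $dir     = '/data/tests';
my $max_age = 15 * 60;    # seconds; flag files idle longer than 15 minutes

sub stale_files {
    my ( $dir, $max_age ) = @_;
    my @stale;
    for my $file ( glob "$dir/*.txt" ) {
        my $mtime = ( stat $file )[9];    # element 9 of stat() is mtime
        next unless defined $mtime;
        push @stale, $file if time - $mtime > $max_age;
    }
    return @stale;
}

print "$_ looks stale\n" for stale_files( $dir, $max_age );
```

    Running that from cron every few minutes avoids keeping a long-lived process around at all, which may suit a 50+ file setup better than an in-process polling loop.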