in reply to making a loop script with a remote URL call faster
I'm with LanX: I don't really see how it would make sense to iterate the loop multiple times if the price doesn't change. But if you really want a tight loop, everyone loves threads:
    #!/usr/bin/perl -w
    use strict;
    use warnings;
    use threads;
    use threads::shared;

    my $PRICING :shared = 0;
    my $RUNNING :shared = 1;

    sub price_thread {
        while ($RUNNING) {
            # Local variable to limit lock time.
            # Catch errors in eval
            my $pricing = eval {
                # Net::Curl to remote URL... request here.
                my $val = 1000 * rand();    # [from net::curl]
                # ... more?
                # last line goes to $pricing; $pricing gets undef on exception
                $val
            };
            do {
                lock($PRICING);
                $PRICING = $pricing;
            };
            # Optional pause between requests:
            sleep 60;
        }
    }

    sub main_loop {
        # Local variable to limit lock time and prevent changes mid-computation
        my $pricing = 0;
        while ($RUNNING) {
            do {
                lock($PRICING);
                $pricing = $PRICING;
            };
            # Optional - I don't know what you are doing with pricing; you might
            # want to use 0 or undef to signal a request error (i.e., out-of-date
            # data).
            if (!defined($pricing) or $pricing == 0) {
                sleep 1;    # avoid a busy-wait until the first price arrives
                next;
            }
            # Do something with $pricing;
            print "$pricing\n";
            sleep 1;    # Work simulation
            # return if QUIT CONDITION;
        }
    }

    my $thr = threads->create(\&price_thread);
    eval { main_loop() };
    $RUNNING = 0;
    $thr->join();
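The key mechanism above is the `:shared` scalar guarded by `lock()`: the worker writes under the lock, and readers copy the value out under the lock so they never hold it longer than the assignment. Here is a minimal, self-contained sketch of just that pattern (deterministic because it joins the thread before reading; the variable names are illustrative, not from the script above):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use threads::shared;

# One scalar shared between threads; all access goes through lock().
my $PRICE :shared = 0;

my $thr = threads->create(sub {
    lock($PRICE);      # like price_thread(): write under the lock
    $PRICE = 42;
});
$thr->join();          # after join(), the worker's update is visible

my $snapshot;
{
    lock($PRICE);      # like main_loop(): copy out, then release the lock
    $snapshot = $PRICE;
}
print "price: $snapshot\n";    # prints "price: 42"
```

Copying into a lexical (`$snapshot`) before doing any real work keeps the critical section as small as possible, which is the same reason the script above uses a local `$pricing` in both threads.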
Good Day,
Dean
Replies are listed 'Best First'.
Re^2: making a loop script with a remote URL call faster
by marioroy (Prior) on Jul 08, 2022 at 02:05 UTC
In Section
Seekers of Perl Wisdom