Monks...
My script worked when I hard-coded the URL into it. Now that I want the URLs read in from a file, one after another, it is erroring out. I print each URL and its response code on the screen: the first URL gives me a 200 (which is what I want), then the script moves on to the second, hangs for about a minute, and starts returning 400s. They don't stop until I hit Ctrl+C. I added a sleep to slow things down, which helps me stop it before it gets really out of hand, but it seems the first URL's data isn't getting flushed after it runs through my while loop. Can anyone help? Below I will post my code, my input list, and my results.txt.
Thnx.
Ray
#!D:/perl/bin/perl -w
use LWP::UserAgent;
use HTTP::Request;
use HTML::TableExtract;

$OutFile = "results.txt";
open(OUT, ">>$OutFile") || die "Can't create $OutFile: $!";
open(IN, "ipList1.txt")  || die "Can't open ipList1.txt: $!";

OUTER: while (<IN>) {
    $url = $_;
    print $url . "\n";
    $ua = new LWP::UserAgent;
    $request = new HTTP::Request('GET', $url);
    $request->authorization_basic('login', 'password');
    $ua->timeout(10);
    $response = $ua->request($request);
    $responsecode = $response->code();
    print $responsecode . "\n";
    INNER: if ($responsecode != 200) {
        sleep 1;
        redo OUTER;
    } else {
        # login successful, let's get the html code into a variable
        @ARRAY_OF_LINES = (split "\n", $ua->request($request)->as_string);
        foreach $line (@ARRAY_OF_LINES) {
            $html_code .= $line . "\n";
        }
    }
    $te = new HTML::TableExtract( depth => 0, count => 1 );
    $te->parse($html_code);
    foreach $ts ($te->table_states) {
        #print "Table (", join(',', $ts->coords), "):\n";
        foreach $row ($ts->rows) {
            $td = join(" ", @$row);
            $td =~ s/\n/ /g;
            if ( $td =~ /^\s*(\d)\s+(\w+)\s+(\d)\s+(\w+)/ ) {
                $outlet1 = $1;
                $host1   = $2;
                $outlet2 = $3;
                $host2   = $4;
                print OUT "Outlet: $outlet1, Host: $host1\n";
                print OUT "Outlet: $outlet2, Host: $host2\n\n";
                sleep 1;
            }
        }
    }
}
Here is a sample of my input list:
http://192.168.10.20/pdumaina
http://192.168.10.21/pdumaina
And here is my results.txt:
Outlet: 1, Host: KanaApp1
Outlet: 5, Host: Kyle
Outlet: 2, Host: KanaApp2
Outlet: 6, Host: Cres
Outlet: 3, Host: Kenny
Outlet: 7, Host: Ebb2
Outlet: 4, Host: Eric
Outlet: 8, Host: Flood2
Outlet: 1, Host: KanaApp1
Outlet: 5, Host: Kyle
Outlet: 2, Host: KanaApp2
Outlet: 6, Host: Cres
Outlet: 3, Host: Kenny
Outlet: 7, Host: Ebb2
Outlet: 4, Host: Eric
Outlet: 8, Host: Flood2
Outlet: 1, Host: KanaApp1
Outlet: 5, Host: Kyle
Outlet: 2, Host: KanaApp2
Outlet: 6, Host: Cres
Outlet: 3, Host: Kenny
Outlet: 7, Host: Ebb2
Outlet: 4, Host: Eric
Outlet: 8, Host: Flood2
Outlet: 1, Host: KanaApp1
Outlet: 5, Host: Kyle
Outlet: 2, Host: KanaApp2
Outlet: 6, Host: Cres
Outlet: 3, Host: Kenny
Outlet: 7, Host: Ebb2
Outlet: 4, Host: Eric
Outlet: 8, Host: Flood2
Outlet: 1, Host: KanaApp1
Outlet: 5, Host: Kyle
Outlet: 2, Host: KanaApp2
Outlet: 6, Host: Cres
Outlet: 3, Host: Kenny
Outlet: 7, Host: Ebb2
Outlet: 4, Host: Eric
Outlet: 8, Host: Flood2
Outlet: 1, Host: KanaApp1
Outlet: 5, Host: Kyle
Outlet: 2, Host: KanaApp2
Outlet: 6, Host: Cres
Outlet: 3, Host: Kenny
Outlet: 7, Host: Ebb2
Outlet: 4, Host: Eric
Outlet: 8, Host: Flood2
As you can see, it never gets any info from the second URL; it just keeps going over the first one. Please help.
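A minimal sketch of the two likely culprits, assuming the loop is otherwise as posted: each line read from <IN> keeps its trailing "\n" (so the string handed to HTTP::Request is not a clean URL), and $html_code is only ever appended to with .=, never emptied, so every pass re-parses the HTML accumulated from the first URL. The snippet below demonstrates both fixes using in-memory stand-in data instead of a live LWP request; the URL and page strings are made up for illustration:

```perl
#!/usr/bin/perl -w
use strict;

# 1) chomp the line before using it as a URL; readline keeps the "\n".
my $url = "http://192.168.10.20/pdumaina\n";   # as it comes out of ipList1.txt
chomp $url;                                    # now safe to pass to HTTP::Request

# 2) Reset the buffer on every pass. Because $html_code was only ever
#    appended to, iteration two re-parsed iteration one's HTML as well.
my @pages = ("<html>page one</html>", "<html>page two</html>");  # stand-ins
my @parsed;
for my $page (@pages) {
    my $html_code = "";          # fresh buffer each time through the loop
    $html_code .= $page . "\n";
    push @parsed, $html_code;    # in the real script: $te->parse($html_code)
}
```

Note too that redo OUTER re-runs the loop body with the same input line, so a URL that never returns 200 is retried forever, which matches the endless 400s; next (perhaps after a bounded retry count) would move on to the following URL instead.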