If you are saving info on each page you find to a file, couldn't you just check whether the file already exists before writing to it? I didn't really understand your code, but you could save each URL in a hash, then check whether the URL already exists in the hash before reading the page again. The hash would only get as big as the number of sites you spider.
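A minimal sketch of the idea, assuming the spider gathers links into @urls and has some fetch routine (fetch_page here is a hypothetical name, not from the original code):

    my %seen;    # url => 1 for every page already visited
    for my $url (@urls) {
        next if $seen{$url};      # skip pages we've already spidered
        $seen{$url} = 1;          # mark it before fetching
        my $content = fetch_page($url);
        # ... extract new links, save $content to its file, etc.
    }

The same trick works for the file check: next if -e $filename; before writing, though the hash saves you the fetch as well as the write.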
___________
Eric Hodges

In reply to Re: Re: Re: Cutting Out Previously Visited Web Pages in A Web Spider by eric256