I have the following Perl script, which works nicely. It takes a list of URLs from a file, fetches each one, and writes out the meta keywords for that page. The problem is that it sometimes hangs when it hits a site that does not respond. Is there a way to make it give up on a URL after a timeout or an error response (or move on after a success response) and continue with the next line in the file? Here is the current script, which works but hangs when it tries to grab meta keywords from a bad site:
#!/usr/bin/perl
use strict;
use warnings;
#print "Content-type: text/html\n\n";
use LWP::Simple;
use HTML::HeadParser;

open(my $outfile, '>', 'outfile.txt') or die "Cannot open outfile.txt: $!";
open(my $urls, '<', 'url3.txt') or die "Cannot open url3.txt: $!";
foreach my $line (<$urls>) {
    chomp($line);
    my $html = get($line);        # page content; undef if the fetch failed
    next unless defined $html;    # skip URLs that returned no content
    my $head = HTML::HeadParser->new;
    $head->parse($html);
    my $desc = $head->header('X-Meta-Description');
    print $outfile ($desc // '') . ".";
}
close($urls);
close($outfile);
exit;
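For reference, here is a sketch of the kind of thing I'm after, based on switching from `LWP::Simple` to `LWP::UserAgent`, which lets you set a per-request timeout and check the response status before parsing. The `fetch_meta_keywords` helper name and the 10-second timeout are just my placeholders, not tested against the real URL list:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::HeadParser;

# Hypothetical helper: fetch a URL and return its meta keywords,
# or undef on timeout / error, so the caller can skip bad sites.
sub fetch_meta_keywords {
    my ($url) = @_;
    my $ua = LWP::UserAgent->new(
        timeout => 10,    # give up after 10 seconds (assumption: tune to taste)
    );
    my $res = $ua->get($url);
    return undef unless $res->is_success;   # timeout or HTTP error: skip
    my $head = HTML::HeadParser->new;
    $head->parse($res->decoded_content);
    return $head->header('X-Meta-Keywords');
}
```

In the loop you would then do `my $kw = fetch_meta_keywords($line); next unless defined $kw;` so a dead site just falls through to the next line instead of hanging.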