I'm developing a web program that needs to check a whole bunch of potentially faulty HTTP links. I have found a way of doing this, but it relies on try-catch — not in itself horribly bad, but the time it takes to check a link is extremely variable. It ranges from about 20k ns to 1.8M ns for links that work. For links that are not working properly (e.g. a link whose host (IP) exists but isn't actually serving HTTP), the time it takes the code to "figure out" that an exception should be thrown is upwards of 30 seconds! As you might tell, this is not very practical.
private boolean tryLink(String linkToTest) {
    // Open a connection to the link and return true if the link works, false otherwise
    long end, start = System.nanoTime();
    URL myurl;
    try {
        myurl = new URL(linkToTest);
        HttpURLConnection conn = (HttpURLConnection) myurl.openConnection();
        conn.setRequestMethod("GET");
        conn.connect(); // this is the call that actually contacts the server
    } catch (Exception e) {
        end = System.nanoTime();
        if (end - start > 50000) {
            System.out.println("TryLink time: " + (end - start) + ". Link not ok");
            System.out.println("Slow link was: " + linkToTest);
        }
        return false;
    }
    end = System.nanoTime();
    if (end - start > 50000) {
        System.out.println("TryLink time: " + (end - start) + ". Link OK");
        System.out.println("Slow link was: " + linkToTest);
    }
    return true;
}
Is there a more elegant way of testing an HTTP link? Any and all help is much appreciated!
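For context, one direction I've been looking at (a sketch, not something I've settled on): bound the connect and read phases with explicit timeouts so a dead host fails after a couple of seconds instead of the platform default, and check the response code rather than relying on an exception alone. The 2000 ms values and the 2xx/3xx check below are illustrative assumptions, not tuned numbers.

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class LinkChecker {
    // Sketch: returns true only if the server answers with a 2xx/3xx status
    // within the configured timeouts; everything else (malformed URL, refused
    // connection, timeout) comes back false quickly instead of hanging ~30 s.
    static boolean tryLink(String linkToTest) {
        try {
            URL url = new URL(linkToTest);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(2000); // ms allowed to establish the TCP connection
            conn.setReadTimeout(2000);    // ms allowed to wait for response data
            conn.setRequestMethod("HEAD"); // headers only; avoids downloading the body
            int code = conn.getResponseCode(); // performs the actual request
            conn.disconnect();
            return code >= 200 && code < 400; // treat 2xx/3xx as "working"
        } catch (Exception e) {
            return false; // malformed URL, timeout, connection refused, ...
        }
    }
}
```

With this shape the worst case is bounded by the two timeouts rather than the OS-level TCP timeout, though whether HEAD is acceptable depends on the servers being checked (some misbehave on HEAD and need GET instead).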