Hi,
I'm currently building a script to monitor my sites and report back to me with their availability, HTTP status codes and, importantly here, the PAGE LOAD TIME.
I'm currently doing this with curl requests, and this works for getting statuses and finding out many other things about the pages/sites. BUT it will only give me the time it took to grab the HTML of the page.
Is there a way to make it download more like a browser would, so it records the total time it takes a user to download the HTML, CSS, JavaScript, images, etc.?
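For reference, my current check looks roughly like this (simplified; the URL and field names are just placeholders). It returns the status code and the time for the initial HTML only, which is exactly the limitation I'm hitting:

```php
<?php
// Fetch one URL and report status + time for the HTML document only.
// This does NOT include CSS/JS/images, which is the problem.
function check_page(string $url): array {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_TIMEOUT        => 30,
    ]);
    $body = curl_exec($ch);
    $info = curl_getinfo($ch);
    curl_close($ch);
    return [
        'status'     => $info['http_code'],   // e.g. 200, 404, 0 on failure
        'total_time' => $info['total_time'],  // seconds, HTML only
        'body'       => $body,
    ];
}
```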
I have thought about using curl to download the HTML first, then using the multi functions (curl_multi_*) to download each of the components that make up the page, but this seems long-winded.
I wondered if there was a better way than curl, or maybe a pre-built set of classes ready for this, but I have searched Google with no luck.
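A rough sketch of that curl_multi_* idea, in case it helps show why it feels long-winded (assuming DOMDocument for parsing; it only covers img/script/stylesheet tags and does naive URL resolution, whereas a real browser also fetches fonts, iframes, background images, etc.):

```php
<?php
// Pull asset URLs out of a page's HTML. Minimal coverage: images,
// external scripts, and stylesheets only.
function extract_asset_urls(string $html, string $base): array {
    $doc = new DOMDocument();
    @$doc->loadHTML($html);  // suppress warnings on sloppy real-world markup
    $urls = [];
    foreach ($doc->getElementsByTagName('img') as $img) {
        $urls[] = $img->getAttribute('src');
    }
    foreach ($doc->getElementsByTagName('script') as $s) {
        if ($s->getAttribute('src') !== '') $urls[] = $s->getAttribute('src');
    }
    foreach ($doc->getElementsByTagName('link') as $l) {
        if ($l->getAttribute('rel') === 'stylesheet') $urls[] = $l->getAttribute('href');
    }
    // Naive relative-URL resolution: anything not starting with http(s)://
    // is glued onto the base. A real implementation needs proper resolving.
    return array_map(function ($u) use ($base) {
        return preg_match('#^https?://#', $u) ? $u : rtrim($base, '/') . '/' . ltrim($u, '/');
    }, array_filter($urls));
}

// Fetch all asset URLs in parallel and return the wall-clock time taken,
// as a rough stand-in for the "rest of the page load".
function timed_multi_fetch(array $urls): float {
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $u) {
        $ch = curl_init($u);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }
    $start = microtime(true);
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);
    $elapsed = microtime(true) - $start;
    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $elapsed;
}
```

Even then it wouldn't match a browser: no caching, no render-blocking behaviour, no connection limits per host.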
I know of some online tools that request pages roughly the way I want to, e.g. http://tools.pingdom.com/ and http://loadimpact.com/pageanalyzer.php, but I don't need all the info they generate, just the end result: the page load time that the user would experience.
I'm not sure if this is possible in PHP; I may need to switch to Python or another language to do this part of the script.
If anyone else has used or built a script capable of this, please let me know!
Cheers!
EDIT:
I've just been looking at using COM, but I would need to invest in a Windows server.
I found an article on using the COM class to screen-grab sites, so I assume you could use it to time page loads as well.
using...
$browser = new COM("InternetExplorer.Application");
etc...
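Something like this is what I had in mind (untested on my end; Windows-only, the ProgID is "InternetExplorer.Application" with no space, and it needs the php_com_dotnet extension enabled in php.ini). I've guarded it so it just returns null where COM isn't available:

```php
<?php
// Windows-only sketch: drive Internet Explorer via COM and time how long
// it takes until the browser's Busy flag clears, i.e. until the page and
// its assets have finished loading. Returns null where COM is unavailable.
function time_page_load_com(string $url): ?float {
    if (!class_exists('COM')) {
        return null;  // php_com_dotnet not loaded (e.g. Linux, or disabled)
    }
    $browser = new COM('InternetExplorer.Application');
    $browser->Visible = false;
    $start = microtime(true);
    $browser->Navigate($url);
    while ($browser->Busy) {
        usleep(100000);  // poll every 100 ms while IE is still loading
    }
    $elapsed = microtime(true) - $start;
    $browser->Quit();
    return $elapsed;
}
```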
I'm not sure if I can test this on my Windows PC, though. I have XAMPP, but when I run the script I get errors; I assume I need to install or enable some things. Has anyone here used the COM class? Can you give me any advice with it? Or maybe anyone using .NET has ideas I could migrate to PHP, or I could use PHP to execute a .NET or even C program in order to get the result time?
Any ideas?