Hello again, people. I've got some questions about timing that I hope someone can answer.
I'm trying to use the getrusage() or times() functions, or the time command, to time my programs. I've got a piece of code below that uses getrusage(), and my first question is a bit of a daft one: can anyone please tell me what unit t is given in? I got 1086556160 the last time I ran the program. Is this millionths of a second, or what?
#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>
#include <sys/resource.h>

double getcputime(void);

int main(void)
{
    int i, n, a;
    double t;

    n = 13;
    printf("The 7-times Table\n");
    for (i = 1; i < n; i++)
    {
        a = i * 7;
        printf("%d\t%d\n", i, a);
    }
    printf("\n");

    t = getcputime();
    printf("The total time taken by the system is: %.0f\n", t);
    return 0;
}

double getcputime(void)
{
    double t;
    struct timeval tim;
    struct rusage ru;

    getrusage(RUSAGE_SELF, &ru);
    tim = ru.ru_stime;   /* system CPU time only */
    /* tv_sec is in seconds, tv_usec in microseconds, so t ends up in microseconds */
    t = (double)tim.tv_sec * 1000000.0 + (double)tim.tv_usec;
    return t;
}
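For comparison, this is roughly how I understand times() would be used. I haven't tested this properly, so treat it as a sketch; I'm assuming the tick rate comes from sysconf(_SC_CLK_TCK) and that the loop is just some arbitrary work to measure:

#include <stdio.h>
#include <unistd.h>
#include <sys/times.h>

int main(void)
{
    struct tms start, end;
    long ticks_per_sec = sysconf(_SC_CLK_TCK);   /* clock ticks per second */
    clock_t t0, t1;
    volatile double x = 0.0;
    int i;

    t0 = times(&start);
    for (i = 0; i < 10000000; i++)               /* some work to time */
        x += i * 0.5;
    t1 = times(&end);

    /* times() reports user/system CPU time in clock ticks; its return
       value is elapsed real time in ticks since some arbitrary point */
    printf("user CPU:   %.3f s\n",
           (double)(end.tms_utime - start.tms_utime) / ticks_per_sec);
    printf("system CPU: %.3f s\n",
           (double)(end.tms_stime - start.tms_stime) / ticks_per_sec);
    printf("elapsed:    %.3f s\n", (double)(t1 - t0) / ticks_per_sec);
    return 0;
}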
My second question is more general (for now): does anyone know why very few textbooks talk about timing programs? And even if you wanted to use the system clock, surely that is only accurate to whole seconds, and therefore not precise enough to time a relatively fast program?
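On that second point, my understanding (again, just a sketch I haven't verified carefully) is that gettimeofday() reads the wall clock at microsecond resolution, so something like this should give a sub-second elapsed time even for a fairly quick program; the loop here is only a stand-in for the work being measured:

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval start, end;
    double elapsed;
    volatile double x = 0.0;
    int i;

    gettimeofday(&start, NULL);
    for (i = 0; i < 10000000; i++)     /* some work to time */
        x += i * 0.5;
    gettimeofday(&end, NULL);

    /* tv_sec is seconds, tv_usec is microseconds */
    elapsed = (double)(end.tv_sec - start.tv_sec)
            + (double)(end.tv_usec - start.tv_usec) / 1000000.0;
    printf("elapsed wall-clock time: %f seconds\n", elapsed);
    return 0;
}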