This program is intended to demonstrate how long it takes the CPU to
perform an empty loop many times over. The line

    for (j = 1; j <= k; j++);

does nothing but start j at 1 and increment it over and over until it
gets to the value of k. The printf near the bottom is intended to
output the number of increments (loops) it did, and the total time
elapsed. This test is run 500 times, and each time the value of k is
increased 20% over the prior test.
There are, however, several things wrong with the program.
Most importantly, it often assigns an integer value to a
floating-point variable without explicitly saying that it's doing so.
That mismatch between integer and floating-point types is what causes
the zeros you are seeing. Whenever you want to store or display an
integer value, an integer-type variable should be used.
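As a purely illustrative sketch (the variable name here is made up,
not taken from your program), this is the sort of mismatch I mean:

#include <stdio.h>

int main(void)
{
    double loops;
    loops = 10000;           /* integer value silently converted to a double */
    printf("%d\n", loops);   /* wrong: %d expects an int but loops is a
                                double; this is undefined behavior and on
                                many compilers prints 0                     */
    printf("%.0f\n", loops); /* right: %f matches a double argument         */
    return 0;
}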
In Microsoft C++, CLK_TCK and CLOCKS_PER_SEC are set to the same
value, which is 1000, an integer value. Attempting to print an
integer value with a floating-point conversion specifier in printf is
a bad idea: you're not going to see the proper value.
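Here is a minimal sketch of that particular mistake (assuming a
compiler where CLOCKS_PER_SEC is the integer 1000):

#include <stdio.h>
#include <time.h>

int main(void)
{
    printf("%f\n", CLOCKS_PER_SEC);         /* wrong: %f expects a double,
                                               but CLOCKS_PER_SEC is an
                                               integer here; undefined
                                               behavior, often prints
                                               0.000000                   */
    printf("%ld\n", (long)CLOCKS_PER_SEC);  /* right: cast it and print it
                                               as an integer              */
    return 0;
}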
There are other problems here and there. I rewrote it as follows:
#include <stdio.h>
#include <time.h>

int main(void)
{
    long i, j, k, t1, t2, t;
    double time;

    printf("tests the clock by running dummy loops of various lengths\n");
    k = 10000;
    printf("\n   i          k     t1     t2      t   time\n\n");

    for (i = 1; i <= 50; i++) {
        k = (long)(k * 1.2);            /* 20% more iterations each test  */
        t1 = clock();                   /* timer reading before the loop  */
        for (j = 1; j <= k; j++)
            ;                           /* the empty loop being timed     */
        t2 = clock();                   /* timer reading after the loop   */
        t = t2 - t1;                    /* elapsed time in timer ticks    */
        time = (double)t / CLK_TCK;     /* ticks converted to seconds     */
        printf("%4ld %10ld %6ld %6ld %6ld %2.4f\n", i, k, t1, t2, t, time);
        if (time > 12.0)                /* stop once a test runs too long */
            break;
    }

    printf("\n CLK_TCK=%ld, CLOCKS_PER_SEC=%ld\n",
           (long)CLK_TCK, (long)CLOCKS_PER_SEC);
    return 0;
}
Now, a sample line of output looks like this:
47 52659768 2078 2468 390 0.3900
Meaning:
47 = which test # this is
52659768 = how many loops it will try this time
2078 = timer reading at start of loops
2468 = timer reading at end of loops
390 = how much time in timer ticks it took (2468 - 2078 = 390)
0.39 = how much time in seconds it took
Since CLOCKS_PER_SEC is 1000, every 1000 timer ticks is one second,
which means 390 timer ticks is 0.39 seconds.
All this means my computer ran the empty loop 52659768 times in 0.39
seconds, or roughly 135 million iterations per second.
Please ask for clarification on anything that is not totally clear to
you.
Cheers,
nauster-ga

Clarification of Answer by nauster-ga on 06 Oct 2002 14:04 PDT
In Microsoft's product, clock() returns the amount of time, measured
in units of 1/CLOCKS_PER_SEC seconds, that has elapsed since the start
of the program.
Since CLOCKS_PER_SEC is 1000, that means it is returning the number of
milliseconds since the program started.
It is not precise to the millisecond, however. It provides only an
approximation. For example, when I run it continuously and have it
report its results, it jumps from 0 to 15 to 31. As long as you keep
its imprecision in mind, it can be quite useful for measuring elapsed
time.
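If you want to see that granularity for yourself, here is a minimal
sketch of my own (not part of the program above) that polls clock()
and prints each new value it returns:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t prev = clock();
    int changes = 0;

    /* Report each time the value of clock() changes; on the compiler
       discussed above it tends to jump in steps of roughly 15 ticks
       rather than advancing one tick at a time. */
    while (changes < 10) {
        clock_t now = clock();
        if (now != prev) {
            printf("%ld\n", (long)now);
            prev = now;
            changes++;
        }
    }
    return 0;
}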
Other compilers have slightly different versions of the clock()
function, but in all cases the intent is to give the programmer access
to the amount of real-world time that has elapsed while the program is
running.
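Whichever compiler you use, the portable pattern is the same as in the
program above: take two readings and divide the difference by
CLOCKS_PER_SEC. A minimal sketch:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t start = clock();

    /* ... whatever work you want to time goes here ... */

    clock_t end = clock();
    double seconds = (double)(end - start) / CLOCKS_PER_SEC;

    printf("elapsed: %.4f seconds\n", seconds);
    return 0;
}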