Please take a look at this code:
#include <stdio.h>

int main(void)
{
    int p, n, r;
    float si;

    scanf("%d %d %d", &p, &n, &r);
    si = p * n * r / 100;
    printf("SI = %f", si);
    return 0;
}
With the inputs
100
100
100
the output of this program is
SI = 169.000000
And also this second program:
#include <stdio.h>

int main(void)
{
    int p, n;
    float r, si;

    printf("Enter values of p, n, r: ");
    scanf("%d %d %f", &p, &n, &r);
    si = p * n * r / 100;
    printf("%f", si);
    return 0;
}
With the same inputs
100
100
100
the output of this program is
10000.000000
How does the compiler interpret these two programs?
Why is the result wrong for the first program but not for the second?
Can anyone help me understand what the compiler is doing here?