Here is some code:
#include <stdio.h>

int main(void)
{
    int a;
    a = f(10, 3.14);
    printf("%d", a);
}

int f(int aa, float bb)
{
    return (aa + bb);
}
If I run this code it produces a garbage value, but if I declare a prototype for f before main, it produces the correct output, 13 (see the sketch below). Why is this? Without a prototype the compiler assumes f returns int, which it actually does, so in effect there shouldn't be any problem.
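For reference, this is a minimal sketch of what I mean by the prototyped version; assuming nothing else is changed, this one prints 13 for me:

#include <stdio.h>

int f(int aa, float bb);  /* prototype now visible at the call site */

int main(void)
{
    int a;
    a = f(10, 3.14);  /* 3.14 is converted to float per the prototype */
    printf("%d", a);  /* prints 13 */
}

int f(int aa, float bb)
{
    return (aa + bb);  /* 10 + 3.14f = 13.14f, truncated to int 13 */
}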