Suppose I have defined a constant using #define MAX 5 and I am using it in a function called find() as shown below:

void find(void)
{
    if (MAX == 5)
    {
        // do this
    }
    else
    {
        // do that
    }
}

**If I am calling this function hundreds of times**, then which way is better to define the constant? We can define it in two ways:
1. #define MAX 5
2. const int MAX = 5;
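
For concreteness, here is the second form in context; a minimal sketch, assuming the same find() body as above (the void signature is my addition, since the original snippet omits one):

const int MAX = 5;   // option 2: a typed constant instead of a macro

void find(void)
{
    if (MAX == 5)
    {
        // do this
    }
    else
    {
        // do that
    }
}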

If you use the MAX macro in multiple places, each occurrence expands to its own literal 5 in the code.
i.e. MAX + MAX - MAX is 3 substitutions of the integer 5, so with 4-byte ints that is 3*4 = 12 bytes.
If you declare a constant, it's declared and initialized only once in memory, so it takes up only 4 bytes.

And when you're working with a debugger, it can tell you the value of a const variable, but not the value of a macro.
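
A minimal sketch of the substitution described above (MAX_MACRO, MAX_CONST and demo() are made-up names for this example):

#define MAX_MACRO 5         // expands to the literal 5 at every use site
const int MAX_CONST = 5;    // one named, typed object a debugger can inspect

int demo(void)
{
    // After preprocessing, this line is literally: return 5 + 5 - 5;
    return MAX_MACRO + MAX_MACRO - MAX_MACRO;
}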

Thanks for your reply.
But here I have defined MAX only once, and it will get substituted by 5 before compilation begins. So which one is better?

Neither. If you're comparing a constant with a constant, then you should strip away the test entirely, because it doesn't depend on any runtime value, or replace it with conditional inclusion:

#if MAX == 5
    /* Do this */
#else
    /* Do that */
#endif

Otherwise you're just saying "always take one branch of the if statement, but I still want the overhead of the test". It's silly.
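
One caveat: conditional inclusion only works with the #define form, because the preprocessor runs before the compiler and knows nothing about const int variables (an identifier that isn't a macro evaluates to 0 inside #if). A minimal sketch of the macro version inside the function, assuming the void signature from above:

#define MAX 5

void find(void)
{
#if MAX == 5
    // do this - compiled in, no runtime test at all
#else
    // do that - never even compiled while MAX is 5
#endif
}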
