What is the running time of the given code?

#include<stdio.h>

int main(){
  char str[100];
  int i=0,j=-1,flag=0;

  printf("Enter a string: ");
  scanf("%s",str);

  while(str[++j]!='\0');   /* walk to the terminating '\0' to find the length */
  j--;                     /* step back so j indexes the last character */

  while(i<j)               /* compare characters from both ends, moving inward */
      if(str[i++] != str[j--]){
           flag=1;
           break;
      }


  if(flag == 0)
      printf("The string is a palindrome");
  else
      printf("The string is not a palindrome");

  return 0;
}

It varies considerably depending on how long someone stares at the screen before entering something.

Try starting a timer at line 9, stopping it at line 24, and seeing how long it takes that way. The trouble with that is that by measuring the code you are changing the code, but it will still give you a rough idea.
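
For example, here is a minimal timing sketch (it assumes a POSIX system for clock_gettime; on other platforms you would substitute whatever wall-clock timer is available). Because the timer wraps the prompt and the scanf, the measurement includes however long the user stares at the screen before typing, which is exactly the catch mentioned above.

#include <stdio.h>
#include <time.h>

int main(void){
  char str[100];
  int i=0, j=-1, flag=0;
  struct timespec start, end;

  clock_gettime(CLOCK_MONOTONIC, &start);   /* timer started at the prompt */

  printf("Enter a string: ");
  scanf("%99s", str);

  while(str[++j]!='\0');
  j--;

  while(i<j)
      if(str[i++] != str[j--]){
           flag=1;
           break;
      }

  if(flag == 0)
      printf("The string is a palindrome\n");
  else
      printf("The string is not a palindrome\n");

  clock_gettime(CLOCK_MONOTONIC, &end);     /* timer stopped after the verdict */

  printf("Elapsed: %f seconds\n",
         (end.tv_sec - start.tv_sec) + (end.tv_nsec - start.tv_nsec)/1e9);

  return 0;
}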

Also the time varies depending on the input given.

Let's back up a bit, however, as I suspect there is a matter of terminology to be resolved. Is the question about the running time of the code (which will vary with the machine it is run on, the memory usage, the system load, and a host of other side issues), or do you need to determine the time complexity of the algorithm (how the mean/maximum/minimum running time grows with the number of items being processed; e.g., O(n log n) or O(n^2))? These are two very different questions.

For the time complexity of this specific algorithm, you need to consider how many times the algorithm passes over the string in question. In particular, you have one pass over the whole string (line 10), followed by a pass that covers at least one character, at most half the length of the string, and on average about a quarter of it (lines 13-17). Does that give you enough information to work the rest out yourself?
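
If it helps to see those passes directly, here is an instrumented sketch (the two counters are my own additions for illustration, not part of the original program) that counts how many characters each loop actually touches:

#include <stdio.h>

int main(void){
  char str[100];
  int i=0, j=-1, flag=0;
  long length_steps=0, compare_steps=0;

  printf("Enter a string: ");
  scanf("%99s", str);

  while(str[++j]!='\0')
      length_steps++;          /* one step per character: n steps in total */
  j--;

  while(i<j){
      compare_steps++;         /* one step per pair: at most n/2, fewer on an early mismatch */
      if(str[i++] != str[j--]){
           flag=1;
           break;
      }
  }

  if(flag == 0)
      printf("The string is a palindrome\n");
  else
      printf("The string is not a palindrome\n");

  printf("length loop steps:  %ld\n", length_steps);
  printf("compare loop steps: %ld\n", compare_steps);

  return 0;
}

Feeding it a long palindrome versus a string that differs in its first and last characters should make the gap between the best and worst cases of the second loop obvious.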
