Hey, I read about the various time complexity notations. I just wanted to know how big the constant c can be. In Big-O notation we say f(n) = O(g(n)) iff f(n) <= c*g(n) for all n >= n0.

For example, we have f(n) = 3n + 2 = O(n), since 3n + 2 <= 4n for all n >= 2. But what if we say c = 2 instead of 4?
And in another example, 3n + 3 = Sigma(n), since 3n + 3 >= 3n for n >= 1. But what if we say c = 10 instead of 3?

Please help me out.

There is no limit to how big the constant can be.

hi mayank.mittal.3954,

Sorry for not giving a specific solution, but I can share something from my experience during my BSc degree. In my algorithms semester, whenever I couldn't understand a topic, I would sit in on group discussions with my friends until all the ideas were clear. If some confusion remained, I went to our teacher.

No matter how big the constants are, think about sorting algorithms: if you need to sort an array of fewer than 10 elements, bubble sort (O(n^2)) will do a better job than quick sort (O(n log n)), and this is especially true of insertion sort (O(n^2)), because the shifting inner loop, which contributes the O(n) factor of its O(n^2), can be done using the hardware's move instructions...
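To make the insertion sort point concrete, here is a minimal Python sketch of the algorithm (my own illustration, not something from this thread). The inner loop is just a contiguous block shift, which is the part that low-level implementations can hand off to a fast hardware move:

```python
def insertion_sort(a):
    """Sort the list a in place: O(n^2) worst case, O(n) if already sorted."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # This loop shifts a contiguous block one slot to the right,
        # exactly the O(n) part that C implementations can replace with
        # a single memmove-style hardware move.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

xs = [5, 2, 4, 6, 1, 3]
insertion_sort(xs)
print(xs)  # [1, 2, 3, 4, 5, 6]
```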

And there are other behaviours to take into account: quick sort is O(n log n) only on average, while insertion sort is better when the input is almost sorted (it is O(n) in the best case).

Usually quick sort is used while the data size remains big, but the last steps of quick sort are delegated to some other, simpler algorithm.
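That hybrid strategy can be sketched in a few lines of Python (the cutoff of 10 is an arbitrary value I picked for illustration; real libraries tune it): quick sort recurses on large partitions and hands small ones to insertion sort.

```python
def hybrid_sort(a, lo=0, hi=None, cutoff=10):
    """Quick sort, switching to insertion sort for small partitions."""
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 <= cutoff:
        # Small range: insertion sort's low constant factors win here.
        for i in range(lo + 1, hi + 1):
            key, j = a[i], i - 1
            while j >= lo and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return
    # Simple Lomuto partition around the last element.
    pivot, p = a[hi], lo
    for i in range(lo, hi):
        if a[i] < pivot:
            a[i], a[p] = a[p], a[i]
            p += 1
    a[p], a[hi] = a[hi], a[p]
    hybrid_sort(a, lo, p - 1, cutoff)
    hybrid_sort(a, p + 1, hi, cutoff)

xs = [9, 3, 7, 1, 8, 2, 5, 4, 6, 0, 11, 10]
hybrid_sort(xs)
print(xs)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
```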

Moreover, heapsort is O(n log n) even in the worst case, so in that sense it is better than quick sort (which is O(n^2) in the worst case), but its constant factors are big enough that quick sort is usually used anyway.
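For reference, heapsort itself fits in a few lines using Python's standard heapq module (this is just to show the algorithm, not how library sorts are actually implemented):

```python
import heapq

def heapsort(xs):
    """O(n log n) even in the worst case, but with unfriendly memory
    access patterns, which is one place the big constant comes from."""
    heap = list(xs)      # O(n) copy
    heapq.heapify(heap)  # O(n) bottom-up heap construction
    return [heapq.heappop(heap) for _ in range(len(heap))]  # n pops, O(log n) each

print(heapsort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```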

You can find a lot of information on the internet. Here is an interesting compilation of algorithm complexities:

http://bigocheatsheet.com/

The first thing I'm going to note is that order notation can be used for things other than algorithms. It's just a way of telling us how any function "eventually" behaves as n -> infinity.

You can use it on the remainder term of a Taylor polynomial to get an idea of the error, for example.
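For instance (a textbook example, not something from this thread), the Taylor expansion of e^x around 0 is often written with an order-notation remainder:

```latex
e^x = 1 + x + \frac{x^2}{2!} + \cdots + \frac{x^n}{n!} + O\!\left(x^{n+1}\right) \quad \text{as } x \to 0
```

Here O(x^{n+1}) means the error is at most c*|x|^{n+1} for some constant c once x is close enough to 0, the same "there exists a constant" idea, just in a different limit.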

CS borrows it from math because it's useful for telling us how algorithms behave on larger problems.

Now, you used O in your example, so I suppose you're also familiar with little-o, Theta, big-Omega and little-omega already. Everything below applies to those definitions too. I've never seen sigma used for order notation, by the way; I suspect you meant Omega. I'll stick with O for now.

The formal definition is:
We say f(n) is O(g(n)) iff there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0.
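As a quick numeric sanity check (not a proof, since the definition quantifies over all n >= n0 and a finite loop can't verify that), here is a throwaway Python helper I'll use for illustration:

```python
def looks_bounded(f, g, c, n0, n_max=10**5):
    """Check 0 <= f(n) <= c*g(n) for n0 <= n <= n_max.
    Passing is only evidence, never a proof."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n + 2
g = lambda n: n
print(looks_bounded(f, g, c=4, n0=2))  # True: 3n + 2 <= 4n whenever n >= 2
print(looks_bounded(f, g, c=2, n0=2))  # False: c = 2 is too small
```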

"there exists" means if you are able to find a 'c' and an 'n0'. So, even if you can "choose" 'c'and an 'n0' that don't satisfy the definition, it doesn't mean that the definition isn't satisfied for some other c and n0.

You asked what happens if you choose c = 2. The answer is that nothing happens, because you can still find a c that does satisfy the definition. So you can't just "choose" arbitrary constants; you need to PROVE that you can, or cannot, find constants that satisfy the definition.
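Worked out for the numbers in your question, c = 2 fails but c = 4 succeeds, and one success is all the definition asks for:

```latex
\begin{aligned}
c = 2 &: \quad 3n + 2 \le 2n \iff n \le -2, \quad \text{false for every } n \ge 1,\\
c = 4 &: \quad 3n + 2 \le 4n \iff n \ge 2, \quad \text{true with } n_0 = 2.
\end{aligned}
```

Since the pair (c, n0) = (4, 2) works, 3n + 2 = O(n); the failure of c = 2 proves nothing.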

So, if you want to show that f is not O(g(n)), you need to prove to me that it is absolutely impossible to find such a c and n0. And if you want to prove to me that f is O(g(n)), you need to show me the constants you used.
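For an example of what such an impossibility proof looks like (a standard one, not tied to your functions), here is why n^2 is not O(n):

```latex
\text{Suppose } n^2 \le c\,n \text{ for all } n \ge n_0.
\text{ Dividing by } n > 0 \text{ gives } n \le c \text{ for all } n \ge n_0,
\text{ but } n = \max(n_0, \lceil c \rceil + 1) \text{ violates this, a contradiction.}
```

No choice of c and n0 can work, so n^2 is not O(n).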
