1. An algorithm can't run in both constant time and O(n^2) time.
2. There is no such thing as O(n^2 + <something>).
The time the algorithm takes is a function of more than one variable, so O(n^2 + <something>) is perfectly reasonable notation.
3. The time complexity of an algorithm is independent of the programming language, the OS, or anything else. Even if your machine had quicksort implemented in hardware, it would still be O(n log n) on average (worst case O(n^2)).
You're missing the point. I'm talking about development time; every serious language gives you the same asymptotic complexities. For any given task, some programming languages cost less in economic resources than others. You wouldn't use Delphi to program a supercomputer, would you? If you're writing a Mac OS X application, Objective-C would be the language of choice, no? If you're a scientist doing numerical calculations, you'd use Matlab or something like it, no? They're the cheapest tools for their respective jobs, and you'd be wasting money and time otherwise.
4. I still have no idea which algorithm's time complexity we are debating.
You can derive a closed form for the summation by simplifying the telescoping relation

sum_{x=1}^{K} x^n - sum_{x=1}^{K} (x-1)^n = K^n

iteratively or recursively: expanding (x-1)^n with the binomial theorem rewrites the relation in terms of power sums of lower degree. The algorithm carries out this simplification and then evaluates the resulting closed-form expression.
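As a minimal sketch of that derivation, assuming the summation in question is the power sum S_n(K) = 1^n + 2^n + ... + K^n: applying the telescoping relation at exponent n+1 and expanding (x-1)^{n+1} binomially gives K^{n+1} = sum_{j=0}^{n} C(n+1, j) (-1)^{n-j} S_j(K), from which S_n can be isolated recursively. The function name `power_sum` is mine, not from the thread:

```python
# Sketch: recursive power sum via the telescoping relation
#   sum_{x=1}^{K} x^{n+1} - sum_{x=1}^{K} (x-1)^{n+1} = K^{n+1}
# expanded with the binomial theorem, which isolates S_n(K)
# in terms of the lower-degree sums S_0(K), ..., S_{n-1}(K).
from fractions import Fraction
from math import comb

def power_sum(n: int, K: int) -> Fraction:
    """Return S_n(K) = 1^n + 2^n + ... + K^n exactly."""
    if n == 0:
        return Fraction(K)          # base case: K terms of 1
    # Start from K^{n+1} and subtract the lower-degree terms
    # C(n+1, j) * (-1)^(n-j) * S_j(K) for j = 0, ..., n-1.
    total = Fraction(K) ** (n + 1)
    for j in range(n):
        total -= comb(n + 1, j) * (-1) ** (n - j) * power_sum(j, K)
    # The j = n term contributes (n+1) * S_n(K), so divide it out.
    return total / (n + 1)

print(power_sum(1, 10))   # 55  = 10*11/2
print(power_sum(2, 10))   # 385 = 10*11*21/6
```

Using `Fraction` keeps the intermediate divisions exact; with floating point, the recurrence would accumulate rounding error for larger n.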