Hi all!
I'm having some trouble computing the time complexity of the following code:
(A and B are doubles greater than 1)
void do(int n) {
    if (n <= 1) return;
    print(n);
    do(n * A / (A + B));   /* first recursive call  */
    do(n * B / (A + B));   /* second recursive call */
}

void print(int n) {
    int i;
    for (i = 0; i < n; i++)   /* prints n dashes, so print() is O(n) */
        printf("-");
}
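(In case it helps, here is a compilable sketch I used to experiment. It is not the original code verbatim: I renamed do to rec because do is a reserved keyword in C, picked arbitrary values A = 3, B = 2, and replaced the actual printing with a counter so the total amount of work can be inspected.)

#include <stdio.h>

static const double A = 3.0, B = 2.0;  /* arbitrary example values > 1 */
static long total = 0;                 /* total "dashes" that would be printed */

/* stand-in for print(): counts the n units of work instead of printing them */
void print_dashes(int n) {
    int i;
    for (i = 0; i < n; i++)
        total++;
}

/* same recursion as do(), renamed because "do" is a C keyword */
void rec(int n) {
    if (n <= 1) return;
    print_dashes(n);
    rec((int)(n * A / (A + B)));
    rec((int)(n * B / (A + B)));
}

int main(void) {
    int n;
    for (n = 1000; n <= 1000000; n *= 10) {
        total = 0;
        rec(n);
        printf("n = %7d  total work = %ld\n", n, total);
    }
    return 0;
}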
Here's what I've got so far:
function "do()" calls itself recursively until:
\frac{n\cdot{A}^k}{(A+B)^k}\leq1\\n\cdot\left(\frac{A}{A+B}\right)^k\leq1\\k\leq\log_{\frac{A}{A+B}}{\frac{1}{n}}
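As a sanity check of that bound: if, say, A = B, each call shrinks n by half, and the depth becomes the familiar

n\cdot\left(\frac{1}{2}\right)^k\leq1
\;\Longrightarrow\;
k\geq\log_2{n}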
So the depth of that branch is at most \log_{\frac{A+B}{A}}{n}, and since every recursive call also invokes print(), whose complexity is O(n), I get a bound of O(n\cdot\log{n}) for that chain of calls alone. But the function also makes a second recursive call with n\cdot\frac{B}{A+B} at every level, and I'm not sure how to continue from here.
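If it helps, the way I would formalize the total cost (my own notation, assuming print(n) costs \Theta(n)) is the recurrence

T(n)=T\!\left(\frac{A}{A+B}\,n\right)+T\!\left(\frac{B}{A+B}\,n\right)+\Theta(n),
\qquad
\frac{A}{A+B}+\frac{B}{A+B}=1

but I don't know how to handle a recurrence whose two shrink factors sum to 1.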
Please help!