Time complexity of the following recurrence equation?

Hi all, I'm having problems calculating the complexity of the following recurrence equation:
T(n) = O(1)                       if n <= 2
T(n) = 2*T(n^(1/2)) + O(log n)    if n > 2
I got to a probable conclusion of O(2^n * n log n), which I suspect is wrong. If anyone has got any clue I'd be happy. Thank you.

Suppose for now that n > 2 is a power of two, so that you can write n = 2^m. Also, let's write the constant in your O(log n) term explicitly as c*log2(n).
Then, unravelling the recursion gives us:
T(2^m) <= 2*T((2^m)^(1/2)) + c*log2(2^m)
= 2*T(2^(m/2)) + c*m
<= 2*( 2*T((2^(m/2))^(1/2)) + c*log2(2^(m/2)) ) + c*m
= 4*T(2^(m/4)) + 2*c*m
<= 4*( 2*T((2^(m/4))^(1/2)) + c*log2(2^(m/4)) ) + 2*c*m
= 8*T(2^(m/8)) + 3*c*m
<= ...
= (2^log2(m))*T(2^1) + log2(m)*c*m
= m*T(2) + c*m*log2(m)
= log2(n)*T(2) + c*log2(n)*log2(log2(n))
= O(log2(n)*log2(log2(n)))
The term log2(m) comes from the fact that we divide m by two at each new recursion level, and so it will take (at most) log2(m) divisions before m <= 1.
Now if n is not a power of two, you can notice that there exists some number r which is a power of two such that n <= r < 2*n. And you can then write T(n) <= T(r) = O(log2(r)*log2(log2(r))) = O(log2(2*n)*log2(log2(2*n))) = O(log2(n)*log2(log2(n))).
So the overall answer is
T(n) = O(log2(n)*log2(log2(n)))
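As a sanity check, here is a minimal C sketch (my own, not from the answer above) that iterates the recurrence with the constant c taken as 1, i.e. T(n) = 2*T(floor(sqrt(n))) + log2(n) with T(n) = 1 assumed for n <= 2, and compares it to the claimed bound:

#include <math.h>
#include <stdio.h>

/* T(n) = 2*T(floor(sqrt(n))) + log2(n), with T(n) = 1 for n <= 2 */
double T(double n) {
    if (n <= 2.0)
        return 1.0;
    return 2.0 * T(floor(sqrt(n))) + log2(n);
}

int main(void) {
    for (double n = 16; n <= 1e18; n *= 1000) {
        double bound = log2(n) * log2(log2(n));
        printf("n = %.2e  T(n) = %8.2f  bound = %8.2f  ratio = %.3f\n",
               n, T(n), bound, T(n) / bound);
    }
    return 0;
}

The ratio T(n)/bound stays bounded as n grows, consistent with T(n) = O(log2(n)*log2(log2(n))).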

Related

Confused about a nested loop having linear complexity (Big-O = O(n)) when I worked it out to be logarithmic

Computing complexity and Big-O of an algorithm
My analysis: T(n) = 5n log n + 3 log n + 2 (logs to the base 2), so Big-O = O(n log n).
for (int x = 0, i = 1; i <= N; i *= 2) {   // outer loop: i = 1, 2, 4, ..., about log2(N)+1 iterations
    for (int j = 1; j <= i; j++) {         // inner loop: i iterations
        x++;
    }
}
The expected Big-O was linear, whereas mine came out logarithmic.
Your Big-Oh analysis is not correct. While it is true that the outer loop is executed log n times, the inner loop is linear in i at each iteration.
If you count the total number of iterations of the inner loop, you will see that the whole thing is linear:
The inner loop will do 1 + 2 + 4 + 8 + 16 + ... + (the last power of 2 <= N) iterations. This sum is between N and 2*N, which makes the whole loop linear.
Let me explain why your analysis is wrong.
It is clear that the inner loop will execute 1 + 2 + 4 + ... + 2^k times, where k is the biggest integer which satisfies 2^k <= N. This implies that the upper bound for k is log2(N).
Without loss of generality we can take the upper bound for k and assume that k = log2(N) is an integer. The complexity then equals 1 + 2 + 4 + ... + 2^(log2(N)), which is a geometric series, so it is equal to 2^(log2(N)+1) - 1 = 2N - 1.
Therefore, in O notation it is O(N).
First, you should notice that your analysis is not logarithmic! N log N is not logarithmic.
Also, the time complexity is T(N) = sum_{j=0}^{log2(N)} 2^j (as the value of i doubles each time). Hence, T(N) = 2^(log2(N)+1) - 1 = 2N - 1 = Theta(N).
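As a quick empirical check (a hypothetical sketch of mine, not part of either answer), you can instrument the loop with a counter and compare the total number of inner iterations to N:

#include <stdio.h>

int main(void) {
    for (long N = 10; N <= 10000000; N *= 100) {
        long count = 0;   /* total inner-loop iterations */
        for (long i = 1; i <= N; i *= 2)
            for (long j = 1; j <= i; j++)
                count++;
        /* count = 2^(floor(log2(N)) + 1) - 1, which lies in [N, 2N) */
        printf("N = %8ld  iterations = %8ld  ratio = %.3f\n",
               N, count, (double)count / N);
    }
    return 0;
}

The ratio stays between 1 and 2, confirming that the nested loop is linear, not logarithmic.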

Computing expected time complexity of recursive program

I wish to determine the average processing time T(n) of the recursive algorithm:
int myTest(int n) {
    if (n <= 0) {
        return 0;
    } else {
        int i = random(n - 1);   /* uniform in [0, n-1], costs one time unit */
        return myTest(i) + myTest(n - 1 - i);
    }
}
provided that the algorithm random( int n ) spends one time unit to return
a random integer value uniformly distributed in the range [0, n] whereas
all other instructions spend a negligibly small time (e.g., T(0) = 0).
This is certainly not of the simpler form T(n) = a*T(n/b) + c, so I am lost as to how to write it. The difficulty is that a random number i is drawn from the range [0, n-1] each time, supplied twice to the function, and the results of the two calls are summed.
The recurrence relations are:
T(0) = 0
T(n) = 1 + sum(T(i) + T(n-1-i) for i = 0..n-1) / n
The second can be simplified to:
T(n) = 1 + 2*sum(T(i) for i = 0..n-1) / n
Multiplying by n:
n T(n) = n + 2*sum(T(i) for i = 0..n-1)
Noting that (n-1) T(n-1) = n-1 + 2*sum(T(i) for i = 0..n-2), we get:
n T(n) = (n-1) T(n-1) + 1 + 2T(n-1)
= (n+1) T(n-1) + 1
Or:
T(n) = ((n+1)T(n-1) + 1) / n
This has the solution T(n) = n, which you can derive by telescoping the series, or by guessing the solution and then substituting it in to prove it works.
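As an illustration (a minimal sketch of my own), the telescoped recurrence T(n) = ((n+1)*T(n-1) + 1)/n can be iterated directly; the printed values come out exactly n, confirming the solution:

#include <stdio.h>

int main(void) {
    double T = 0.0;                      /* T(0) = 0 */
    for (int n = 1; n <= 10; n++) {
        T = ((n + 1) * T + 1.0) / n;     /* T(n) = ((n+1)*T(n-1) + 1) / n */
        printf("T(%d) = %.6f\n", n, T);
    }
    return 0;
}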

Recursive function runtime

1. Given that T(0) = 1 and T(n) = T([2n/3]) + c (here [2n/3] denotes rounding 2n/3 down), what is the big-Θ bound for T(n)? Is this simply log base 3/2 of n? Please tell me how to get the result.
2.Given the code
void mystery(int n) {
    if (n < 2)
        return;
    else {
        int i = 0;
        for (i = 1; i <= 8; i += 2) {      /* four recursive calls: i = 1, 3, 5, 7 */
            mystery(n / 3);
        }
        int count = 0;
        for (i = 1; i < n * n; i++) {      /* Theta(n^2) counting loop */
            count = count + 1;
        }
    }
}
According to the master theorem, the big-O bound is n^2, but my result is n^2 * log_3(n). I'm not sure of my result, and I don't really know how to deal with the runtime of recursive functions. Is it simply the log function?
Or what if, as in this code, T(n) = 4*T(n/3) + n^2?
Cheers.
For (1), the recurrence solves to c*log_{3/2}(n) + c. To see this, you can use the iteration method to expand out a few terms of the recurrence and spot a pattern:
T(n) = T(2n/3) + c
= T(4n/9) + 2c
= T(8n/27) + 3c
= T((2/3)^k * n) + kc
Assuming that T(1) = c and solving for the choice of k that makes the expression inside the parentheses equal to 1, we get:
1 = (2/3)^k * n
(3/2)^k = n
k = log_{3/2}(n)
Plugging in this choice of k into the above expression gives the final result.
For (2), you have the recurrence relation
T(n) = 4T(n/3) + n^2
Using the master theorem with a = 4, b = 3, and d = 2, we see that log_b(a) = log_3(4) < d, so this solves to O(n^2). Here's one way to see this. At the top level, you do n^2 work. At the layer below that, you have four calls each doing n^2/9 work, so you do 4n^2/9 work, less than the top layer. The layer below that does 16 calls that each do n^2/81 work for a total of 16n^2/81 work, again much less work than the layer above. Overall, each layer does exponentially less work than the layer above it, so the top layer ends up dominating all the other ones asymptotically.
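To make the layer-by-layer argument concrete, here is a small C sketch (mine, with T(n) = 1 for n < 2 assumed as the base case) that evaluates T(n) = 4*T(n/3) + n^2 and reports T(n)/n^2; the ratio settles near the geometric-series constant 1/(1 - 4/9) = 9/5:

#include <stdio.h>

/* T(n) = 4*T(n/3) + n^2, with T(n) = 1 for n < 2 (integer division n/3) */
double T(long n) {
    if (n < 2)
        return 1.0;
    return 4.0 * T(n / 3) + (double)n * n;
}

int main(void) {
    for (long n = 9; n <= 10000000; n *= 10) {
        printf("n = %8ld  T(n)/n^2 = %.4f\n", n, T(n) / ((double)n * n));
    }
    return 0;
}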
Let's do some complexity analysis, and we'll find that the asymptotic behavior of T(n) depends on the constants of the recursion.
Given T(n) = A T(n*p) + C, with A,C>0 and p<1, let's first try to prove T(n)=O(n log n). We try to find D such that for large enough n
T(n) <= D * n * log(n)
This yields
A * D * (n*p) * log(n*p) + C <= D * n * log(n)
Looking at the higher order terms, this results in
A*D*p <= D
So, if A*p <= 1, this works, and T(n)=O(n log n).
In the special case that A<=1 we can do better, and prove that T(n)=O(log n):
T(n) <= D * log(n)
Yields
A * D * log(n*p) + C <= D * log(n)
Looking at the higher order terms, this results in
A * D * log(n) + C + A * D * log(p) <= D * log(n)
Which is true for large enough D and n since A<=1 and log(p)<0.
Otherwise, if A*p>1, let's find the minimal value of q such that T(n)=O(n^q). We try to find the minimal q such that there exists D for which
T(n) <= D * n^q
This yields
A * D * p^q * n^q + C <= D * n^q
Looking at the higher order terms, this results in
A * D * p^q <= D
The minimal q that satisfies this is defined by
A*p^q = 1
So we conclude that T(n)=O(n^q) for q = - log(A) / log(p).
Now, given T(n) = A T(n*p) + B n^a + C, with A,B,C>0 and p<1, try to prove that T(n)=O(n^q) for some q. We try to find the minimal q>=a such that for some D>0,
T(n) <= D * n^q
This yields
A * D * p^q * n^q + B * n^a + C <= D * n^q
Trying q == a, this will work only if
A * D * p^a + B <= D
I.e. T(n) = O(n^a) if A*p^a < 1.
Otherwise we get to A*p^q = 1 as before, which means T(n) = O(n^q) for q = -log(A)/log(p).
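Here is a hypothetical numeric check of the A*p > 1 case (my own sketch; A = 4, p = 1/2, C = 1 are arbitrary choices, and T(n) = 1 for n <= 1 is an assumed base case), comparing T(n) = A*T(n*p) + C against n^q with q = -log(A)/log(p), which here gives q = 2:

#include <math.h>
#include <stdio.h>

#define A 4.0
#define P 0.5
#define CONST 1.0

/* T(n) = A*T(n*p) + C, with T(n) = 1 for n <= 1 */
double T(double n) {
    if (n <= 1.0)
        return 1.0;
    return A * T(n * P) + CONST;
}

int main(void) {
    double q = -log(A) / log(P);   /* predicted exponent; here q = 2 */
    printf("q = %.3f\n", q);
    for (double n = 4; n <= 1e6; n *= 10) {
        printf("n = %10.0f  T(n)/n^q = %.4f\n", n, T(n) / pow(n, q));
    }
    return 0;
}

The ratio stays bounded (it oscillates with the fractional part of log2(n)), consistent with T(n) = O(n^q).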

Recurrence relation for given algorithm?

#include <stdio.h>

struct Node { int data; struct Node *left, *right; };

int print4Subtree(struct Node *root) {
    if (root == NULL)
        return 0;
    int l = print4Subtree(root->left);
    int r = print4Subtree(root->right);
    if ((l + r + 1) == 4)
        printf("%d ", root->data);
    return (l + r + 1);
}
This algorithm/code finds the number of subtrees having exactly 4 nodes in a binary tree; it works in a bottom-up manner.
I know the time complexity of this code would be O(n), and the space complexity is O(h) for tree height h (O(log n) for a balanced tree), since it uses recursion.
What will be the recurrence relation for the code?
I tried to derive T(n) = 2T(n-1) + 1, which is obviously wrong!
You can only talk about recurrence relations in terms of n alone in cases where you know more about the structure of the tree, for instance:
Case 1: Every node has only one child meaning
T(n) = T(0) + T(n-1) + k.
Case 2: Subtrees at any level are balanced so that
T(n) = 2 T((n-1)/2) + k.
Both of these will result in O(n), but these two cases are only a very select minority of possible trees. For a more universal approach you have to use a formula like T(n) = T(a) + T(b), where a and b are an arbitrary division into sub-problems resulting from the structure of your tree. You can still establish results from this kind of formula using strong induction.
The following is the exact formula and approach I would use:
T(n) = n*k + m_n*c, where m_n ≤ n + 1. (Note: I am using k for the overhead of a recursive step and c for the overhead of a base/null step.)
Base case (n = 0):
For a null node T(0) = c, so T(n) = k*n + m_n*c,
where m_n = 1 ≤ n + 1 = 1.
Inductive step (T(x) = x*k + m_x*c for all x < n):
The tree of size n has two subtrees of sizes a and b (a or b may be 0) such that n = a + b + 1.
T(n) = T(a) + T(b) + k = a*k + m_a*c + b*k + m_b*c + k = (a+b+1)*k + (m_a+m_b)*c = n*k + m_n*c,
where m_n = m_a + m_b ≤ a + 1 + b + 1 = n + 1.
The reason for using m_n is merely a formality to make the proof smoother, as the exact number of null cases is what is actually affected by the structure of the tree (in the earlier Case 2, it is log n). So T(n) is no better than O(n) because of the n*k term, and no worse than O(n) because of the bound on m_n*c.
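For illustration, here is a self-contained, hypothetical driver (the tree shape and the helper node() are my own, and the function is instrumented with a call counter) that confirms the linear behavior: for a 7-node tree there are exactly 7 node calls plus 8 null calls.

#include <stdio.h>
#include <stdlib.h>

struct Node { int data; struct Node *left, *right; };

static long calls = 0;   /* counts every invocation, including null ones */

int print4Subtree(struct Node *root) {
    calls++;
    if (root == NULL)
        return 0;
    int l = print4Subtree(root->left);
    int r = print4Subtree(root->right);
    if ((l + r + 1) == 4)
        printf("%d ", root->data);
    return (l + r + 1);
}

struct Node *node(int d, struct Node *l, struct Node *r) {
    struct Node *p = malloc(sizeof *p);
    p->data = d; p->left = l; p->right = r;
    return p;
}

int main(void) {
    /* A 7-node tree; the subtree rooted at 2 has exactly 4 nodes, so "2" is printed. */
    struct Node *root = node(1,
        node(2, node(4, node(7, NULL, NULL), NULL), node(5, NULL, NULL)),
        node(3, node(6, NULL, NULL), NULL));
    print4Subtree(root);
    printf("\ncalls = %ld (7 nodes + 8 null links = 15)\n", calls);
    return 0;
}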

What is the approach to solving a recurrence relation when I have more than one recurrent function calls on the right-hand side of the equation?

I am trying to analyze the complexity of the function call to PEL(A[1..n]) where n is a certain power of 3 and PEL is defined by the following algorithm:
function PEL(A[m..n]) {
    if (n - m <= 1) return 1;
    else {
        p := [(n - m + 1)/3];
        MAKE(A[m..n]);
        PEL(A[m..m + p - 1]);
        PEL(A[m + p..m + 3p - 1]);
    }
}
The complexity of MAKE(A[m..n]) is theta( (n-m)log(n-m) ).
From what I have gathered so far, we are dealing with the following recurrence relation:
C(N) = C(N/3) + C(2*N/3) + theta(N log N)
where N = n - m + 1 is the size of the subarray, and
C(1) = C(2) = 1
I understand that we need to apply the master theorem here, but in the master theorem we have recurrence relations of the form:
C(N) = a * C(N/b) + f(N)
And I have no idea how to get rid of the second recursive call to C() in my recurrence relation, so how do I do it? I don't know how to derive the values of a and b.
As all the commenters said, I need to use the Akra-Bazzi theorem.
C(1) = 1
C(2) = 1
For N > 2 we first need to find p from the following equation:
(1/3)^p + (2/3)^p = 1. It is obvious that p = 1.
Next we need to evaluate N^p * (1 + I[1,N](log(u)/u)) with p = 1, where
I[1,N](f(u)) denotes the integral of f(u) from 1 to N.
(I wrote log(u)/u instead of the exact (u-1)*log(u-1)/u^2, since the integral of the latter looks like a monster; both have the same asymptotic behavior.)
I[1,N](log(u)/u) = log^2(N)/2, so the end result is N * (1 + log^2(N)/2).
All in all, the running time is theta(N + N*log^2(N)/2) = theta(N*log^2(N)).
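As a last sanity check (a sketch of my own, with C(N) = 1 for N <= 2 assumed as the base case), evaluating C(N) = C(N/3) + C(2N/3) + N*log2(N) numerically shows the ratio to N*log2(N)^2 approaching a constant, consistent with the theta(N*log^2(N)) result:

#include <math.h>
#include <stdio.h>

/* C(N) = C(N/3) + C(2N/3) + N*log2(N), with C(N) = 1 for N <= 2 */
double C(double N) {
    if (N <= 2.0)
        return 1.0;
    return C(N / 3.0) + C(2.0 * N / 3.0) + N * log2(N);
}

int main(void) {
    for (double N = 81; N <= 2e6; N *= 9) {
        double bound = N * log2(N) * log2(N);
        printf("N = %9.0f  C(N)/(N*log2(N)^2) = %.4f\n", N, C(N) / bound);
    }
    return 0;
}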
