Recurrence: T(n) = 3T(n/2) + n^2(lgn) - algorithm

Here is the full question...
Analysis of recurrence trees. Find the nice nonrecursive function f(n) such that
T(n) = Θ(f(n)). Show your work: what is the number of levels, the number of instances on each level, the work of each instance, and the total work on that level.
This is a homework question so I do not expect exact answers, but I would like some guidance because I have no idea where to start. Here is part a:
a) T(n) = 3T(n/2) + n^2(lgn)
I really have no idea where to begin.

These types of recurrences are solved with the Master theorem.
In your case a = 3, b = 2, and therefore c = log_2(3) ≈ 1.58 < 2.
Since the driving term n^2 lg(n) grows polynomially faster than n^c, you are in the third case and your complexity is Θ(n^2 log(n)).
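A quick numerical sanity check (just a sketch, assuming a base case of T(1) = 1 and integer halving): the ratio T(n) / (n^2 lg n) stays bounded as n grows instead of picking up another log factor, which is consistent with Θ(n^2 log n).

    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        if n < 2:
            return 1.0                 # assumed constant base case
        return 3 * T(n // 2) + n * n * math.log2(n)

    for n in (2**10, 2**15, 2**20):
        # the ratio stays bounded as n grows (it slowly levels off),
        # rather than growing by another log factor
        print(n, T(n) / (n * n * math.log2(n)))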

Related

Solving by Master's theorem and recursion tree gives different answers

Suppose we have been given the recurrence relation T(n) = 2T(n/3) + n and need to find its time complexity.
My problem is that I am getting different answers from the Master theorem and from the recursion tree method.
By the Master theorem for an equation of the form T(n) = aT(n/b) + O(n^k log^p(n)):
Here a = 2, b = 3, p = 0, k = 1, and since a < b^k, i.e. 2 < 3^1, the Master theorem gives T(n) = O(n).
Here is my recursion tree method
And at the end I seriously don't know what I found out. It looks something like O(n · 2^(log n)).
Surely that's not the answer and I have messed it up somewhere, but I don't see where I went wrong. What is the right approach?
Also, I wanted to ask whether the recurrence relations T(n) = 2T(n/3) + n and T(n) = 2T(n/3) + C are the same, where C is a constant in the second equation.
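For comparison, summing the recursion tree for T(n) = 2T(n/3) + n level by level should give a geometric series, since level i has 2^i subproblems of size n/3^i, each doing work proportional to its size:

\sum_{i=0}^{\log_3 n} 2^i \cdot \frac{n}{3^i} = n \sum_{i=0}^{\log_3 n} \left(\frac{2}{3}\right)^i \le 3n = O(n)

which agrees with the O(n) from the Master theorem.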

Solution for the recurrence

I am looking for the solution of this recurrence. Basically I want to learn how to solve this kind of recurrence and how to get its value.
T(N) = 3T(N/3) + T(N/2) + N
Using the Akra-Bazzi method is definitely a great solution.
The recurrence-tree solution below is wrong; see the comments for why.
I just want to leave the wrong solution here in case someone is curious or makes the same error.
Here is a solution from the recurrence tree.
For level h, it will look like
T(N) = 3^h*T(N/3^h) + T(N/2^h) + \sum\limits_{i=1}^{h-1} (3^{i-1} + 3^i)*T(N/(3^i 2^{h-i}))
which means level h contributes
N < N + \frac{N}{2^h} + \frac{4N}{3}\left(1-\frac{1}{2^{h-1}}\right) < 4N
And notice that the height of the recurrence tree is between
\log_3 N and \log_2 N
Thus, the big-Oh notation is
O(N log N)
(Just wanted to illustrate that the recurrence tree is a possible approach.)
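As for the Akra-Bazzi route: the exponent p for T(N) = 3T(N/3) + T(N/2) + N solves 3·(1/3)^p + (1/2)^p = 1. A minimal sketch (plain bisection, nothing library-specific) to locate it numerically:

    def f(p):
        # Akra-Bazzi balance equation for 3T(N/3) + T(N/2): sum of a_i * b_i^p = 1
        return 3 * (1 / 3) ** p + (1 / 2) ** p - 1

    lo, hi = 1.0, 2.0              # f(1) > 0 and f(2) < 0, so the root lies in between
    for _ in range(60):            # plain bisection
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid

    print((lo + hi) / 2)           # p is roughly 1.43

Since p > 1, the +N term contributes only a constant factor inside the Akra-Bazzi integral, so T(N) = Θ(N^p), roughly Θ(N^1.43).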

Get the complexity of T(n)=T(n/4)+T(3n/4)+c

In this recurrence relation T(n) = T(n/4) + T(3n/4) + c, I am confused about how this recurrence relates to best-case and worst-case analysis, since we have to solve both sub-problems, which are of size n/4 and 3n/4. What is the terminology of worst-case or best-case analysis here?
Moreover, should we use Θ(log n) or O(log n) here? Looking at the link below, I found O(log n) more applicable, but I still couldn't see why we are not using Θ(log n).
How to solve the recursive complexity T(n) = T(n/4)+T(3n/4)+cn
T(n) = T(n/4) + T(3n/4) + CONST <= 2T(3n/4) + CONST
We will use case 1 of master theorem with:
a = 2, b = 4/3.
c = log_{4/3}(2) ~= 0.4
CONST is in O(n^0.4)
Thus, from the master theorem, one can derive that 2T(3n/4) + CONST is in Theta(log n), and since T(n) <= 2T(3n/4) + CONST, we can say that T(n) is in O(log n).
By following the same idea, but with a lower bound:
T(n) >= T(3n/4) + CONST ...
And using the master theorem again, we can tell that T(n) is also in Omega(log n).
Since T(n) is both O(log n) and Omega(log n), it is also Theta(log n).
As for your question, you can use either big-O or Theta notation, whatever you prefer. As you can see, proving Theta requires a bit more work, but it is also more informative, as it tells you the bound you found is tight.
These types of recurrences can be easily solved with the Akra-Bazzi theorem (and if you look at the question you linked, someone showed a solution to a similar problem there).
So (1/4)^p + (3/4)^p = 1, which gives p = 1. In your case g(u) = c, so the integral becomes
the integral of c/u^2 du from 1 to x, which equals -c/u evaluated from 1 to x, i.e. c(1 - 1/x). Now multiply it by x^p = x and you get that the complexity is O(n), and not O(log n) as other people suggested.
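A quick empirical check of that conclusion (just a sketch, assuming c = 1 and integer-floor sub-problems): the ratio T(n)/n stays roughly constant as n grows, i.e. the growth is linear, not logarithmic.

    from functools import lru_cache

    C = 1                              # the additive constant c

    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return C                   # assumed base case
        return T(n // 4) + T(3 * n // 4) + C

    for n in (10**3, 10**4, 10**5, 10**6):
        print(n, T(n) / n)             # the ratio stays roughly constant: Theta(n)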

How to convert time analysis to O(n)?

Rookie computer science student here, have a question I'm having some trouble answering.
I have a tree traversal algorithm whose time performance is O(b·m), where b is the branching factor and m is the max depth of the tree. I was wondering how one takes this and converts it into standard asymptotic time analysis (i.e. O(n), O(n^2), etc.).
Same question for a different algorithm I have which is O(b^m).
I have gone through my textbook extensively and not found a clear answer about this. Asymptotic time analysis usually relates to input (n) but I'm not sure what n would mean in this instance. I suppose it would be m?
In general, what do you do when you have multiple inputs?
Thank you for your time.
You should start by building a recurrence. For example, let us consider binary search. The recurrence is T(n) = T(n/2) + c. When you solve it, you will get
T(n) = T(n/2) + c
= T(n/4) + c + c
= T(n/8) + c + c + c
...
= T(n/2^k) + kc
The recurrence is solved when n = 2^k, i.e. k = log_2(n). So the complexity is c·log_2(n).
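A tiny sketch (assuming integer halving) that counts those constant-cost steps and compares the count with log_2(n):

    import math

    def steps(n):
        count = 0
        while n > 1:
            n //= 2                    # the T(n/2) sub-problem
            count += 1                 # the +c work done at this level
        return count

    for n in (16, 1024, 10**6):
        print(n, steps(n), round(math.log2(n), 2))   # the step count tracks log2(n)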
Now, let us look at another situation where the input is divided into 5 parts, and the results combined in linear time. This recurrence will be
T(n) = 5T(n/5) + n
= 5^2T(n/5^2) + 2n
...
= 5^kT(n/5^k) + kn
This will stop when n = 5^k, i.e. k = log_5(n). So, substituting above, the complexity is n·log_5(n).
I guess you should be able to take it from here on.
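A matching sketch for the five-way split (assuming n is a power of 5 so the arithmetic stays exact): summing the work level by level reproduces n·log_5(n).

    import math

    def total_work(n):
        work, size, count = 0, n, 1
        while size > 1:
            work += count * size       # 5^i sub-problems, each doing n/5^i work: n per level
            size //= 5
            count *= 5
        return work

    n = 5**8                           # 390625
    print(total_work(n), n * math.log(n, 5))   # both come out to n*log5(n) = 8n here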

Algorithm complexity, solving recursive equation

I'm taking a Data Structures and Algorithms course and I'm stuck on this recursive equation:
T(n) = log(n)·T(log(n)) + n
Obviously this can't be handled with the Master theorem, so I was wondering if anybody has any ideas for solving this recursive equation. I'm pretty sure it should be solved with a change of variables, like considering n to be 2^m, but I couldn't manage to find a good fix.
The answer is Theta(n). To prove something is Theta(n), you have to show it is Omega(n) and O(n). Omega(n) in this case is obvious because T(n)>=n. To show that T(n)=O(n), first
Pick a large finite value N such that log(n)^2 < n/100 for all n>N. This is possible because log(n)^2=o(n).
Pick a constant C>100 such that T(n)<Cn for all n<=N. This is possible due to the fact that N is finite.
We will show inductively that T(n)<Cn for all n>N. Since log(n)<n, by the induction hypothesis, we have:
T(n) < n + log(n) * C * log(n)
= n + C * log(n)^2
< n + (C/100) * n
= C * (1/100 + 1/C) * n
< (C/50) * n
< C * n
In fact, for this function it is even possible to show that T(n) = n + o(n) using a similar argument.
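A quick numerical illustration of that last point (a sketch, assuming a constant base case for n <= 2 and natural logarithms, which does not affect the asymptotics): the ratio T(n)/n drifts toward 1 as n grows.

    import math

    def T(n):
        if n <= 2:
            return 1.0                 # assumed constant base case
        return math.log(n) * T(math.log(n)) + n

    for n in (10**3, 10**6, 10**9, 10**12):
        print(n, T(n) / n)             # the ratio approaches 1, i.e. T(n) = n + o(n)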
This is by no means an official proof, but I think it goes like this.
The key is the +n term. Because of it, T is bounded below by n, i.e. T(n) = Omega(n). So let's assume that T(n) = O(n) and have a go at checking that.
Substitute into the original relation
T(n) = (log n)O(log n) + n
= O(log^2(n)) + O(n)
= O(n)
So it still holds.

Resources