Solving by Master's theorem and recursion tree gives different answers - algorithm

Suppose we have been given a recurrence relation T(n) = 2T(n/3) + n. And need to find the time complexity.
My problem is why I am getting different answers by master theorem and using recursion tree method.
By the Master Theorem for a recurrence of the form T(n) = aT(n/b) + O(n^k log^p n):
Here a = 2, b = 3, p = 0, k = 1, and since a < b^k (i.e. 2 < 3^1), the Master Theorem gives T(n) = O(n).
Here is my recursion tree method
And at the end I seriously don't know what I found. It looks something like O(n·2^(log n)).
Surely that's not the answer and I have messed it up. But I don't see where I went wrong. What is the right approach?
Also I wanted to ask: are the recurrences T(n) = 2T(n/3) + n and T(n) = 2T(n/3) + C the same, where C is a constant in the second equation?
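A numerical sketch can settle both questions (this is not from the original post; the base cases below are chosen for illustration). Evaluating the two recurrences shows that T(n) = 2T(n/3) + n is Θ(n), matching the master theorem, while T(n) = 2T(n/3) + C is Θ(n^(log_3 2)) ≈ Θ(n^0.63) - so the two recurrences are not the same:

```python
import math

def t_linear(n):
    """T(n) = 2T(n/3) + n, with T(n) = n for n < 3 (illustrative base case)."""
    if n < 3:
        return n
    return 2 * t_linear(n // 3) + n

def t_const(n):
    """T(n) = 2T(n/3) + 1, with T(n) = 1 for n < 3 (illustrative base case)."""
    if n < 3:
        return 1
    return 2 * t_const(n // 3) + 1

n = 3 ** 12  # a power of 3, so n // 3 is exact at every level

# The recursion-tree levels contribute n * (2/3)^i, a convergent
# geometric series, so the total is about 3n = O(n).
print(t_linear(n) / n)                       # approaches 3

# With a constant cost per call, the 2^(log_3 n) leaves dominate,
# giving Theta(n^(log_3 2)) instead.
print(t_const(n) / n ** math.log(2, 3))      # approaches 2
```

In other words, the OP's recursion tree should produce the sum n·(1 + 2/3 + 4/9 + …), which converges to 3n, agreeing with the master theorem's O(n).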

Related

Which way is best to solve: T(n)=4T(n/5)+(log(n√n))^5?

I need to find the asymptotic behavior of T(n) = 4T(n/5) + (log(n√n))^5, in the form Θ(…).
I know of three ways:
recursion tree
master method
recurrence
Which way is easiest? And how can I be sure I got the right answer?
For the asymptotic behavior, the master theorem (when it applies) is useful. Note that the master theorem's proof itself uses the recursion tree, so the methods you list are not independent.
To use the master theorem, first simplify the non-recursive part:
log(n√n) = log(n) + log(√n) = log(n) + (1/2)·log(n) = (3/2)·log(n)
Hence:
T(n) = 4T(n/5) + (3/2 log(n))^5
From the master theorem, the critical exponent is c_crit = log_5(4) = log(4)/log(5) ≈ 0.86. Since (3/2·log(n))^5 = O(n^0.5) and 0.5 < c_crit, case 1 applies, so T(n) = Θ(n^(log(4)/log(5))) ≈ Θ(n^0.86).
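A short numeric sketch (not part of the original answer; the value of n is only illustrative) checks both steps - the log simplification and the case-1 condition that the polylog term is eventually dominated by n^0.5:

```python
import math

n = 1e30  # large enough that the asymptotic dominance is visible

# The simplification used above: log(n * sqrt(n)) = (3/2) * log(n).
assert abs(math.log(n * math.sqrt(n)) - 1.5 * math.log(n)) < 1e-6

# Critical exponent of the master theorem: log_5(4) ~ 0.861.
c_crit = math.log(4) / math.log(5)
print(round(c_crit, 3))   # 0.861

# Any polylog is O(n^0.5), and 0.5 < c_crit, so case 1 applies.
# Note the crossover is late: at n = 10^6 the polylog term is still larger.
assert (1.5 * math.log(n)) ** 5 < n ** 0.5
```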

Question about time complexity and Master Theorem

I'm quite new to algorithm and I encountered a question that I don't know how to apply Master Theorem:
We have an algorithm A, which solves P by creating 3 subproblems, each of size 2n/3, recursively solving each subproblem, and then combining solutions in O(n log n) time. Would this algorithm have a better running time than O(n)? Prove your answer.
What I know here is a=3, b=3/2, but how can I deal with O(nlogn)?
Hence, the recurrence is T(n) = 3T(2n/3) + O(n log n). The critical exponent is log(3)/log(1.5) ≈ 2.71, and f(n) = O(n log n) is polynomially smaller than n^2.71, so case 1 of the master theorem gives T(n) = Θ(n^(log(3)/log(1.5))) ≈ Θ(n^2.71).
Therefore T(n) = Ω(n^2.71), and the algorithm is not better than O(n).
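A quick numeric sketch (illustrative, not from the original answer) of the critical exponent and the conclusion:

```python
import math

# T(n) = 3T(2n/3) + O(n log n) has a = 3, b = 3/2.
c_crit = math.log(3) / math.log(1.5)
print(round(c_crit, 2))   # 2.71

# f(n) = n log n is polynomially smaller than n^2.71 (case 1),
# so T(n) = Theta(n^2.71), which grows much faster than n.
n = 10 ** 6
print(n ** c_crit > n * math.log(n))   # True -- not better than O(n)
```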

Recurrence: T(n) = 3T(n/2) + n^2(lgn)

Here is the full question...
Analysis of recurrence trees. Find the nice nonrecursive function f (n) such that
T(n) = Θ( f (n)). Show your work: what is the number of levels, number of instances on each level, work of each instance and the total work on that level.
This is a homework question so I do not expect exact answers, but I would like some guidance because I have no idea where to start. Here is part a:
a) T(n) = 3T(n/2) + n^2(lgn)
I really have no idea where to begin.
These types of recurrences are solved with the master theorem.
In your case a = 3, b = 2, and therefore c_crit = log_2(3) ≈ 1.58 < 2.
Since f(n) = n^2·lg(n) = Ω(n^(c_crit + ε)) and the regularity condition holds, you are in the third case, and the complexity is T(n) = Θ(n^2·log(n)).
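Since the question asks specifically for the recursion-tree quantities (levels, instances per level, work per level), here is a sketch that tabulates them (the choice of n is only illustrative). Level i has 3^i instances of size n/2^i, so the level total is n^2·(3/4)^i·(lg n − i), a geometrically decaying series dominated by the root:

```python
import math

n = 2.0 ** 40   # a power of two keeps subproblem sizes exact
levels = int(math.log2(n))

total = 0.0
for i in range(levels):
    instances = 3 ** i                    # 3^i instances on level i
    size = n / 2 ** i                     # each of size n / 2^i
    work = size ** 2 * math.log2(size)    # work per instance
    total += instances * work
    # level total = n^2 * (3/4)^i * (lg n - i): geometric decay

# The root level dominates, so T(n) = Theta(n^2 lg n):
# the total stays within a constant factor of n^2 lg n.
print(total / (n ** 2 * math.log2(n)))   # a bounded constant, here ~3.7
```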

Problems Solving Recurrence T(n) = 4T(n/4) + 3log n

I'm really getting frustrated about solving the Recurrence above. I was trying to solve it by using the Master Method, but I just didn't get it done...
I have a recursive algorithm that takes 3·log(n) time (three binary searches) to identify four subproblems, each of size n/4, and then solves them individually until n is smaller than some constant given by input. So I got this recurrence as a result:
T(n) = 4*T(n/4) + 3*log(n)
Base-Case if n < c (c = some constant given by program input):
T(n) = 1
I'm trying to find the asymptotic running time of my recursive program, and wanted to solve it by using the master theorem. Can anybody tell me if it's possible to use the master theorem with this recurrence, and if yes, which case of the master theorem is it?
All help is appreciated, thanks.
T(n) = Θ(n), because the logarithm of 4 base 4 is 1, and 3·log(n) = O(n^0.5) with 0.5 < 1. This corresponds to the first case of the master theorem as described here.
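A numeric sketch confirms this (the base case T(n) = 1 for n below the cutoff matches the question; the choice of n is illustrative):

```python
import math

def t(n):
    """T(n) = 4T(n/4) + 3*log2(n), with T(n) = 1 below the cutoff."""
    if n < 4:
        return 1
    return 4 * t(n // 4) + 3 * math.log2(n)

n = 4 ** 10   # a power of 4, so n // 4 is exact at every level
# The ratio T(n)/n stays bounded as n grows -> T(n) = Theta(n).
print(t(n) / n)
```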

How to convert time analysis to O(n)?

Rookie computer science student here, have a question I'm having some trouble answering.
I have a tree-traversal algorithm whose time performance is O(b·m), where b is the branching factor and m is the maximum depth of the tree. I was wondering how one takes this and converts it into standard asymptotic time analysis (i.e., O(n), O(n^2), etc.).
Same question for a different algorithm I have which is O(b^m).
I have gone through my textbook extensively and not found a clear answer about this. Asymptotic time analysis is usually stated in terms of the input size n, but I'm not sure what n would mean in this instance. I suppose it would be m?
In general, what do you do when you have multiple inputs?
Thank you for your time.
You should start by building a recurrence. For example, consider binary search. The recurrence is T(n) = T(n/2) + c. When you unroll it, you get
T(n) = T(n/2) + c
= T(n/4) + c + c
= T(n/8) + c + c + c
...
= T(n/2^k) + kc
The unrolling stops when n = 2^k, i.e. k = log_2(n). So the complexity is c·log_2(n) = O(log n).
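The unrolled recurrence matches what an actual binary search does: each iteration halves the range, so the step count is bounded by log_2(n) plus a constant. A minimal sketch (the function name and setup are just for illustration):

```python
import math

def binary_search_steps(arr, target):
    """Return the number of halving steps a binary search performs."""
    lo, hi, steps = 0, len(arr) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            break
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

n = 2 ** 16
steps = binary_search_steps(list(range(n)), n - 1)
# Each step halves the search range, so steps <= log2(n) + 1.
print(steps, math.ceil(math.log2(n)) + 1)
```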
Now, let us look at another situation where the input is divided into 5 parts, and the results combined in linear time. This recurrence will be
T(n) = 5T(n/5) + n
= 5^2T(n/5^2) + 2n
...
= 5^kT(n/5^k) + kn
This will stop when n = 5^k, i.e. k = log_5(n). Substituting above, the complexity is n·log_5(n) = O(n log n).
I guess you should be able to take it from here on.