I'm quite new to algorithms, and I encountered a question where I don't know how to apply the Master Theorem:
We have an algorithm A, which solves P by creating 3 subproblems, each of size 2n/3, recursively solving each subproblem, and then combining the solutions in O(n log n) time. Would this algorithm have a better running time than O(n)? Prove your answer.
What I know here is a=3, b=3/2, but how can I deal with O(n log n)?
Hence, the recurrence is T(n) = 3T(2n/3) + O(n\log n). The critical exponent is \log_{1.5} 3 = \log 3 / \log 1.5 \approx 2.71, and f(n) = O(n\log n) = O(n^{\log_{1.5} 3 - \epsilon}) for some \epsilon > 0, so case 1 of the Master Theorem gives T(n) = \Theta(n^{\log_{1.5} 3}) \approx \Theta(n^{2.71}).
Therefore T(n) = \Omega(n^{2.71}), which grows strictly faster than n, so the algorithm does not have a better running time than O(n).
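As a numeric sanity check (my own sketch, not part of the proof; the base case T(n) = 1 for n < 2 is an assumption, since the question leaves it open), one can unroll the recurrence and compare it against n^{log_{1.5} 3}:

```python
import math

def T(n):
    # Unroll T(n) = 3*T(2n/3) + n*log(n); base case T(n) = 1 for n < 2
    # is an assumption, since the question does not specify one.
    if n < 2:
        return 1.0
    # All three subproblems have the same size, so computing one child
    # and multiplying by 3 is enough.
    return 3 * T(2 * n / 3) + n * math.log(n)

p = math.log(3) / math.log(1.5)  # critical exponent log_{1.5} 3 ~ 2.71

for n in (10 ** 3, 10 ** 4, 10 ** 5, 10 ** 6):
    # If T(n) = Theta(n^p), this ratio should roughly level off.
    print(n, T(n) / n ** p)
```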
Suppose we have been given the recurrence relation T(n) = 2T(n/3) + n and need to find its time complexity.
My problem is that I get different answers from the Master Theorem and from the recursion tree method.
By the Master Theorem for a recurrence of the form T(n) = aT(n/b) + O(n^k log^p n):
Here a = 2, b = 3, p = 0, k = 1, and since a < b^k (i.e., 2 < 3^1), the Master Theorem gives T(n) = O(n).
Here is my recursion tree method
And at the end I seriously don't know what I found. It looks something like O(n · 2^log n).
Surely that's not the answer, and I have messed it up somewhere. But I don't see where I went wrong. What is the right approach?
Also, I wanted to ask: are the recurrences T(n) = 2T(n/3) + n and T(n) = 2T(n/3) + C the same, where C is a constant in the second equation?
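A sketch of how the level sums should come out (assuming, as usual, a constant-cost base case): level i of the tree has 2^i nodes of size n/3^i, so the whole level costs n·(2/3)^i, a decreasing geometric series, and the root level dominates, giving T(n) = O(n), in agreement with the Master Theorem. As for the second question: the two recurrences are not the same. With +C the level cost is C·2^i, which grows, so the leaves dominate and T(n) = Θ(n^(log_3 2)).

```python
def level_costs(n):
    # Recursion tree of T(n) = 2T(n/3) + n: level i has 2^i nodes,
    # each holding a subproblem of size n/3^i.
    costs, size, nodes = [], float(n), 1
    while size >= 1:
        costs.append(nodes * size)  # total non-recursive work on this level
        size /= 3                   # subproblem size shrinks by 3
        nodes *= 2                  # node count doubles
    return costs

n = 3 ** 12
costs = level_costs(n)
# The levels shrink geometrically by a factor of 2/3, so the total
# stays below the geometric-series bound 3n, confirming T(n) = O(n).
print(sum(costs), 3 * n)
```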
I have to solve this recurrence relation with the tree method, because the Master Theorem does not apply:
T(n) = (2+1/log n) T(n/2)
After some thought I cannot come up with an exact solution. The Master Theorem does not work here, and unrolling the tree has not given me anything reasonable, so I will just estimate the complexity in the following way.
For any reasonably big n you can estimate 0 < 1/log n < 1. So you can sandwich T between:
T1(n) = 2 * T1(n/2)
T2(n) = 3 * T2(n/2)
with T1(n) ≤ T(n) ≤ T2(n). You can find the complexity of both recurrences using the Master Theorem: T1 is Θ(n) and T2 is Θ(n^log2(3)).
So you can be sure that the complexity of your recurrence is at least Ω(n) and at most O(n^log2(3)) ≈ O(n^1.58), hence subquadratic.
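To see the sandwich numerically, here is a quick sketch (the base case T(2) = 1 and reading log as log base 2 are my assumptions):

```python
import math

def T(n):
    # Unroll T(n) = (2 + 1/log n) * T(n/2), reading log as log base 2.
    if n <= 2:
        return 1.0  # assumed base case
    return (2 + 1 / math.log2(n)) * T(n / 2)

for n in (2 ** 10, 2 ** 15, 2 ** 20, 2 ** 25):
    # T(n)/n keeps growing (T is superlinear), while T(n)/n^{log2 3}
    # keeps shrinking, matching the bounds given by T1 and T2.
    print(n, T(n) / n, T(n) / n ** math.log2(3))
```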
I know that we can apply the Master Theorem to find the running time of a divide-and-conquer algorithm when the recurrence relation has the form:
T(n) = a*T(n/b) + f(n)
We know the following:
a is the number of subproblems into which the algorithm divides the original problem,
b is the shrinkage factor, i.e., each subproblem has size n/b,
and finally, f(n) encompasses the cost of dividing the problem and combining the results of the subproblems.
We then find something (I will come back to the term "something")
and we have 3 cases to check.
Case 1: f(n) = O(n^(log_b a - ε)) for some ε > 0; then T(n) = Θ(n^(log_b a)).
Case 2: f(n) = Θ(n^(log_b a)); then T(n) = Θ(n^(log_b a) · log n).
Case 3: n^(log_b a + ε) = O(f(n)) for some constant ε > 0, and a·f(n/b) ≤ c·f(n) for some constant
c < 1 and almost all n; then T(n) = Θ(f(n)).
All fine. Now, recalling the term "something": how can we use general examples (i.e., with variables rather than actual numbers) to decide which case the algorithm is in?
For instance, consider the following:
T(n) = 8T(n/2) + n
So a = 8, b = 2 and f(n) = n
How do I proceed then? How can I decide which case applies? And since f(n) is given in big-O notation, how are these two things comparable?
The above is just an example to show where I get stuck; the question is meant in general.
Thanks
As CLRS suggests, the basic idea is comparing f(n) with n^(log_b a), i.e., n to the power (log a to the base b). In your hypothetical example, we have:
f(n) = n
n^(log_b a) = n^(log_2 8) = n^3, i.e., n cubed, as your recurrence yields 8 subproblems of half the size at every step.
Thus, in this case, n^(log_b a) is the larger one: f(n) = n = O(n^(3-ε)) (take ε = 2, say), so case 1 applies and the solution is T(n) = Θ(n^3).
Clearly, the number of subproblems vastly outpaces the work (linear, f(n) = n) you do per subproblem. Intuition suggests, and the Master Theorem verifies, that n^(log_b a) dominates the recurrence.
There is a subtle technicality: the Master Theorem requires that f(n) be smaller than n^(log_b a) not merely in the big-O sense, but polynomially smaller, i.e., by a factor of n^ε for some ε > 0.
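If it helps to make the comparison mechanical, here is a small sketch (mine, not from CLRS) that classifies T(n) = a·T(n/b) + Θ(n^k) by comparing k to log_b a; it covers only pure power functions f(n) = n^k, for which the case-3 regularity condition holds automatically:

```python
import math

def master_case(a, b, k):
    # Classify T(n) = a*T(n/b) + Theta(n^k) by comparing the exponent k
    # with the critical exponent log_b(a).
    crit = math.log(a, b)
    if math.isclose(k, crit):
        return f"case 2: T(n) = Theta(n^{crit:g} * log n)"
    if k < crit:
        return f"case 1: T(n) = Theta(n^{crit:g})"
    return f"case 3: T(n) = Theta(n^{k})"

print(master_case(8, 2, 1))  # the example above: Theta(n^3)
print(master_case(2, 2, 1))  # merge-sort-like: Theta(n * log n)
print(master_case(1, 2, 1))  # T(n) = T(n/2) + n: Theta(n)
```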
I'm really getting frustrated with the recurrence below. I tried to solve it using the Master Method, but I just couldn't get it done...
I have a recursive algorithm that takes 3·log n time (three binary searches) to identify four subproblems, each of size n/4, and then solves them recursively until n is smaller than some constant given by input. So I got this recurrence as a result:
T(n) = 4*T(n/4) + 3*log(n)
Base case, if n < c (c = some constant given by program input):
T(n) = 1
I'm trying to find the asymptotic running time of my recursive program and wanted to solve it using the Master Theorem. Can anybody tell me whether the Master Theorem applies to this recurrence, and if so, which case it is?
All help is appreciated, thanks.
T(n) = Θ(n), because log_4 4 = 1 and 3·log(n) = O(n^0.5) with 0.5 < 1. This corresponds to the first case of the Master Theorem, as described here.
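A quick numeric check of that claim (a sketch; I use the base case T(n) = 1 for n < 4 in place of the constant c from the question):

```python
import math

def T(n):
    # Unroll T(n) = 4*T(n/4) + 3*log(n); the base case T(n) = 1 for
    # n < 4 stands in for the constant c from the question.
    if n < 4:
        return 1.0
    # All four subproblems are the same size, so one call suffices.
    return 4 * T(n / 4) + 3 * math.log(n)

for n in (4 ** 5, 4 ** 8, 4 ** 11):
    # T(n)/n settles toward a constant, consistent with Theta(n).
    print(n, T(n) / n)
```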
T(n) = 4T(n/2) + n
= O(n^2) using the master theorem.
Is the above more complex than the one below?
T(n) = 3T(n/4) + n^2
Both are O(n^2) using the master theorem, but I do not know how to compare the hidden constants.
Hint: Easier question: which one has higher complexity, 4N^2 or 5N^2?
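In the same spirit as the hint, one can unroll both recurrences and watch where the hidden constants settle (a sketch; the base case T(1) = 1 is an assumption):

```python
def T1(n):
    # T(n) = 4*T(n/2) + n -> Theta(n^2) by case 1 of the master theorem.
    if n <= 1:
        return 1.0  # assumed base case T(1) = 1
    return 4 * T1(n / 2) + n

def T2(n):
    # T(n) = 3*T(n/4) + n^2 -> Theta(n^2) by case 3.
    if n <= 1:
        return 1.0
    return 3 * T2(n / 4) + n ** 2

for n in (2 ** 8, 2 ** 12, 2 ** 16):
    # Both ratios flatten out, but at different constants -- the same
    # situation as comparing 4N^2 with 5N^2.
    print(n, T1(n) / n ** 2, T2(n) / n ** 2)
```

With this base case the first ratio settles near 2 and the second near 16/13, so neither recurrence is asymptotically more complex than the other; they differ only in the constant factor hidden by the O(n^2).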