I am trying to apply the Master's Theorem to a recurrence of this type:
T(n) = T(n/2) + 2^n
However, f(n) = 2^n doesn't seem to fit any of the three cases of the Master theorem, which all compare f(n) against powers of n rather than exponentials like 2^n. How can I solve a recurrence of this type? Could anyone please help? Thanks.
If none of the cases of the theorem applies, then the theorem won't solve your recurrence. It can't solve every single recurrence out there.
To address your issue: repeatedly substituting the recursive case gives T(n) = 2^n + 2^(n/2) + 2^(n/4) + ... + 2. There are about log n terms, and every term after the first is at most 2^(n/2), so the whole sum stays below 2^(n+1) for large n. Combined with the 2^n lower bound from the first term, T(n) is in Θ(2^n).
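A minimal numeric sketch of this (not a proof; the base case T(1) = 2 is an assumption here):

    # Compute T(n) = T(n/2) + 2^n with integer halving and compare to 2^n.
    def T(n):
        if n <= 1:
            return 2  # assumed base case
        return T(n // 2) + 2 ** n

    for n in [8, 16, 32, 64, 128]:
        print(n, T(n) / 2 ** n)  # ratio drops toward 1, consistent with Theta(2^n)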
Case 3 of the Master theorem does in fact apply here, with a = 1 and b = 2: f(n) = 2^n is Ω(n^(log_2(1) + ε)) = Ω(n^ε) for every ε > 0, and the regularity condition holds since a*f(n/b) = 2^(n/2) <= (1/2)*2^n for n >= 2. Case 3 then gives T(n) = Θ(f(n)) = Θ(2^n).
I know how to apply the Master theorem, and I managed to apply it for the best and average case:
T(n) = 2T(n/2) + Theta(n)
The worst-case recurrence is
T(n) = T(n-1) + Theta(n)
If I am correct, a is 1, b is n/(n-1), and f(n) is n.
But how do I choose the right case of the master theorem and get a worst-case time complexity of Theta(n^2)?
Thanks!
As @DavidEisenstat pointed out in the comments, the Master Theorem doesn’t apply to the recurrence you’ve come up with here.
To give some context as to why this is - the Master Theorem is specifically designed for the case where
the number of subproblems is a constant, and
the sizes of the subproblems decay geometrically.
In this case, that second requirement doesn’t hold, since your recurrence has the problem size decrease linearly (from n to n-1) rather than geometrically.
You are correct, however, that the recurrence solves to Θ(n^2). To see why, note that if you unroll the recurrence, you get that the cost is
Θ(n + (n-1) + (n-2) + ... + 2 + 1)
= Θ(n(n+1)/2)
= Θ(n^2).
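A quick numeric check of that sum (a sketch, assuming T(0) = 0):

    # T(n) = T(n-1) + n unrolls to n + (n-1) + ... + 1, which is n(n+1)/2.
    def T(n):
        total = 0
        for i in range(1, n + 1):  # iterative unrolling avoids recursion depth limits
            total += i
        return total

    for n in [10, 100, 1000]:
        print(n, T(n), n * (n + 1) // 2)  # the two values agree exactly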
Hope this helps!
I am quite frustrated over this.
In CLRS 3rd edition, page 95 (chapter 4.5), it mentions that recurrences like
T(n) = 2T(n/2) + n lg n
cannot be solved with the Master Theorem because the ratio
f(n)/n^(log_b(a)) = (n lg n)/n^1 = lg n
is not polynomially larger than 1 (lg n is asymptotically smaller than n^ε for every ε > 0).
But then I come across pages like this where, at the bottom of the page, it mentions the exact same recurrence and says that it CAN be solved with the Master Theorem because it falls into an "extended case 2", even though the ratio is non-polynomial. The solution becomes n lg^2 n (incrementing the log factor on f(n) by one).
Then I come across pages like this where example (e) seems like a clear application of Extended Case 2 (the recurrence is T(n) = 4T(n/2) + n^2 lg n), but then the solution is not n^2 log^2 n, but rather n^2 log n! Am I wrong or is the paper wrong?
Can anyone please clear up the contradictions and make it very clear exactly when Master Theorem can be used and when it cannot? When does the polynomial-difference check matter, and when does it not? Is the extended case 2 usable, or does it actually violate something?
EDIT:
I tried solving recurrence (e) directly from the second paper and I get:
T(n) = n^2 lg^2(n)/2 + n^2 lg(n)/2
Is this not Θ(n^2 lg^2 n)?
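A small sketch that checks this closed form numerically (assuming T(1) = 0 and n a power of two):

    from math import log2

    # Recurrence (e): T(n) = 4 T(n/2) + n^2 lg n.
    def T(n):
        if n == 1:
            return 0  # assumed base case
        return 4 * T(n // 2) + n * n * log2(n)

    for k in range(1, 11):
        n = 2 ** k
        closed = n * n * log2(n) ** 2 / 2 + n * n * log2(n) / 2
        print(n, T(n), closed)  # the columns match, and n^2 lg^2 n / 2 dominates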
The book states that it cannot be solved using Case 3:
even though it appears to have the proper form: ... You might mistakenly think that case 3 should apply
However, this recurrence can be solved using the Master theorem, (extended) case 2.
T(n) = 2T(n/2) + n lg n:
We define:
a = 2, b = 2, f(n) = n lg n
Using Master theorem case 2:
c = log_2(2) = 1
k = 1
And f(n) is indeed in Θ(n^c * lg^k n) = Θ(n lg n).
So, all conditions of Master theorem case 2 apply, and we can deduce:
T(n) is in Theta(n^c * log(n)^(k+1)) = Theta(n*log(n)^2)
Long story short, the Master theorem has 3 cases. Each case has its own prerequisites. Case 3 has more demanding prerequisites: f(n) must be polynomially larger than n^(log_b(a)), and the regularity condition a*f(n/b) <= c*f(n) must hold for some c < 1.
Since the prerequisites for case 3 do not hold for this formula (lg n is not Ω(n^ε) for any ε > 0), you cannot use case 3. However, the prerequisites of case 2 do apply, and you can use it.
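A quick numeric sketch of that conclusion (assuming T(1) = 0 and n a power of two):

    from math import log2

    # T(n) = 2 T(n/2) + n lg n should grow like n lg^2 n by extended case 2.
    def T(n):
        if n == 1:
            return 0  # assumed base case
        return 2 * T(n // 2) + n * log2(n)

    for k in [4, 8, 12, 16, 20]:
        n = 2 ** k
        print(n, T(n) / (n * log2(n) ** 2))  # ratio approaches a constant (1/2)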
How do you go about finding the asymptotic complexity from a recurrence for the running time? For example:
If the run time of a recursive algorithm is given as
T(n) = 2 T(n/2) + O(n)
considering the Master Theorem, what is its asymptotic complexity?
Could someone explain the steps on how to figure this out?
For the Master Theorem, there are 3 different cases to solve.
The first step is to understand which case applies.
For questions involving the Master Theorem, our general formula is T(n) = aT(n/b) + f(n).
So the first thing to do is compare n^(log_b(a)) with f(n).
Simply put: whichever is polynomially bigger dominates, and that is the complexity (these are cases 1 and 3; case 3 also needs the regularity condition). If they are equal, you multiply the result by lg n. For example, in a case like T(n) = 16T(n/4) + O(n^2), n^(log_b(a)) = n^(log_4(16)) = n^2 and f(n) = n^2, so they are equal and the answer is Θ(n^2 * lg n), by case 2 of the Master Theorem.
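Here is a minimal numeric sketch of that example (assuming T(1) = 1 and n a power of four):

    from math import log2

    # T(n) = 16 T(n/4) + n^2 should grow like n^2 lg n by case 2.
    def T(n):
        if n == 1:
            return 1  # assumed base case
        return 16 * T(n // 4) + n * n

    for k in [2, 4, 6, 8, 10]:
        n = 4 ** k
        print(n, T(n) / (n * n * log2(n)))  # ratio approaches a constant (1/2)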
What is the runtime of a recurrence T(n)=3T(2n/3)+1 and how did you get it?
This type of recurrence can be solved with the Master theorem. Here a = 3, b = 3/2 and f(n) = 1. Your c = log_1.5(3) ≈ 2.709, and because n^2.709 is polynomially bigger than f(n), you fall into the first case.
So the solution is Θ(n^(log_1.5(3))), i.e., about O(n^2.709).
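A rough numeric sketch of this (assumptions: T(1) = 1, integer floor in 2n/3):

    from functools import lru_cache
    from math import log

    # T(n) = 3 T(2n/3) + 1 should grow like n^(log_1.5(3)), about n^2.709.
    @lru_cache(maxsize=None)  # all three subcalls share one argument, so caching keeps this fast
    def T(n):
        if n <= 1:
            return 1  # assumed base case
        return 3 * T(2 * n // 3) + 1

    c = log(3) / log(1.5)
    for n in [10**2, 10**3, 10**4, 10**5]:
        print(n, T(n) / n ** c)  # ratio stays within a constant band (case 1 behavior)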
Use the Master Theorem. This is a lot easier than trying to solve the recurrence yourself, as you tried in your original question.
OTTOMH this should get you T(n) = Theta(n^2.7) (Case 1 of the Master Theorem).
I'm trying to solve a recurrence relation to find out the complexity of an algorithm I wrote. This is the equation..
T(n) = T(n-1) + Θ(n)
And I found the answer to be O(n^2), but I'm not sure if I did it right. Can someone please confirm?
Update: What if the equation is T(n) = T(n-1) + Θ(n log n)? Will it still be O(n^2)?
It is O(N)+O(N-1)+...+O(1) = O(N*(N+1)/2). So yes, the total complexity is quadratic.
Yes, you guessed it right.
However, the form of the recurrence doesn't fit the Master method. Since you have guessed the bound correctly, the substitution method is more suitable here.
Now your job is finding two constants c and n0 to prove that:
T(n) <= c*(n^2) for all n >= n0
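For instance, if the Θ(n) term is at most d*n, the inductive step goes like this (a sketch; c >= d with n0 = 1 is one choice that works):

    Assume T(n-1) <= c*(n-1)^2. Then
    T(n) <= c*(n-1)^2 + d*n
          = c*n^2 - 2*c*n + c + d*n
         <= c*n^2    (since d*n - 2*c*n <= -c*n when c >= d, and c - c*n <= 0 for n >= 1)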