I know how to apply the Master Theorem, and I managed to apply it for the best and average case.
T(n) = 2T(n/2) + Theta(n)
The worst case equation is
T(n) = T(n-1) + Theta(n)
If I am correct, a is 1, b is n/(n-1), and f(n) is n.
But how do I choose the right case of the Master Theorem and get a worst-case time complexity of Theta(n^2)?
Thanks!
As @DavidEisenstat pointed out in the comments, the Master Theorem doesn’t apply to the recurrence you’ve come up with here.
To give some context as to why this is - the Master Theorem is specifically designed for the case where
the number of subproblems is a constant, and
the sizes of the subproblems decay geometrically.
In this case, that second requirement doesn’t hold, since your recurrence has the problem size decay linearly rather than geometrically.
You are correct, however, that the recurrence solves to Θ(n^2). To see why, note that if you unroll the recurrence, you get that the cost is
Θ(n + (n-1) + (n-2) + ... + 2 + 1)
= Θ(n(n+1)/2)
= Θ(n^2).
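If you want to convince yourself numerically, here is a tiny sketch (Python; I use exactly n for the Θ(n) term and T(1) = 1, which are illustrative choices rather than part of your problem):

    def T(n):
        # Unrolled cost of T(n) = T(n-1) + n with T(1) = 1; the "+ n" and the base
        # case 1 stand in for the Theta(n) work and the base cost.
        return sum(range(1, n + 1))

    for n in (10, 100, 1000):
        print(n, T(n), n * (n + 1) // 2)   # the last two columns always match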
Hope this helps!
I was looking for the worst-case and best-case run-time analysis of the following recurrence relation:
T(n) = 2T(n/2) + T(n-1) + 1
I couldn't find strictly the same question on Stack Overflow or on the Web.
In this case, we have three branches, and we know that T(n/2) would reach the base case faster than T(n-1) would, so from my understanding, the longest leaf to root path represents the worst-case complexity and the shortest leaf to root path represents the best-case complexity.
As such, we have that the best case complexity would be:
T(n) = log(n) * T(1)
Assuming that T(1)=1, then we have best-case complexity
T(n) = O(logn)
If we look at the worst case complexity, we have
T(n) = n * T(1)
So, then we have (by assuming T(1)=1 again):
T(n) = O(n)
Could I be misunderstanding something here or is this timing analysis accurate for this recurrence relation?
Assuming that T(1)=1, then we have best-case complexity
You cannot simply plug in T(1) and call the result the best-case complexity. In particular, using Big-O notation to state it, as in
T(n) = O(log n)
is the wrong direction; to be correct you would use Ω(log n).
For the best-case complexity, one needs to study the behavior of the algorithm as the input size grows, and check whether some property of the input can lead to different scenarios. For instance, searching in a BST can take constant time in the best case, but that best case is still stated for an input of size n, not for the trivial input consisting of a single element.
In your case, you do not have a concrete algorithm but rather a function (given as a recurrence). Therefore, it does not make sense to talk about best- and worst-case scenarios.
In this case, we have three branches, and we know that T(n/2) would reach the base case faster than T(n-1) would, so from my understanding, the longest leaf to root path represents the worst-case complexity
When solving the recurrence, one should take into account not only the height of the recursion tree but also the number of branches. Therefore:
If we look at the worst case complexity, we have
T(n) = n * T(1)
your rationale is not correct.
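To see concretely why counting only the shortest or longest root-to-leaf path is not enough, here is a rough sketch (Python) that simply evaluates the recurrence, assuming T(1) = 1 and floor division for n/2, both of which are my own illustrative choices:

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 2*T(n/2) + T(n-1) + 1, with T(1) = 1 and floor division (assumptions)
        if n <= 1:
            return 1
        return 2 * T(n // 2) + T(n - 1) + 1

    for n in (4, 8, 16, 32, 64):
        print(n, T(n))   # already far larger than n, so O(n) cannot be the worst case

The printed values grow far faster than n (and, of course, than log n), because every level of the tree contributes work from all of its branches, not just from one path.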
I am quite frustrated over this.
In CLRS 3rd edition, page 95 (chapter 4.5), it mentions that recurrences like
T(n) = 2T(n/2) + n lg n
cannot be solved with the Master Theorem because the ratio
f(n)/n^(log_b(a)) = (n lg n)/n^1 = lg n
is asymptotically smaller than n^ε for every constant ε > 0, i.e. the gap is not polynomial.
But then I come across pages like this where, at the bottom of the page, it mentions the exact same recurrence and says that it CAN be solved with the Master Theorem because it falls into an "extended case 2", even though the gap is non-polynomial. The solution becomes n lg^2 n (the exponent of the log factor in f(n) is incremented by one).
Then I come across pages like this where example (e) seems like a clear application of Extended Case 2 (the recurrence is T(n) = 4T(n/2) + n^2 lg n), but then the solution is not n^2 log^2 n, but rather n^2 log n! Am I wrong or is the paper wrong?
Can anyone please clear up the contradictions and make it very clear exactly when Master Theorem can be used and when it cannot? When does the polynomial-difference check matter, and when does it not? Is the extended case 2 usable, or does it actually violate something?
EDIT:
I tried solving recurrence (e) directly from the second paper and I get:
T(n) = n^2 lg^2(n)/2 + n^2 lg(n)/2
Is this not Θ(n^2 lg^2 n)?
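A quick numeric check of that closed form (Python; I assume T(1) = 0 and restrict n to powers of two, neither of which is stated in the paper):

    from math import log2

    def T(n):
        # T(n) = 4*T(n/2) + n^2 * lg(n), with T(1) = 0 and n a power of two (my choices)
        if n == 1:
            return 0
        return 4 * T(n // 2) + n * n * log2(n)

    for k in range(1, 11):
        n = 2 ** k
        closed = (n * n * log2(n) ** 2) / 2 + (n * n * log2(n)) / 2
        print(n, T(n), closed)   # the recurrence and the closed form agree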
The book states that it cannot be solved using Case 3:
even though it appears to have the proper form: ... You might mistakenly think that case 3 should apply
However, this recurrence can be solved using the Master Theorem, case 2 (in its extended form).
T(n) = 2T(n/2) + n lg n:
We define:
a = 2, b = 2, f(n) = n lg n
Using the Master Theorem's extended case 2:
c = log_2(2) = 1
k = 1
And f(n) is indeed in Theta(n^c * lg^k(n)) = Theta(n lg n).
So, all conditions to master theorem case 2 apply, and we can deduce:
T(n) is in Theta(n^c * log(n)^(k+1)) = Theta(n*log(n)^2)
Long story short, the Master Theorem has 3 cases, and each case has its own prerequisites. Case 3 has the more demanding prerequisites: f(n) must be polynomially larger than n^(log_b(a)), and it must also satisfy the regularity condition.
Since f(n) = n lg n is larger than n but not polynomially larger, the prerequisites for case 3 do not hold and you cannot use case 3. However, the prerequisites of (extended) case 2 do hold, and you can use it.
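If you want a quick numeric sanity check of that conclusion, here is a small sketch (Python; I assume T(1) = 0 and n a power of two, purely for illustration): the ratio T(n) / (n lg^2 n) settles toward a constant (about 1/2), consistent with Theta(n lg^2 n).

    from math import log2

    def T(n):
        # T(n) = 2*T(n/2) + n*lg(n), with T(1) = 0 and n a power of two (assumptions)
        if n == 1:
            return 0
        return 2 * T(n // 2) + n * log2(n)

    for k in (5, 10, 15, 20):
        n = 2 ** k
        print(n, T(n) / (n * log2(n) ** 2))   # approaches roughly 0.5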
I am trying to apply the Master Theorem to a recurrence of this type:
T(n) = T(n/2) + 2^n
However, f(n) = 2^n doesn't seem to fit any of the three cases described in the Master Theorem, which all compare f(n) against polynomials in n rather than an exponential like 2^n. How can I solve a recurrence of this type? Could anyone please help? Thanks.
If none of the cases of the theorem applies, then the theorem won't solve your recurrence. It can't solve every single recurrence out there.
To address your issue: what you get by repeatedly substituting the recursive case is T(n) = 2^n + 2^(n/2) + 2^(n/4) + ... + 2, and since there are log n many terms to add up, you end up with something below 2^(n+1), so in total you're in Θ(2^n).
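If it helps, here is a minimal sketch (Python; T(1) = 2 and floor division for n/2 are my own assumptions) showing the same thing numerically: the ratio T(n)/2^n approaches 1, because all the later terms are negligible next to the first one.

    def T(n):
        # T(n) = T(n/2) + 2^n, with T(1) = 2 and floor division for n/2 (assumptions)
        if n <= 1:
            return 2
        return T(n // 2) + 2 ** n

    for n in (8, 16, 32, 64):
        print(n, T(n) / 2 ** n)   # ratio approaches 1, i.e. T(n) = Theta(2^n)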
Alternatively, this recurrence does fit case 3 of the Master Theorem: n^(log_2(1)) = n^0 = 1, f(n) = 2^n dominates it polynomially, and the regularity condition f(n/2) = 2^(n/2) <= c * 2^n holds, so T(n) = Θ(2^n).
How do you go about finding the asymptotic complexity based on a running-time recurrence? For example:
If the run time of a recursive algorithm is given as
T(n) = 2 T(n/2) + O(n)
considering the Master Theorem, what is its asymptotic complexity?
Could someone explain the steps on how to figure this out?
For the Master Theorem, there are 3 different cases to solve.
First step is to understand which case applies.
For questions involving the Master Theorem, our general formula is T(n) = aT(n/b) + f(n)
So the first thing to do is compare n^(log_b(a)) with f(n).
Whichever of the two is polynomially larger dominates and gives the complexity (these are cases 1 and 3; for case 3, f(n) must also satisfy the regularity condition). If they are of the same order, you multiply that order by lg n (case 2).
For example, for T(n) = 16T(n/4) + O(n^2), n^(log_b(a)) = n^(log_4(16)) = n^2 and f(n) = n^2 are of the same order, so the answer is Θ(n^2 * lg n) by case 2 of the Master Theorem.
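As a rough check of that worked example, here is a small sketch (Python; I take the O(n^2) term to be exactly n^2, set T(1) = 1, and keep n a power of four, all illustrative assumptions): the ratio T(n) / (n^2 * log_4(n)) approaches a constant, which is what case 2 predicts.

    from math import log

    def T(n):
        # T(n) = 16*T(n/4) + n^2, with T(1) = 1 and n a power of four (assumptions)
        if n <= 1:
            return 1
        return 16 * T(n // 4) + n * n

    for k in (2, 5, 8, 10):
        n = 4 ** k
        print(n, T(n) / (n * n * log(n, 4)))   # ratio tends toward 1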
T(1) = c
T(n) = T(n/2) + dn
How would I determine the Big-O of this quickly?
Use repeated backsubstitution and find the pattern. An example here.
I'm not entirely sure what dn is, but assuming you mean a constant multiplied by n:
According to Wolfram Alpha, the recurrence equation solution for:
f(n) = f(n / 2) + cn
is:
f(n) = 2c(n - 1) + c1
which would make this O(n).
Well, the recurrence part of the relationship is the T(n/2) part, which is in effect halving the value of n each time.
Thus you will need approximately log2 n steps to reach the termination condition, but the dn part is not a constant-time operation per step: it contributes dn at the first step, dn/2 at the next, and so on. That geometric series sums to about 2dn, so the overall cost is O(n).
Note that as stated, the problem won't necessarily terminate since halving an arbitrary value of n repeatedly is unlikely to exactly hit 1. I suspect that the T(n/2) part should actually read T(floor (n / 2)) or something like that in order to ensure that this terminates.
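Here is a minimal sketch of that geometric sum (Python; d = 1, c = 1, and floor division are illustrative choices): the total stays below roughly 2dn, consistent with O(n).

    def T(n, d=1, c=1):
        # T(n) = T(n/2) + d*n, with T(1) = c and floor division for n/2 (assumptions)
        if n <= 1:
            return c
        return T(n // 2) + d * n

    for n in (10, 100, 1000, 10000):
        print(n, T(n), 2 * n)   # with d = 1, T(n) stays below roughly 2*d*n = 2*n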
Use the Master Theorem; see http://en.wikipedia.org/wiki/Master_theorem
By the way, the asymptotic behaviour of your recurrence is O(n), assuming d is a positive constant that does not grow with n (the problem size).