asymptotic complexity based on running time? - asymptotic-complexity

How do you go about finding the asymptotic complexity based on a running time? For example:
If the run time of a recursive algorithm is given as
T(n) = 2 T(n/2) + O(n)
considering the Master Theorem, what is its asymptotic complexity?
Could someone explain the steps on how to figure this out?

The Master Theorem has three cases, so the first step is to work out which case applies.
For questions involving the Master Theorem, the general form is T(n) = aT(n/b) + f(n).
So the first thing to do is compare n^(log_b a) with f(n).
Whichever of the two is polynomially larger determines the complexity (these are Cases 1 and 3); if they are of the same order, you multiply that common term by lg n (this is Case 2). For example, for T(n) = 16 T(n/4) + O(n^2) we have n^(log_b a) = n^(log_4 16) = n^2 and f(n) = n^2, so they match and Case 2 gives Θ(n^2 * lg n).
Applying the same comparison to the recurrence in the question, T(n) = 2 T(n/2) + O(n): here n^(log_2 2) = n matches f(n) = n, so Case 2 again applies and T(n) = Θ(n lg n).
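As a rough illustration of this comparison, here is a small Python sketch for the common case where f(n) is a plain polynomial Theta(n^k); the helper name and the tolerance-based exponent comparison are illustrative assumptions, and extra log factors in f(n) would need the extended cases:

import math

def master_theorem(a, b, k):
    # Classify T(n) = a*T(n/b) + Theta(n^k) for constants a >= 1, b > 1, k >= 0.
    # Only handles purely polynomial driving functions f(n) = Theta(n^k).
    crit = math.log(a, b)                     # the critical exponent log_b(a)
    if abs(k - crit) < 1e-9:                  # Case 2: n^k and n^crit are the same order
        return f"Theta(n^{k:g} * lg n)"
    if k < crit:                              # Case 1: the recursion dominates
        return f"Theta(n^{crit:g})"
    return f"Theta(n^{k:g})"                  # Case 3: f(n) dominates (regularity holds for polynomials)

print(master_theorem(2, 2, 1))     # T(n) = 2 T(n/2) + O(n)    -> Theta(n^1 * lg n)
print(master_theorem(16, 4, 2))    # T(n) = 16 T(n/4) + O(n^2) -> Theta(n^2 * lg n)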

Related

Worst Case and Best Case Run-time Complexity of Recurrence Relation T(n) = 2T(n/2) + T(n-1) + constant

I was looking for the worst-case and best-case run-time analysis of the following recurrence relation:
T(n) = 2T(n/2) + T(n-1) + 1
I couldn't find strictly the same question on Stack Overflow or on the Web.
In this case, we have three branches, and we know that T(n/2) would reach the base case faster than T(n-1) would, so from my understanding, the longest leaf to root path represents the worst-case complexity and the shortest leaf to root path represents the best-case complexity.
As such, we have that the best case complexity would be:
T(n) = log(n) * T(1)
Assuming that T(1)=1, then we have best-case complexity
T(n) = O(logn)
If we look at the worst case complexity, we have
T(n) = n * T(1)
So, then we have (by assuming T(1)=1 again):
T(n) = O(n)
Could I be misunderstanding something here or is this timing analysis accurate for this recurrence relation?
Assuming that T(1)=1, then we have best-case complexity
You cannot simply plug in T(1) and call the result the best-case complexity, and in particular Big-O is the wrong notation for it. For a best-case bound such as
T(n) = O(logn)
to be stated correctly, you would use Ω(logn).
For the best-case complexity, one needs to study the behavior of the algorithm as the input size grows, and analyze whether any property of the algorithm can lead to different scenarios. For instance, searching in a BST can be constant time in the best-case scenario, but you still consider it on an input of size n, not the best-case scenario of a single element.
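For concreteness, a minimal sketch of that BST point (the Node class and helper name are illustrative assumptions): the best case is still stated for an input of size n, it just happens to finish after the first comparison.

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def bst_search(root, key):
    # Iterative BST lookup: the best case hits the root in one comparison,
    # the worst case walks an entire root-to-leaf path.
    node = root
    while node is not None:
        if key == node.key:
            return node
        node = node.left if key < node.key else node.right
    return None

root = Node(8, Node(3, Node(1), Node(6)), Node(10, None, Node(14)))
print(bst_search(root, 8) is root)   # True: found immediately, even though the tree has 6 nodes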
In your case, you do not have a concrete algorithm but rather a function (represented as a recurrence). Therefore, it does not make sense to talk about best- and worst-case scenarios.
In this case, we have three branches, and we know that T(n/2) would reach the base case faster than T(n-1) would, so from my understanding, the longest leaf to root path represents the worst-case complexity
When calculating the recurrence one should not only take into account the height of the recursion tree but also the number of branches. Therefore:
If we look at the worst case complexity, we have
T(n) = n * T(1)
your rationale is not correct.
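To see concretely why counting only the longest or shortest root-to-leaf path is not enough, here is a small Python sketch (taking T(1) = 1 and floor division for n/2 as assumptions) that evaluates the recurrence exactly. The ratio T(n)/n keeps growing, so the recurrence is not O(n), let alone O(logn):

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + T(n-1) + 1, with T(1) = 1 and floor division assumed
    if n <= 1:
        return 1
    return 2 * T(n // 2) + T(n - 1) + 1

for n in (8, 16, 32, 64, 128):
    print(n, T(n), T(n) / n)   # T(n)/n keeps increasing, so every branch matters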

Master theorem for worst case quicksort

I know how to calculate the master theorem and I managed to calculate it for best and average case.
T(n) = 2T(n/2) + Theta(n)
The worst case equation is
T(n) = T(n-1) + Theta(n)
If I am correct, a is 1, b is n/(n-1), and f(n) is n.
But how do I choose the right case of the master theorem and get a worst-case time complexity of Theta(n^2)?
Thanks!
As @DavidEisenstat pointed out in the comments, the Master Theorem doesn’t apply to the recurrence you’ve come up with here.
To give some context as to why this is - the Master Theorem is specifically designed for the case where
the number of subproblems is a constant, and
the sizes of the subproblems decays geometrically.
In this case, that second requirement doesn’t hold, since your recurrence has the problem size decay linearly rather than geometrically.
You are correct, however, that the recurrence solves to Θ(n^2). To see why, note that if you unroll the recurrence, you get that the cost is
Θ(n + (n-1) + (n-2) + ... + 2 + 1)
= Θ(n(n+1)/2)
= Θ(n^2).
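As a concrete illustration of that sum (the helper name and the ">= pivot" partition rule are illustrative assumptions): a toy quicksort that always picks the first element as the pivot performs exactly 1 + 2 + ... + (n-1) = n(n-1)/2 comparisons on already-sorted input.

def quicksort_comparisons(items):
    # Count the element-vs-pivot comparisons of a simple first-element-pivot quicksort.
    if len(items) <= 1:
        return 0
    pivot, rest = items[0], items[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    # len(rest) comparisons to partition, then recurse on both sides
    return len(rest) + quicksort_comparisons(left) + quicksort_comparisons(right)

for n in (50, 100, 200, 400):
    sorted_input = list(range(n))   # already-sorted input triggers the worst case
    print(n, quicksort_comparisons(sorted_input), n * (n - 1) // 2)   # the two counts match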
Hope this helps!

Time complexity of the expression

I am trying to find the time complexity of the following:
Algorithm C solves X by dividing a problem of size n into 2 subproblems of size n/2, recursively solving each sub problem, and then combining the solutions in O(n^3) time.
My try:
2 T(n/2) + O(n^3)
Can I write directly that the time complexity will be O(n^3)?
Using the Master Theorem we can solve the 2 T(n/2) part, which would result in O(n); then, as per the question, we add the O(n^3).
Thus,
O(n) + O(n^3)
As you have mentioned, the time complexity will be O(n^3), but be sure to actually solve the 2 T(n/2) part, since if that part had a greater complexity it would affect the final result.
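As a quick sanity check (taking T(1) = 1 and exact powers of two as assumptions): the 2 T(n/2) part contributes only Theta(n), which is dominated by the n^3 term, and evaluating the recurrence numerically shows T(n)/n^3 settling at a constant (around 4/3), i.e. T(n) = Theta(n^3).

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + n^3, with T(1) = 1 assumed as the base case
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n ** 3

for n in (2 ** k for k in range(4, 12)):
    print(n, T(n) / n ** 3)   # the ratio approaches 4/3, so the n^3 term dominates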

When can the Master Theorem actually be applied?

I am quite frustrated over this.
In CLRS 3rd edition, page 95 (chapter 4.5), it mentions that recurrences like
T(n) = 2T(n/2) + n lg n
cannot be solved with the Master Theorem because the ratio
f(n)/n^(log_b(a)) = (n lg n)/n^1 = lg n
is asymptotically smaller than n^ε for every ε > 0, i.e. f(n) is larger than n^(log_b a), but not polynomially larger.
But then I come across pages like this where, at the bottom of the page, it mentions the exact same recurrence and says that it CAN be solved with the Master Theorem because it falls into an "extended case 2", even though the ratio is non-polynomial. The solution becomes n lg^2 n (incrementing the log factor on f(n) by one).
Then I come across pages like this where in example (e) seems like a clear application of Extended Case 2 (the recurrence is T(n) = 4T(n/2) + n^2 lg n), but then the solution is not n^2 log^2 n, but rather n^2 log n! Am I wrong or is the paper wrong?
Can anyone please clear up the contradictions and make it very clear exactly when the Master Theorem can be used and when it cannot? When does the polynomial-difference check matter, and when does it not? Is the extended case 2 usable, or does it actually violate something?
EDIT:
I tried solving recurrence (e) directly from the second paper and I get:
T(n) = n^2 lg^2(n)/2 + n^2 lg(n)/2
Is this not Θ(n^2 lg^2 n)?
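A quick numeric check of that closed form (a small sketch, assuming T(1) = 0 and exact powers of two):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # recurrence (e): T(n) = 4*T(n/2) + n^2 * lg(n), with T(1) = 0 assumed
    if n <= 1:
        return 0
    return 4 * T(n // 2) + n ** 2 * math.log2(n)

for k in (4, 8, 12, 16):
    n = 2 ** k
    closed_form = n ** 2 * math.log2(n) ** 2 / 2 + n ** 2 * math.log2(n) / 2
    print(n, T(n), closed_form)   # the two columns agree, which is Theta(n^2 * lg^2 n)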
The book states that it cannot be solved using Case 3:
even though it appears to have the proper form: ... You might mistakenly think that case 3 should apply
However, this recurrence can be solved using the Master Theorem, Case 2 (in its extended form, which allows an extra lg^k n factor in f(n)).
T(n) = 2T(n/2) + n lg n:
We define:
a = 2, b = 2, f(n) = n lg n
Using Master Theorem Case 2 (extended form):
c = log_2(2) = 1
k = 1
And f(n) is indeed in Theta(n^c * log^k(n)) = Theta(n log n).
So all conditions of Master Theorem Case 2 apply, and we can deduce:
T(n) is in Theta(n^c * log(n)^(k+1)) = Theta(n*log(n)^2)
Long story short, the Master Theorem has three cases, and each case has its own prerequisites. Case 3 has the most demanding prerequisites: f(n) must be polynomially larger than n^(log_b a), and it must also satisfy the regularity condition.
Since the prerequisites for Case 3 do not hold for this formula, you cannot use Case 3. However, the prerequisites of Case 2 do hold, and you can use it.
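To see where the extra log factor comes from, here is a small numeric check (taking T(1) = 0 and exact powers of two as assumptions): evaluating the recurrence and dividing by n * lg^2(n) gives a ratio that settles near 1/2, consistent with Theta(n*log(n)^2) rather than Theta(n*log(n)).

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + n*lg(n), with T(1) = 0 assumed as the base case
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n * math.log2(n)

for n in (2 ** k for k in range(5, 16)):
    print(n, T(n) / (n * math.log2(n) ** 2))   # the ratio approaches 1/2, i.e. Theta(n * lg^2 n)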

Applying Master's Theorem with f(n) = 2^n

I am trying to apply the Master's Theorem to a recurrence of this type:
T(n) = T(n/2) + 2^n
However, f(n) = 2^n doesn't seem to fit any of the three cases described in the master's theorem, which all seem to have base n instead of base 2. How can I solve a recurrence of this type, could anyone please help ? Thanks.
If none of the cases of the theorem applies, then the theorem won't solve your recurrence. It can't solve every single recurrence out there.
To address your issue: what you get by repeatedly substituting the recursive case is T(n) = 2^n + 2^(n/2) + 2^(n/4) + ... + 2, and since there are log n many terms to add up, you end up with something below 2^(n+1), so in total you're in Θ(2^n).
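A quick numeric check of that substitution (taking T(1) = 2 and floor division for n/2 as assumptions): the sum is dominated by its first term, so T(n)/2^n stays close to 1.

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/2) + 2^n, with T(1) = 2 assumed as the base case
    if n <= 1:
        return 2
    return T(n // 2) + 2 ** n

for n in (8, 16, 32, 64):
    print(n, T(n) / 2 ** n)   # the ratio stays close to 1, so T(n) = Theta(2^n)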
We can take the log of both sides and solve; it will fall into Case 3 of the Master Theorem.
