Assume an arbitrary r in the recurrence
T(n) <= cn + T(n/r) + T(3n/4)
and show that T(n) <= Dcn for some constant D. Then, by reworking the induction proof, use the resulting expression to argue that T(n) <= Dcn does not hold for r = 3.
Have a look at the Akra-Bazzi theorem. This is a generalization of the master theorem that does not require subproblems of equal size.
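For intuition, here is a small numeric sketch of my own (not from the original answer): Akra-Bazzi ties the growth of T to the exponent p defined by (1/r)^p + (3/4)^p = 1, and a linear bound is only possible when p <= 1, i.e. when 1/r + 3/4 <= 1, which fails for r = 3.

```python
# Numeric sketch (my own): solve (1/r)^p + (3/4)^p = 1 for p by bisection.
# For T(n) = T(n/r) + T(3n/4) + cn, a bound T(n) = O(n) needs p <= 1,
# which holds exactly when 1/r + 3/4 <= 1, i.e. r >= 4.

def akra_bazzi_p(r, lo=0.0, hi=10.0, iters=100):
    """Find p with (1/r)^p + (3/4)^p = 1; the left side is decreasing in p."""
    f = lambda p: (1 / r) ** p + (3 / 4) ** p - 1
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid  # sum still above 1, so p must be larger
        else:
            hi = mid
    return (lo + hi) / 2

print(akra_bazzi_p(5))  # ~0.91 < 1, so T(n) = Theta(n) for r = 5
print(akra_bazzi_p(4))  # = 1 exactly, since 1/4 + 3/4 = 1
print(akra_bazzi_p(3))  # ~1.15 > 1, so T(n) grows faster than linearly
```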
Often in CLRS, when proving recurrences via substitution, Θ(f(n)) is replaced with cf(n).
For example, on page 91, the recurrence
T(n) = 3T(⌊n/4⌋) + Θ(n^2)
is written like so in the proof:
T(n) <= 3T(⌊n/4⌋) + cn^2
But couldn't Θ(n^2) stand for, say, cn^2 + n? Would that not make such a proof invalid? Further on in the proof, the statement
T(n) <= (3/16)dn^2 + cn^2
<= dn^2
is reached. But if cn^2 + n were used instead, it would instead be
T(n) <= (3/16)dn^2 + cn^2 + n
Can it still be proven that T(n) <= dn^2 if this is so? Do such lower order terms not matter in proving recurrences via substitution?
Yes, such lower-order terms do not matter.
T(n) <= (3/16)dn^2 + cn^2 + n is still at most dn^2 once n is large enough, because both sides grow at the same rate (n^2), so a lower-order term can never break the bound, as long as the cost function contains only a constant number of lower-order terms. If their number grows with n, that is a different story.
Edit: for large enough n you can find suitable d and c so that (3/16)dn^2 + cn^2 + n <= dn^2. For example, with c = 1 and d = 2 the inequality becomes (11/8)n^2 + n <= 2n^2, i.e. n <= (5/8)n^2, which holds for all n >= 2.
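To make this concrete, here is a small sanity check of my own (not from CLRS): it iterates the recurrence with the extra lower-order term n, assuming the hidden constant is c = 1 and an arbitrary constant base case, and watches the ratio T(n)/n^2, which should stay bounded below d = 2.

```python
# Numeric sketch (my own): T(n) = 3*T(n//4) + c*n^2 + n with c = 1.
from functools import lru_cache

C = 1  # assumed hidden constant in Theta(n^2)

@lru_cache(maxsize=None)
def T(n):
    if n < 4:
        return 1  # arbitrary constant base case
    return 3 * T(n // 4) + C * n * n + n

for n in [10, 100, 10_000, 1_000_000]:
    print(n, T(n) / n**2)
# The ratio settles near 16/13 ~ 1.23, comfortably below d = 2, so the
# lower-order term n never endangers the T(n) <= d*n^2 bound.
```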
I am reading the book "Introduction to Algorithms", and one of its problems asks to derive the lower and upper bounds on T(n) for the following recurrence:
T(n) = 36T(n/6) + 2n
It also says to assume that T(n) is constant for n <= 2. Can anyone explain that last sentence to me? What does it really mean?
assume that: T(n) is constant for n <= 2
It means T(0) = c_0 and T(1) = c_1 are constants that do not depend on n, to signal that you should not worry about them and can treat them as O(1) in the time complexity. If they were not constant, then, since they are the basis of the recurrence, you could not use the master theorem to analyze the complexity.
Suppose, for example, that in practice n is on the order of 10^20, but T(0) = 10^50 and T(1) = 10^60. The base cases then dominate everything else, and the complexity you would compute with the master theorem tells you nothing at that scale.
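Here is a minimal sketch of my own that illustrates the point. With log_6(36) = 2 and f(n) = 2n = O(n^(2 - eps)), case 1 of the master theorem gives T(n) = Theta(n^2); the base-case values below are assumptions for the demonstration.

```python
# Sketch (my own): compare the recurrence with a small constant base case
# against one with a pathologically huge base case.
from functools import lru_cache

def make_T(base):
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 2:
            return base  # the "T(n) is constant for n <= 2" assumption
        return 36 * T(n // 6) + 2 * n
    return T

T_small = make_T(1)      # ordinary constant base case
T_huge = make_T(10**50)  # huge base case, as in the example above

for n in [6**4, 6**6, 6**8]:
    print(n, T_small(n) / n**2, T_huge(n) / n**2)
# With base = 1 the ratio T(n)/n^2 settles near a small constant, matching
# Theta(n^2); with base = 10^50 the "constant" is about 10^50, so the
# asymptotic bound is useless at any realistic input size.
```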
For the following recurrence I have to find the Big-O complexity:
T(n) = 2T(n/2) + cn
T(1)=c1
I guessed that T(n) = O(n).
Induction hypothesis: T(n) <= ak for all k < n.
T(n) = 2T(n/2) + c*n
T(n) <= 2(a*(n/2)) + c*n
<= an + cn
= O(n)
I find it completely correct, but my TA graded it 0. Where do you think I went wrong?
The so-called Master Theorem, case 2, states that
T(n) = Theta(n log n)
for your example. Furthermore, if T(n) = O(n) were true, merge sort (whose running time satisfies the above recurrence) would run in linear time, which is not the case.
Concerning your argument, apparently you state that there exists a constant a such that
T(n) <= a*n
holds. Consequently, the induction hypothesis should be as follows.
T(k) <= a*k for each k < n
But even if this is assumed, the induction step proves that
T(n) <= (a+c)*n
holds; however, this does not prove the desired property as it does not prove that
T(n) <= a*n
holds.
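To see concretely why no constant a can work, here is a small numeric check of my own: computing T(n) = 2T(n/2) + cn exactly for powers of two shows that T(n)/n grows like log2(n), so it eventually exceeds any fixed a.

```python
# Numeric sketch (my own): T(n)/n is unbounded, so T(n) <= a*n fails for
# every constant a. Here c = 1 and T(1) = 1 are assumed values.
def T(n, c=1, c1=1):
    if n == 1:
        return c1
    return 2 * T(n // 2, c, c1) + c * n

for k in [4, 8, 12, 16, 20]:
    n = 2 ** k
    print(n, T(n) / n)  # prints exactly k + 1, i.e. log2(n) + 1
```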
For the recurrence relation T(n) = T(n/4) + T(3n/4) + c, I am confused about how best-case and worst-case analysis relate to it: we have to solve both subproblems, of size n/4 and 3n/4, so what do "worst case" and "best case" even mean here?
Moreover, should we use Θ(log n) or O(log n)? Looking at the link below, O(log n) seemed more applicable, but I still could not see why we are not using Θ(log n) here.
How to solve the recursive complexity T(n) = T(n/4)+T(3n/4)+cn
T(n) = T(n/4) + T(3n/4) + CONST <= 2T(3n/4) + CONST
We will use case 1 of master theorem with:
a = 2, b = 4/3.
c = log_{4/3}(2) ~= 0.4
CONST is in O(n^0.4)
Thus, from the master theorem, one can derive that 2T(3n/4) + CONST is in Theta(logn), and since T(n) <= 2T(3n/4) + CONST, we can say that T(n) is in O(logn).
By following the same idea, but with lower bound:
T(n) >= T(3n/4) + CONST ...
And using master theorem again, we can tell that T(n) is also in Omega(logn).
Since T(n) is both O(logn) and Omega(logn), it is also Theta(logn).
As for your question, you can use either big-O or Theta notation, whichever you prefer. As you can see, proving Theta requires a bit more work, but it is also more informative, since it tells you that the bound you found is tight.
These types of recurrences can be easily solved with Akra-Bazzi theorem (and if you have looked at the question you linked, someone showed you a solution to a similar problem).
So (1/4)^p + (3/4)^p = 1 is satisfied by p = 1. In your case g(u) = c, so the integral becomes
\int_1^x c/u^2 du, which is equal to -c/u evaluated from 1 to x, i.e. c(1 - 1/x). Now add 1 and multiply by x^p = x, and you will get that the complexity is Θ(n) and not O(log n) as other people suggested.
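A quick numeric cross-check of my own (not part of the original answer): iterating the recurrence directly shows T(n)/n approaching a constant, while T(n)/log(n) keeps growing, confirming the linear bound.

```python
# Sketch (my own): T(n) = T(n//4) + T(3n//4) + c with c = 1 and an assumed
# constant base case; compare against linear and logarithmic yardsticks.
from functools import lru_cache
from math import log

C = 1  # assumed constant cost per node

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return T(n // 4) + T(3 * n // 4) + C

for n in [10**3, 10**4, 10**5, 10**6]:
    print(n, T(n) / n, T(n) / log(n))
# T(n)/n settles near a constant, while T(n)/log(n) blows up: Theta(n).
```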
The question is:
T(n) = √2*T(n/2) + log n
I'm not sure whether the master theorem works here, and I'm kind of stuck.
This looks more like a case for the Akra-Bazzi theorem: http://en.wikipedia.org/wiki/Akra%E2%80%93Bazzi_method#The_formula with k=1, h=0, g(n)=log n, a=2^{1/2}, b=1/2. In that case, p=1/2 and you need to evaluate the integral \int_1^x log(u)/u^{3/2} du. You can use integration by parts, or a symbolic integrator. Wolfram Alpha tells me the indefinite integral is -2(log u + 2)/u^{1/2} + C, so the definite integral is 4 - 2(log x + 2)/x^{1/2}. Adding 1 and multiplying by x^{1/2}, we get T(x) = \Theta(5x^{1/2} - 2 log x - 4), which is simply \Theta(x^{1/2}).
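As a quick numeric sanity check of my own (assuming T(1) = 1 as a base case), iterating the recurrence on powers of two shows T(n)/sqrt(n) approaching a constant:

```python
# Sketch (my own): T(n) = sqrt(2)*T(n/2) + log(n) with an assumed base
# case T(1) = 1; the ratio T(n)/sqrt(n) should level off at a constant.
from math import log, sqrt

def T(n):
    if n <= 1:
        return 1.0
    return sqrt(2) * T(n // 2) + log(n)

for k in [10, 20, 30, 40]:
    n = 2 ** k
    print(n, T(n) / sqrt(n))  # levels off, confirming Theta(sqrt(n))
```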
The master theorem has constraints only on a and b, and those hold in your case. The fact that a is irrational and that f(n) = log(n) is irrelevant.
So in your case c = log_2(sqrt(2)) = 1/2. Since n^c grows faster than log(n) (by a polynomial factor), case 1 applies and the complexity of the recursion is O(sqrt(n)).
P.S. Danyal's solution is wrong, as the complexity is not n log n; Edward Doolittle's solution is correct, though it is overkill in this simple case.
As per the master theorem, f(n) should be polynomial, but here
f(n) = log n
is not a polynomial, so by the standard rules the recurrence cannot be solved with the master theorem. However, I read somewhere about a fourth case as well, which I should mention.
It is also discussed here:
Master's theorem with f(n)=log n
However, there is a limited "fourth case" for the master theorem, which allows it to apply to polylogarithmic functions.
If
f(n) = O(n^{log_b a} · log^k n), then T(n) = O(n^{log_b a} · log^{k+1} n).
In other words, suppose you have T(n) = 2T(n/2) + n log n. Here f(n) isn't a polynomial, but f(n) = n log n = n^{log_2 2} · log^1 n, so k = 1. Therefore, T(n) = O(n log^2 n).
See this handout for more information: http://cse.unl.edu/~choueiry/S06-235/files/MasterTheorem-HandoutNoNotes.pdf
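For a quick numeric check of my own on that example, iterating T(n) = 2T(n/2) + n log2(n) (with an assumed base case T(1) = 1) shows T(n)/(n log2^2 n) approaching a constant:

```python
# Sketch (my own): verify the fourth-case example T(n) = 2T(n/2) + n*log2(n)
# behaves like n*log^2(n); the printed ratio tends to a constant.
from math import log2

def T(n):
    if n <= 1:
        return 1.0
    return 2 * T(n // 2) + n * log2(n)

for k in [8, 12, 16, 20]:
    n = 2 ** k
    print(n, T(n) / (n * log2(n) ** 2))  # settles near 0.5
```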