So, I have
T(n) = 2T(n/2) + n^2
I found a = 2, b = 2, and f(n) = n^2, and to get n^2 I used case 3 of the Master Theorem, which requires
f(n) = Ω(n^(log_b a + ε))
and I found Ω(n^2).
So, how can I solve
Θ(Ω(n^2))?
Also, are my calculations correct?
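For a quick numerical sanity check of that Θ(n^2) guess, one can tabulate the recurrence directly and watch T(n)/n^2 settle at a constant (a small Python sketch; the base case T(1) = 1 is an assumption made only for illustration):

    from functools import lru_cache

    # T(n) = 2*T(n/2) + n^2, with an assumed base case T(1) = 1.
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1
        return 2 * T(n // 2) + n * n

    # If case 3 gives Theta(n^2), the ratio T(n)/n^2 should level off.
    for k in range(1, 16):
        n = 2 ** k
        print(n, T(n) / (n * n))

The ratio stabilizes (around 2 with this base case), which is consistent with T(n) = Θ(n^2).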
How would I determine the time complexity using the Master Theorem for the given problem?
T(n) = aT(n/b) + O(n^d)
T(n) = 4 T(n/2) + n*log(n)
a = 4, b = 2, d = 1
1. O(n^d) if d > log_b(a)
2. O(n^d log n) if d = log_b(a)
3. O(n^(log_b(a))) if d < log_b(a)
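(For intuition on how these cases read, consider a textbook example such as merge sort, T(n) = 2T(n/2) + O(n): there a = 2, b = 2, d = 1, and d = log_2(2) = 1, so case 2 gives O(n log n).)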
In my case:
log_2(4) = 2 --> d < log_2(4)
T(n) = O(n^logb(a))
= O(n^2)
Is this correct?
edit:
I have now taken a different approach, following my professor's instructions, though it leads to the same result as before.
T(n) = a*T(n/b) + f(n)
T(n) = 4 T(n/2) + n*log(n)
1. T(n) = Θ(n^(log_b a))         if f(n) ∈ O(n^(log_b a - ε))
2. T(n) = Θ(n^(log_b a) * log n) if f(n) ∈ Θ(n^(log_b a))
3. T(n) = Θ(f(n))                if f(n) ∈ Ω(n^(log_b a + ε))
First I look at the 2nd case
f(n) = n*log(n)
log_b a = log_2 4 = 2
n^(log_2 4) = n^2
The second case does not apply because:
n*log(n) ∉ Θ(n^(log_b a)) --> f(n) grows more slowly than n^2
Since f(n) grows more slowly than n^2, look at the first case:
1. T(n) = Θ(n^(log_b a)) if f(n) ∈ O(n^(log_b a - ε))
O(n^(log_b a - ε)) = O(n^(log_2 4 - ε)) = O(n^(2 - ε))
n*log(n) ∈ O(n^(2 - ε)) for any 0 < ε < 1 (e.g. ε = 0.5: n^1.5 dominates n*log(n)) --> works
f(n) is bounded above by O(n^(2 - ε)), so the first case of the Master Theorem applies:
T(n) = Θ(n^(log_2 4)) = Θ(n^2)
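As a sanity check on that Θ(n^2) conclusion, the recurrence can also be tabulated numerically (a small Python sketch; the base case T(1) = 1 is an assumption):

    import math
    from functools import lru_cache

    # T(n) = 4*T(n/2) + n*lg(n), with an assumed base case T(1) = 1.
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1.0
        return 4 * T(n // 2) + n * math.log2(n)

    # If T(n) = Theta(n^2), T(n)/n^2 should approach a constant.
    for k in range(1, 21):
        n = 2 ** k
        print(n, T(n) / (n * n))

The ratio levels off (near 3 with this base case), which matches the Θ(n^2) result.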
The version of the master theorem that you've stated here specifically assumes that the additive term has the form O(n^d). In this case, the additive term is n log n, which is not O(n^1). This means that you cannot apply the master theorem as you have above.
A useful technique when working with recurrences like these is to sandwich the recurrence between two others. Notice, for example, that T(n) is bounded by these recurrences:
S(n) = 4S(n / 2) + n
R(n) = 4R(n / 2) + n^(1 + ε), where ε is some small positive number.
Try solving each of these recurrences using the master theorem and see what you get, keeping in mind that your recurrence solves to a value sandwiched between them.
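For what it's worth, here is a sketch of how that sandwich works out with the simplified d-based statement quoted above:

    S(n) = 4S(n/2) + n:        a = 4, b = 2, d = 1 < log_2(4) = 2     --> S(n) = Θ(n^2)
    R(n) = 4R(n/2) + n^(1+ε):  a = 4, b = 2, d = 1 + ε < 2 for ε < 1  --> R(n) = Θ(n^2)

Both sandwich recurrences solve to Θ(n^2) (the d < log_b(a) case gives a tight bound), so T(n), which is squeezed between them, is Θ(n^2) as well.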
Hope this helps!
How do I solve the following, given that f(n) = n! does not, to my knowledge, fit any of the cases of the master theorem?
T(n) = 16T(n/4) + n!
David Eisenstat is partially correct. Case 3 does apply, but T(n) = Θ(n!), not O(n!).
T(n) = 16T(n/4) + n!
Case 3 of the Master Theorem (AKA Master Method) applies. a = 16, b = 4, f(n) = n!, and n^(log_b a) = n^(log_4 16) = n^2. Since f(n) = n! is Ω(n^(2 + ε)) (n! grows faster than any polynomial) AND the regularity condition a*f(n/b) <= c*f(n) holds for some c < 1 and all sufficiently large n, T(n) is Θ(n!).
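A quick sketch of why both conditions of case 3 hold here:

    Growth condition:     n! >= n^3 = n^(2 + 1) for all n >= 6, so n! ∈ Ω(n^(log_4 16 + ε)) with ε = 1.
    Regularity condition: 16*(n/4)! <= (1/2)*n! once n!/(n/4)! >= 32, which holds for all n >= 8,
                          so c = 1/2 works.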
For reference, consult #10 here : http://www.csd.uwo.ca/~moreno/CS433-CS9624/Resources/master.pdf
T(n) = 16T(n/4) + n!
I know it can be solved using Master theorem, but I don't know how to handle
f(n) = n!
This is case three of the Master Theorem.
Since T(n) = 16T(n/4) + n!
Here f(n) = n!.
a = 16 and b = 4, so log_b a = log_4 16 = 2.
The Master Theorem states that the complexity T(n) = Θ(f(n)) if
f(n) ∈ Ω(n^c) for some c > log_b a.
Since f(n) = n! > n^c for every constant c once n > n0, the statement f(n) ∈ Ω(n^c) is true for some
c > log_b a = 2. Hence, by the third case of the Master Theorem, the complexity T(n) = Θ(f(n)) = Θ(n!).
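To see how completely n! dominates the recursive part, the recurrence can be evaluated exactly for a few powers of 4 (a small Python sketch; T(1) = 1 is an assumed base case):

    from fractions import Fraction
    from math import factorial

    # T(n) = 16*T(n/4) + n!, with an assumed base case T(1) = 1.
    def T(n):
        if n <= 1:
            return 1
        return 16 * T(n // 4) + factorial(n)

    # If T(n) = Theta(n!), the ratio T(n)/n! should approach 1.
    for k in range(1, 7):
        n = 4 ** k
        print(n, float(Fraction(T(n), factorial(n))))

The ratio drops to 1 almost immediately: the n! term at the top level swamps everything below it, which is exactly what case 3 captures.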
How do I apply the Master Theorem to this equation?
if T(1) = 1 and
T(n) = T(n-1)+2
What would be the runtime of such a program?
What is the T(1) = 1 for?
Which case is this and why?
A detailed explanation would be appreciated. Thanks.
You cannot use the Master Theorem here (without variable substitution, at least), since the recurrence is not in the required form.
Your function however is easy to analyze and is in Theta(n).
Proof by induction: assume T(k) <= 2k for each k < n. Then
T(n) = T(n-1) + 2 <= 2(n-1) + 2 = 2n - 2 + 2 <= 2n
       (induction hypothesis applied to T(n-1))
The base of the induction is T(1) = 1 <= 2.
The above shows that T(n) is in O(n), since we found c=2 such that for n>0, the following is correct: T(n) <= c*n, and this is the definition of big O notation.
Proving similarly that T(n) is in Omega(n) is easy, and from this you can conclude T(n) is in Theta(n)
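If it helps to see it concretely, the recurrence unrolls to a simple closed form, which a few lines of Python confirm (a sketch, using the given base case T(1) = 1):

    # T(1) = 1, T(n) = T(n-1) + 2, computed iteratively.
    def T(n):
        t = 1
        for _ in range(2, n + 1):
            t += 2
        return t

    # The closed form is T(n) = 2n - 1, which is Theta(n).
    for n in (1, 5, 10, 100, 1000):
        print(n, T(n), 2 * n - 1)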
Thank you for your help. With the help of your answer and a new recursive equation I think I finally understand it. Let's say I have T(n) = 1*T(n-1) + n^2. The Master Theorem doesn't apply here either, so I start from my base cases.
T(1) = 1
T(2) = 5
T(3) = 14
T(4) = 30
--> Proof by induction, T(k) <= 2k for each k < n
T(n) = T(n-1) + n^2 <= n^2(n-1) + n^2 = n^3 - n^2 + n^2 = n^3
       (induction hypothesis applied to T(n-1))
So this leads to O(N^3)? I am not sure about this. Why not Omega or Theta?
And when would my hypothesis/induction use <, > or >=?
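For the concrete values above, a quick tabulation shows the cubic growth directly (a small Python sketch; the base case T(1) = 1 from the post is used):

    # T(1) = 1, T(n) = T(n-1) + n^2, computed iteratively.
    def T(n):
        t = 1
        for k in range(2, n + 1):
            t += k * k
        return t

    # T(n) is 1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6, so T(n)/n^3 approaches 1/3.
    for n in (4, 16, 256, 4096):
        print(n, T(n) / n**3)

Since the ratio settles at a constant, the bound is in fact tight, i.e. Θ(n^3) rather than just O(n^3).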
Can we solve this recurrence with the master theorem?
T(n) = 2T(n/2) + n lg n
I am coming from a link where the author states that we can't apply the master theorem here because it doesn't satisfy any of the three case conditions. On the other hand, he takes another example,
T(n) = 27T(n/3) + Θ(n^3 lg n)
and finds the closed-form solution Θ(n^3 lg^2 n). To solve it he uses the 2nd case of the master theorem:
If f(n) = Θ(n^(log_b a) (lg n)^k), then T(n) ∈ Θ(n^(log_b a) (lg n)^(k+1)) for some k >= 0.
Here my confusion arises: why can't we apply the 2nd case to the first recurrence, when it seems to fit it completely?
My thought: a = 2, b = 2; let k = 1. Then
f(n) = Θ(n^(log_2 2) lg n) for k = 1, so T(n) = Θ(n lg^2 n). But he mentioned that we can't apply the master theorem here, and I am confused why not.
Note: It comes down to f(n), because in T(n) = 2T(n/2) + n lg n we have f(n) = n lg n, while in T(n) = 27T(n/3) + Θ(n^3 lg n) we have f(n) = Θ(n^3 lg n). Please correct me if I am wrong here.
Using case 2 of master theorem I find that
T(n) = Theta( n log^2 (n))
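Spelled out with the extended case 2 quoted in the question: f(n) = n lg n = Θ(n^(log_2 2) * (lg n)^1), i.e. k = 1, so T(n) = Θ(n^(log_2 2) * (lg n)^(1+1)) = Θ(n lg^2 n).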
Your link states case 2 of the theorem as:
f(n) = Θ(n^(log_b a))
while several other links, like the one from MIT, state the case as:
f(n) = Θ(n^(log_b a) log^k(n)) for k >= 0