So my recurrence is T(n) = 2T(n/2) + log n.
I used the master theorem and found that a = 2, b = 2, and d = 1,
which is case 2. So the solution should be O(n^1 log n), which is O(n log n).
I looked online and some people found it to be O(n). I'm confused.
Can anyone tell me why it's not O(n log n)?
This should not be case 2, but case 1.
With T(n) = 2T(n/2) + log n, the critical exponent is c_crit = log_2(2) = 1, as you correctly found. But log n is O(n^c) for some c < 1 (indeed for every 0 < c < 1), so case 1 applies and the whole thing is O(n^c_crit) = O(n^1) = O(n).
I'm not familiar with d in the master theorem. The Wikipedia article on the Master Theorem states that you need to find c = log_b a, the critical exponent. Here c = 1. Case 2 requires f(n) = Θ(n^c log^k n) for some k >= 0, i.e. at least Θ(n) here, but in reality we have f(n) = log n. Instead, this problem falls into case 1 (see if you can figure out why!), which means T(n) = Θ(n), as you found elsewhere.
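If it helps to convince yourself, here is a quick numeric sanity check (my own sketch, not part of the theorem; the base case T(1) = 1 is an assumption):

    from functools import lru_cache
    from math import log2

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 2 T(n/2) + log n, with an assumed base case T(1) = 1
        if n <= 1:
            return 1
        return 2 * T(n // 2) + log2(n)

    for n in [2**10, 2**14, 2**18, 2**22]:
        print(n, T(n) / n)  # settles near a constant (~3), consistent with Theta(n)

If the answer were Θ(n log n), the printed ratio would keep growing instead of flattening out.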
How would I determine the time complexity using the Master Theorem for the given problem?
T(n) = aT(n/b) + O(n^d)
T(n) = 4 T(n/2) + n*log(n)
a = 4, b = 2, d = 1
1. O(n^d) if d > log_b(a)
2. O(n^d log n) if d = log_b(a)
3. O(n^(log_b(a))) if d < log_b(a)
In my case:
log_2(4) = 2 --> d < log_2(4)
T(n) = O(n^(log_b(a)))
= O(n^2)
Is this correct?
edit:
I have now tried a different approach, following my professor's instructions; it results in the same answer as before.
T(n) = a*T(n/b) + f(n)
T(n) = 4 T(n/2) + n*log(n)
         1. Θ(n^(log_b a)),          if f(n) ∈ O(n^(log_b a - ε)) for some ε > 0
T(n) = { 2. Θ(n^(log_b a) log n),    if f(n) ∈ Θ(n^(log_b a))
         3. Θ(f(n)),                 if f(n) ∈ Ω(n^(log_b a + ε)) for some ε > 0
First I look at the 2nd case
f(n) = n*log(n)
log_b a = log_2 4 = 2
n^(log_2 4) = n^2
The second case does not apply because:
n*log(n) ∉ Θ(n^(log_b a)) --> f(n) grows more slowly, since n*log(n) < n^2.
Since f(n) grows more slowly than n^2, look at the first case:
1. Θ(n^(log_b a)), if f(n) ∈ O(n^(log_b a - ε))
O(n^(log_b a - ε)) = O(n^(log_2 4 - ε)) = O(n^(2 - ε))
n*log(n) ∈ O(n^(2 - ε)), e.g. with ε = 0.5 --> works
f(n) is bounded above by O(n^(2 - ε)), so the first
case of the Master Theorem applies:
T(n) = Θ(n^(log_2 4)) = Θ(n^2)
The version of the master theorem that you've stated here specifically assumes that the additive term has the form O(n^d). In this case, the additive term is of the form n log n, which is not O(n^1). This means that you cannot apply the master theorem as you have above.
A useful technique when working with recurrences like these is to sandwich the recurrence between two others. Notice, for example, that T(n) is bounded by these recurrences:
S(n) = 4S(n / 2) + n
R(n) = 4R(n / 2) + n^(1+ε), where ε is some small positive number.
Try solving each of these recurrences using the master theorem and see what you get, keeping in mind that your recurrence solves to a value sandwiched between them.
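Once you've solved both, you can also sanity-check the conclusion numerically. A minimal sketch (the base case T(1) = 1 is my assumption):

    from functools import lru_cache
    from math import log2

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 4 T(n/2) + n log n, with an assumed base case T(1) = 1
        return 1 if n <= 1 else 4 * T(n // 2) + n * log2(n)

    for n in [2**8, 2**12, 2**16, 2**20]:
        # whichever ratio levels off to a constant matches the true growth rate
        print(n, T(n) / n**2, T(n) / (n**2 * log2(n)))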
Hope this helps!
Can anyone help me check the correctness, and explain why?
What is the asymptotic running time of T(n) = 3T(n/3) + O(n) with T(1) = 1 _______ .
My answer is n^(log_3 3).
You seem to have misapplied the Master Theorem.
We have T(n) = a T(n/b) + O(n) with a = b = 3.
Since the additive function here is O(n), it takes the form O(n^c log^k(n)) with c = 1 and k = 0.
We are thus in the case where c = log_b(a) = 1.
Then according to the Master Theorem, the complexity is O(n^c log^(k+1)(n)), that is O(n log(n)).
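For intuition, here is a quick recursion-tree sketch of where the log factor comes from (my own addition; c is an assumed constant with f(n) <= c*n):

    % each of the log_3 n levels of the tree does Theta(n) total work
    T(n) = \sum_{i=0}^{\log_3 n} 3^i \cdot c \cdot \frac{n}{3^i}
         = c \, n \, (\log_3 n + 1)
         = \Theta(n \log n)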
I had the following recurrence relations on a test and I got them wrong; I am not sure why.
1. T(n) = 2T(n/4) + O(n^0.5)
Using MT: a = 2, b = 4, f(n) = n^0.5
Comparing n^(log_4(2)) to n^0.5 => n^0.5 == n^0.5
Thus, case 3: Θ(n log n)
Apparently that's wrong; I don't know why.
2. T(n) = 3T(n/4) + O(n^0.75)
Using MT: a = 3, b = 4, f(n) = n^0.75
Comparing n^(log_4(3)) to n^0.75
Thus, case 1: Θ(n^log_4(3))
3. T(n) = T(n/2) + T(n/3) + O(n log n)
This isn't in a form that can be solved with the MT, and I cannot easily find a p value without aid. Thus, I took a stab in the dark and was wrong. No clue where to begin with this one.
4. T(n) = 2T(n/2) + n log n
Using MT: a = 2, b = 2, f(n) = n log n
Comparing n^log_2(2) to n log n => n^1 to n log n
Case 2: Θ(n log n)
You may have misread or omitted some details of the Master theorem. I will refer to the Wikipedia article.
1)
The second case states that if f(n) = Θ(n^c_crit (log n)^k) for some k >= 0, then T(n) = Θ(n^c_crit (log n)^(k+1)).
Since c_crit = 0.5 and k = 0, the final complexity is:
T(n) = Θ(n^0.5 log n)
You just missed out the exponent on the n in front.
2)
This is correct.
4)
You missed another detail here: k = 1, and there needs to be an additional factor of log n:
T(n) = Θ(n log^2 n)
3)
This is slightly trickier. Using the Akra-Bazzi method, the exponent p must satisfy:
(1/2)^p + (1/3)^p = 1
To solve for the exponent p, just use Newton-Raphson on your calculator - gives p = 0.787885.... The method then gives:
T(n) = Θ(n^p (1 + ∫_1^n (u log u / u^(p+1)) du)) = Θ(n^p (1 + ∫_1^n (log u / u^p) du))
Performing the integration by parts:
∫_1^n (log u / u^p) du = (n^(1-p) log n) / (1-p) - (n^(1-p) - 1) / (1-p)^2
Substituting in, the leading term is n^p * n^(1-p) log n / (1-p), so:
T(n) = Θ(n log n)
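If you don't have a solver handy, a few Newton-Raphson iterations in Python converge quickly (a minimal sketch of the iteration, not part of the original answer):

    from math import log

    def f(p):
        # Akra-Bazzi characteristic equation: (1/2)^p + (1/3)^p - 1 = 0
        return 0.5**p + (1/3)**p - 1

    def fprime(p):
        return 0.5**p * log(0.5) + (1/3)**p * log(1/3)

    p = 1.0  # initial guess
    for _ in range(20):
        p -= f(p) / fprime(p)
    print(p)  # ~0.787885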
I would like to solve the following recurrence relation:
T(n) = 2T(√n);
I'm guessing that T(n) = O(log log n), but I'm not sure how to prove this. How would I show that this recurrence solves to O(log log n)?
One idea would be to simplify the recurrence by introducing a new variable k such that 2^k = n. Then, the recurrence relation works out to
T(2^k) = 2T(2^(k/2))
If you then let S(k) = T(2^k), you get the recurrence
S(k) = 2S(k / 2)
Note that this is equivalent to
S(k) = 2S(k / 2) + O(1)
Since 0 = O(1). Therefore, by the Master Theorem, we get that S(k) = Θ(k), since we have that a = 2, b = 2, and d = 0 and log_b a > d.
Since S(k) = Θ(k) and S(k) = T(2^k) = T(n), we get that T(n) = Θ(k). Since we picked 2^k = n, this means that k = log n, so T(n) = Θ(log n). This means that your initial guess of O(log log n) is incorrect and that the runtime is only logarithmic, not doubly-logarithmic. If there was only one recursive call being made, though, you would be right that the runtime would be O(log log n).
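As a quick numeric check (my sketch; I use T(n) = 1 for n < 2 as an assumed base case, since with repeated square roots the argument never reaches exactly 1):

    from math import log2, sqrt

    def T(n):
        # T(n) = 2 T(sqrt(n)), assumed constant base case below 2
        if n < 2:
            return 1
        return 2 * T(sqrt(n))

    for k in [4, 16, 64, 256]:
        n = 2.0**k
        print(k, T(n) / log2(n))  # stays flat, consistent with Theta(log n)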
Hope this helps!
You can solve this easily by unrolling the recursion:
T(n) = 2T(n^(1/2)) = 2^2 T(n^(1/4)) = ... = 2^k T(n^(1/2^k))
The recursion finishes when n^(1/2^k) reaches some constant base case a, and you can find the appropriate a. When a = 0 or 1 it does not make sense (the square root never gets below 1), but when a = 2 you will get:
n^(1/2^k) = 2 --> log_2(n) = 2^k --> k = log_2(log_2(n))
Substituting this k into the last part of the first equation gives T(n) = 2^k T(2) = log_2(n) * T(2), so you get the complexity of O(log(n)).
Check other similar recursions here:
T(n) = 2T(n^(1/2)) + log n
T(n) = T(n^(1/2)) + Θ(lg lg n)
T(n) = T(n^(1/2)) + 1
Can we solve the recurrence T(n) = 2T(n/2) + n lg n with the master theorem?
I am coming from a link where the author states that we can't apply the master theorem here because the recurrence doesn't satisfy any of the three cases' conditions. On the other hand, he has taken another example,
T(n) = 27T(n/3) + Θ(n^3 lg n), and found the closed-form solution Θ(n^3 lg^2 n). To solve it he used the 2nd case of the master theorem: if f(n) = Θ(n^(log_b a) (lg n)^k) for some k >= 0, then T(n) ∈ Θ(n^(log_b a) (lg n)^(k+1)). Here my confusion arises: why can't we apply the 2nd case to my recurrence, when it fits the 2nd case completely?
My thought: a = 2, b = 2; let k = 1, then
f(n) = Θ(n^(log_2 2) lg n) for k = 1, so T(n) = Θ(n lg^2 n). But he has mentioned that we can't apply the master theorem to this. I'm confused why not.
Note: I suspect it is due to f(n), because in T(n) = 2T(n/2) + n lg n we have f(n) = n lg n, while in T(n) = 27T(n/3) + Θ(n^3 lg n) we have f(n) = Θ(n^3 lg n). Please correct me if I am wrong here.
Using case 2 of the master theorem, I find that
T(n) = Θ(n log^2(n))
Your link states case 2 of the theorem as:
f(n) = Θ(n^(log_b(a)))
while several other sources, like the one from MIT, state it as:
f(n) = Θ(n^(log_b(a)) log^k(n)) for k >= 0
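For what it's worth, a numeric check agrees with the extended case 2 (a quick sketch; the base case T(1) = 1 is my assumption):

    from functools import lru_cache
    from math import log2

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 2 T(n/2) + n lg n, with an assumed base case T(1) = 1
        return 1 if n <= 1 else 2 * T(n // 2) + n * log2(n)

    for n in [2**8, 2**12, 2**16, 2**20]:
        print(n, T(n) / (n * log2(n)**2))  # levels off near 1/2, matching Theta(n log^2 n)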