Can anyone help me check whether my answer is correct, and explain why?
What is the asymptotic running time of T(n) = 3T(n/3) + O(n) with T(1) = 1 _______ .
My answer is n^(log_3 3).
You seem to have misapplied the Master Theorem.
We have T(n) = a*T(n/b) + O(n) with a = b = 3.
Since the additive term here is O(n), it has the form O(n^c log^k(n)) with c = 1 and k = 0.
We are thus in the case where c = log_b(a) = 1.
Then, according to the Master Theorem, the complexity is O(n^c log^(k+1)(n)), that is, O(n log(n)).
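If you want to sanity-check this numerically, here is a small Python sketch of my own (taking the O(n) term to be exactly n and T(1) = 1, with n a power of 3) that evaluates the recurrence and compares it with n * log_3(n); the ratio tending to a constant is consistent with Θ(n log n):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3*T(n/3) + n with T(1) = 1; n is assumed to be a power of 3
    if n == 1:
        return 1
    return 3 * T(n // 3) + n

for k in range(1, 13):
    n = 3 ** k
    # If T(n) = Theta(n log n), this ratio should approach a constant
    print(n, T(n) / (n * math.log(n, 3)))
```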
How would I determine the time complexity using the Master Theorem for the given problem?
T(n) = aT(n/b) + O(n^d)
T(n) = 4 T(n/2) + n*log(n)
a = 4, b = 2, d = 1
1. O(n^d) if d > logb(a)
2. O(n^d logn) if d = logb(a)
3. O(n^logb(a)) if d < logb(a)
In my case:
log2(4) = 2 --> d < log2(4)
T(n) = O(n^logb(a))
= O(n^2)
Is this correct?
Edit:
I have now tried a different approach, following my professor's instructions; it gives the same result as before.
T(n) = a*T(n/b) + f(n)
T(n) = 4 T(n/2) + n*log(n)
T(n) =
1. Θ(n^(log_b a)),          if f(n) ∈ O(n^(log_b a - ε))
2. Θ(n^(log_b a) * log n),  if f(n) ∈ Θ(n^(log_b a))
3. Θ(f(n)),                 if f(n) ∈ Ω(n^(log_b a + ε))
First I look at the 2nd case
f(n) = n*log(n)
log_b(a) = log_2(4) = 2
n^(log_2 4) = n^2
The second case does not apply because:
n*log(n) ∉ Θ(n^(log_b a)); f(n) grows more slowly, since n*log(n) < n^2 asymptotically.
Since f(n) grows more slowly than n^2, look at the first case:
1. Θ(n^(log_b a)), if f(n) ∈ O(n^(log_b a - ε))
O(n^(log_b a - ε)) = O(n^(log_2 4 - ε)) = O(n^(2 - ε))
n*log(n) ∈ O(n^(2 - ε)) for any 0 < ε < 1 (for example, ε = 0.5 gives n*log(n) ∈ O(n^1.5)), so this works.
f(n) is therefore bounded above by O(n^(2 - ε)), so case 1 of the Master Theorem applies:
T(n) = Θ(n^(log_2 4)) = Θ(n^2)
The version of the Master Theorem that you've stated here specifically assumes that the additive term has the form O(n^d). In this case, the additive term is of the form n log n, which is not O(n^1). This means that you cannot apply the Master Theorem as you have above.
A useful technique when working with recurrences like these is to sandwich the recurrence between two others. Notice, for example, that T(n) is bounded by these recurrences:
S(n) = 4S(n / 2) + n
R(n) = 4R(n / 2) + n^(1+ε), where ε is some small positive number.
Try solving each of these recurrences using the master theorem and see what you get, keeping in mind that your recurrence solves to a value sandwiched between them.
Hope this helps!
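If it helps, here is a quick numerical check of my own (taking the additive term to be exactly n*log2(n), T(1) = 1, and n a power of 2). The ratio T(n)/n^2 leveling off at a constant is what you would expect if the sandwiched value is Θ(n^2):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 4*T(n/2) + n*log2(n) with T(1) = 1; n is assumed to be a power of 2
    if n == 1:
        return 1
    return 4 * T(n // 2) + n * math.log2(n)

for k in range(1, 21):
    n = 2 ** k
    # If T(n) = Theta(n^2), this ratio should level off at a constant,
    # squeezed between the solutions of S(n) and R(n) above
    print(n, T(n) / n ** 2)
```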
I realize that solving this recurrence, T(n) = T(n/3) + O(1), with the Master Theorem gives Θ(log n). However, I want to know more and find the base of the logarithm. I tried reading more about the Master Theorem to find out about the base, but could not find more information on Wikipedia (https://en.wikipedia.org/wiki/Master_theorem_(analysis_of_algorithms)).
How would I solve this using recursion tree or substitution method for solving recurrences?
You can assume n = 2^K and T(0) = 0.
Don't set n = 2^k; set n = 3^k instead,
thus T(3^k) = T(3^{k-1}) + c.
Writing w_k := T(3^k), the recurrence becomes w_k = w_{k-1} + c.
Assuming T(1) = 1, we have w_0 = 1,
and the general term is w_k = c*k + 1.
You conclude T(n) = c*log_3(n) + 1,
and thus T(n) = O(log_3(n)).
T(n) = T(n/3) + O(1) = T(n/9) + O(1) + O(1) = T(n/27) + O(1) + O(1) + O(1) = …
After log_3(n) steps, the T term vanishes and T(n) = O(log(n)).
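To see the base-3 logarithm concretely, here is a small Python sketch of mine (assuming each level costs exactly c = 1, T(1) = 1, and n is a power of 3) that evaluates the recurrence and compares it with log_3(n):

```python
import math

def T(n, c=1):
    # T(n) = T(n/3) + c with T(1) = 1
    if n <= 1:
        return 1
    return T(n // 3, c) + c

for k in range(1, 11):
    n = 3 ** k
    # Expect T(n) = c*log_3(n) + 1, i.e. k + 1 when c = 1
    print(n, T(n), math.log(n, 3))
```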
I want to know the time complexity of my recursive method:
T(n) = 2T(n/2) + O(1)
I saw a result that says it is O(n), but I don't know why. I solved it like this:
T(n) = 2T(n/2) + 1
T(n-1) = 4T(n-1/4) + 3
T(n-2) = 8T(n-2/8) + 7
...
T(n) = 2^n+1 T (n/2^n+1) + (2^n+1 - 1)
I think you have got the wrong idea about recursive relations. You can think as follows:
If T(n) represents the value of the function T() at input n, then the relation says that the output is one more than double the value at half of the current input. So for input n-1, the output, i.e. T(n-1), will be one more than double the value at half of this input; that is, T(n-1) = 2*T((n-1)/2) + 1.
This kind of recurrence relation should be solved as answered by Yves Daoust. For more examples of recurrence relations, you can refer to this.
Consider that n=2^m, which allows you to write
T(2^m)=2T(2^(m-1))+O(1)
or by denoting S(m):= T(2^m),
S(m)=2 S(m-1) + O(1),
2^(-m) S(m) = 2^(-(m-1)) S(m-1) + 2^(-m) O(1),
and finally, denoting R(m) := 2^(-m) S(m),
R(m) = R(m-1) + 2^(-m) O(1).
Now by induction,
R(m) = R(0) + (1 - 2^(-m)) O(1),
so that T(n) = S(m) = 2^m R(m) = 2^m T(1) + (2^m - 1) O(1) = n T(1) + (n - 1) O(1) = O(n).
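As a quick check of that closed form, here is a small Python sketch of mine (taking the O(1) term to be exactly 1, T(1) = 1, and n a power of 2) that compares the recurrence against n*T(1) + (n - 1):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + 1 with T(1) = 1; n is assumed to be a power of 2
    if n == 1:
        return 1
    return 2 * T(n // 2) + 1

for k in range(0, 16):
    n = 2 ** k
    # Closed form from the derivation: T(n) = n*T(1) + (n - 1) = 2n - 1, i.e. O(n)
    print(n, T(n), n * 1 + (n - 1))
```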
There are a couple of rules that you need to remember. If you remember these simple rules, the Master Theorem makes solving recurrence equations very easy. The following are the basic rules that need to be remembered:
case 1) If n^(log_b a) << f(n), then T(n) = f(n)
case 2) If n^(log_b a) = f(n), then T(n) = f(n) * log n
case 3) If n^(log_b a) >> f(n), then T(n) = n^(log_b a)
Now, lets solve the recurrence using the above equations.
a = 2, b = 2, f(n) = O(1)
n^(log_b a) = n^(log_2 2) = n = O(n)
This is case 3) in the rules above, since n >> O(1). Hence T(n) = n^(log_b a) = O(n).
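If you like having those rules in executable form, here is a minimal Python sketch of my own; it only handles additive terms of the form Θ(n^d) (so it does not cover cases like n*log(n) from the earlier question) and simply compares d with log_b(a):

```python
import math

def master_theorem(a, b, d):
    # For T(n) = a*T(n/b) + Theta(n^d) only; it does not handle extra
    # log factors in f(n) such as n*log(n)
    crit = math.log(a, b)  # the critical exponent log_b(a)
    if d > crit:
        return f"Theta(n^{d})"
    if d == crit:
        return f"Theta(n^{d} * log n)"
    return f"Theta(n^{crit:g})"

# T(n) = 2*T(n/2) + O(1): a = 2, b = 2, d = 0  ->  Theta(n^1), i.e. O(n)
print(master_theorem(2, 2, 0))
```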
Does the Master Theorem assume that T(1) is constant? Say I have an algorithm with time complexity T(n) = 2T(n/2) + O(1) and T(1) = O(log n); what is the time complexity of this algorithm?
For the recurrence relation: T(n) = 2T(n/2) + O(1), we have
a = 2
b = 2
O(1) work done outside the recursive call
therefore the master theorem case 1 applies, and we have:
T(n) ∈ Θ(n ^ log2(2)) ⇒
T(n) ∈ Θ(n)
A recurrence relation defines a sequence based on an initial term. If a problem of size 1 is solved by the recurrence relation T(1) = f(n), where f ∈ O(log n), the value of T(1) can't be determined, i.e. it makes no sense as a recurrence relation.
Your statement T(1) = O(log n) does not make any sense. You are basically stating that some quantity that does not depend on n somehow has logarithmic complexity (and thus depends on n in a logarithmic way).
T(1), T(2), T(532143243) are boundary conditions and cannot depend on any parameter. They should be numbers (5, pi/e, sqrt(5) - i).
Sometimes it's best just to try things out rather than relying on a Theorem.
T(m) = 2T(m/2) + O(1)
T(1) = O(logn)
T(2) = 2T(1) = 2log(n)
T(4) = 2T(2) = 4log(n)
T(8) = 2T(4) = 8log(n)
T(16) = 2T(8) = 16log(n)
T(32) = 2T(16) = 32log(n)
T(m) = 2T(m/2) = m*log(n)
In conclusion, your initial question is indeed nonsensical as others have pointed out because you are attempting to calculate T(n) when the same n is used in T(1) = O(logn). But we can answer your second question that you have added as a comment.
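To make the "try things out" table above concrete, here is a small Python sketch of mine that treats the boundary value T(1) as an opaque number L (standing in for the O(log n) term from the question) and shows that T(m) comes out as m*L:

```python
def T(m, L):
    # T(m) = 2*T(m/2) with the boundary value T(1) = L
    # (L stands in for the "O(log n)" boundary term from the question)
    if m == 1:
        return L
    return 2 * T(m // 2, L)

L = 10.0  # any fixed stand-in value for log(n)
for k in range(0, 8):
    m = 2 ** k
    # Matches the table above: T(m) = m * L, i.e. Theta(m * log n)
    print(m, T(m, L), m * L)
```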
I would like to solve the following recurrence relation:
T(n) = 2T(√n);
I'm guessing that T(n) = O(log log n), but I'm not sure how to prove this. How would I show that this recurrence solves to O(log log n)?
One idea would be to simplify the recurrence by introducing a new variable k such that 2^k = n. Then, the recurrence relation works out to
T(2^k) = 2T(2^(k/2))
If you then let S(k) = T(2^k), you get the recurrence
S(k) = 2S(k / 2)
Note that this is equivalent to
S(k) = 2S(k / 2) + O(1)
since 0 = O(1). Therefore, by the Master Theorem, we get that S(k) = Θ(k), since a = 2, b = 2, d = 0, and log_b(a) > d.
Since S(k) = Θ(k) and S(k) = T(2^k) = T(n), we get that T(n) = Θ(k). Since we picked 2^k = n, this means that k = log n, so T(n) = Θ(log n). This means that your initial guess of O(log log n) is incorrect and that the runtime is only logarithmic, not doubly-logarithmic. If there was only one recursive call being made, though, you would be right that the runtime would be O(log log n).
Hope this helps!
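If you want to see the logarithmic (rather than doubly logarithmic) growth numerically, here is a quick Python sketch of mine; it assumes the recursion bottoms out with cost 1 once n drops to 2 or below, which is one reasonable way to make the recurrence well-defined:

```python
import math

def T(n):
    # T(n) = 2*T(sqrt(n)), with T(n) = 1 once n drops to 2 or below
    if n <= 2:
        return 1
    return 2 * T(math.sqrt(n))

for k in range(1, 10):
    n = 2.0 ** (2 ** k)   # n = 2^(2^k), so log2(n) = 2^k
    # T(n) tracks log2(n), not log(log(n))
    print(2 ** k, T(n), math.log2(n))
```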
You can solve this easily by unrolling the recursion:
T(n) = 2T(n^(1/2)) = 4T(n^(1/4)) = ... = 2^k T(n^(1/2^k))
Now the recurrence will finish when T(n^(1/2^k)) reaches some base value T(a), and you can find the appropriate a. When a = 0 or 1 it does not make sense, but when a = 2 you get n^(1/2^k) = 2, i.e. 2^k = log(n) and k = log(log(n)).
Substituting this k into the last part of the first equation, you get T(n) = log(n) * T(2), i.e. a complexity of O(log(n)).
Check other similar recursions here:
T(n) = 2T(n^(1/2)) + log n
T(n) = T(n^(1/2)) + Θ(lg lg n)
T(n) = T(n^(1/2)) + 1