How would I determine the time complexity using the Master Theorem for the given problem?
T(n) = aT(n/b) + O(n^d)
T(n) = 4 T(n/2) + n*log(n)
a = 4, b = 2, d = 1
1. O(n^d) if d > log_b(a)
2. O(n^d · log n) if d = log_b(a)
3. O(n^(log_b(a))) if d < log_b(a)
In my case:
log_2(4) = 2 --> d = 1 < log_2(4)
T(n) = O(n^(log_b a))
     = O(n^2)
Is this correct?
Edit:
I have now taken a different approach, following my professor's instructions; it leads to the same result as before.
T(n) = a*T(n/b) + f(n)
T(n) = 4 T(n/2) + n*log(n)
T(n) =
  1. Θ(n^(log_b a)),           if f(n) ∈ O(n^(log_b a − ε)) for some ε > 0
  2. Θ(n^(log_b a) · log n),   if f(n) ∈ Θ(n^(log_b a))
  3. Θ(f(n)),                  if f(n) ∈ Ω(n^(log_b a + ε)) for some ε > 0
First I look at the 2nd case
f(n) = n*log(n)
log_b(a) = log_2(4) = 2
n^(log_2 4) = n^2
The second case does not apply because:
n·log(n) ∉ Θ(n^(log_b a)): f(n) grows more slowly, since n·log(n) < n^2 asymptotically.
So I look at the first case, which covers f(n) growing more slowly than n^2:
1. Θ(n^(log_b a)), if f(n) ∈ O(n^(log_b a − ε))
O(n^(log_b a − ε)) = O(n^(log_2 4 − ε)) = O(n^(2 − ε))
n·log(n) ∈ O(n^(2 − ε)) for e.g. ε = 0.5, so this works.
f(n) is bounded above by O(n^(2 − ε)), so the first case of the Master Theorem applies:
T(n) = Θ(n^(log_2 4)) = Θ(n^2)
The version of the master theorem that you've stated here specifically assumes that the additive term has the form O(n^d). In this case, the additive term is of the form n log n, which is not O(n^1). This means that you cannot apply the master theorem as you have above.
A useful technique when working with recurrences like these is to sandwich the recurrence between two others. Notice, for example, that T(n) is bounded by these recurrences:
S(n) = 4S(n / 2) + n
R(n) = 4R(n / 2) + n^(1+ε), where ε is some small positive number.
Try solving each of these recurrences using the master theorem and see what you get, keeping in mind that your recurrence solves to a value sandwiched between them.
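If you also want a numerical sanity check, here is a minimal Python sketch (the base cases of 1 and the choice ε = 0.1 are my own assumptions, not part of the question) that evaluates all three recurrences at powers of two and divides by n². All three ratios settle toward constants, which is consistent with T(n) being sandwiched at Θ(n²):

import math
from functools import lru_cache

EPS = 0.1  # some small positive epsilon for R(n); an arbitrary choice

@lru_cache(maxsize=None)
def S(n):  # S(n) = 4 S(n/2) + n
    return 1 if n <= 1 else 4 * S(n // 2) + n

@lru_cache(maxsize=None)
def T(n):  # T(n) = 4 T(n/2) + n log n
    return 1 if n <= 1 else 4 * T(n // 2) + n * math.log2(n)

@lru_cache(maxsize=None)
def R(n):  # R(n) = 4 R(n/2) + n^(1 + eps)
    return 1 if n <= 1 else 4 * R(n // 2) + n ** (1 + EPS)

for k in (6, 10, 14, 18):
    n = 2 ** k
    print(n, S(n) / n**2, T(n) / n**2, R(n) / n**2)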
Hope this helps!
I am trying to find the time complexity (Big-Θ) of this algorithm:
Recursion(n):
    while n > 1:
        n = floor(n/2)
        Recursion(n)
I have found an upper bound of O(n) by considering the worst case which is when n is a power of 2.
However, I am having trouble finding a lower bound (Big-Ω) for this. My intuition is that this is Ω(n) as well, but I am not sure how to show this with the floor function in the way.
Any suggestions? Thank you!
EDIT: the main thing I'm not convinced of is that T(n/2) is equivalent to T(floor(n/2)). How would one prove this for this algorithm?
The floor function runs in constant time, O(1), so you can ignore it / treat it as a constant. Let's analyze the time complexity of the algorithm below:
T(n) = T(n/2) + 1      (the +1 covers the constant work per halving, including the floor)
T(n/2) = T(n/4) + 1
...
T(2) = T(1) + 1, where T(1) is a constant

Substituting back:
T(n) = T(n/4) + 2
T(n) = T(n/8) + 3
...
T(n) = T(n/2^k) + k

Setting 2^k = n gives k = log(n), therefore:
T(n) = T(1) + log(n)
T(n) = log(n)
We can conclude that T(n) ∈ Θ(log(n)).
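As a quick empirical check of the halving chain this recurrence models (one unit of work per halving), here is a small Python sketch; the step count matches floor(log2(n)):

import math

def halving_steps(n):
    # Count the steps n -> floor(n/2) -> ... -> 1,
    # i.e. the +1 work at each level of T(n) = T(n/2) + 1.
    steps = 0
    while n > 1:
        n = n // 2
        steps += 1
    return steps

for n in (10, 1000, 10**6, 10**9):
    print(n, halving_steps(n), math.floor(math.log2(n)))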
I want to know the time complexity of my recursion method:
T(n) = 2T(n/2) + O(1)
I saw a result that says it is O(n), but I don't know why. I solved it like this:
T(n) = 2T(n/2) + 1
T(n-1) = 4T(n-1/4) + 3
T(n-2) = 8T(n-2/8) + 7
...
T(n) = 2^(n+1) T(n/2^(n+1)) + (2^(n+1) - 1)
I think you have got the wrong idea about recursive relations. You can think of it as follows:
If T(n) represents the value of the function T() at input n, then the relation says that the output is one more than double the value at half of the current input. So for input n-1, the output T(n-1) will be one more than double the value at half of that input, that is, T(n-1) = 2·T((n-1)/2) + 1.
This kind of recursive relation should be solved as answered by Yves Daoust. For more examples of recursive relations, you can refer to this.
Consider that n=2^m, which allows you to write
T(2^m)=2T(2^(m-1))+O(1)
or by denoting S(m):= T(2^m),
S(m)=2 S(m-1) + O(1),
Dividing by 2^m:
2^(-m) S(m) = 2^(-(m-1)) S(m-1) + 2^(-m) O(1),
and by denoting R(m) := 2^(-m) S(m),
R(m) = R(m-1) + 2^(-m) O(1).
Now by induction,
R(m) = R(0) + (1 - 2^(-m)) O(1),
and finally, multiplying back by 2^m (with R(0) = S(0) = T(1) and 2^m = n),
T(n) = S(m) = 2^m R(m) = n·T(1) + (n - 1) O(1) = O(n).
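To see the linear growth concretely, here is a small Python sketch (taking both T(1) and the O(1) term as 1, which is my own choice) that evaluates the recurrence directly; at powers of two it yields T(n) = 2n - 1:

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2 T(n/2) + 1, with T(1) = 1 and the O(1) term taken as 1
    return 1 if n <= 1 else 2 * T(n // 2) + 1

for k in (4, 8, 12, 16):
    n = 2 ** k
    print(n, T(n), T(n) / n)  # the ratio tends to 2, i.e. T(n) = O(n)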
There are a couple of rules that you need to remember. If you can remember these easy rules, then solving recurrence equations with the Master Theorem is very easy. The following are the basic rules:
case 1) If n^(log_b a) << f(n), then T(n) = f(n)
case 2) If n^(log_b a) = f(n), then T(n) = f(n) · log n
case 3) If n^(log_b a) >> f(n), then T(n) = n^(log_b a)
Now, let's solve the recurrence using the above rules.
a = 2, b = 2, f(n) = O(1)
n^(log_b a) = n^(log_2 2) = n
This is case 3) above, since n >> O(1). Hence T(n) = n^(log_b a) = O(n).
T(n) = 16T(n/4) + n!
I know it can be solved using Master theorem, but I don't know how to handle
f(n) = n!
This is case three of Master Theorem.
Since T(n) = 16T(n/4) + n!
Here f(n) = n!.
a = 16 and b = 4, so log_b a = log_4 16 = 2.
The Master Theorem states that the complexity is T(n) = Θ(f(n)) if
f(n) ∈ Ω(n^c) for some c > log_b a.
Since f(n) = n! grows faster than n^c for any constant c (n! > n^c for all n beyond some n0), the statement f(n) ∈ Ω(n^c) is true even for some c > log_b a = 2. (The regularity condition of case three, 16·(n/4)! ≤ c·n! for some c < 1, also holds for large n.) Hence, by the third case of the Master Theorem, the complexity is T(n) = Θ(f(n)) = Θ(n!).
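As a numeric illustration (the base case T(1) = 1 is my own choice), the top-level n! term quickly dominates everything below it:

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 16 T(n/4) + n!, with T(1) = 1
    return 1 if n <= 1 else 16 * T(n // 4) + math.factorial(n)

for n in (4, 16, 64, 256):
    print(n, T(n) / math.factorial(n))  # ratio tends to 1: T(n) = Theta(n!)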
I am trying to get a better understanding of Master's Theorem and time complexity. I found some examples online that I am practicing. Is my work correct?
T(N) = 3T(N/3) + O(N)
Will have time complexity Θ(n), because log(base 3) 3 = 1. Thus, Θ(n^1) + O(N) is simplified to Θ(n).
T(N) = 3T(2N/3) + O(1)
This one I don't understand. I know it is the stooge sort recurrence, but using master's theorem, wouldn't a and b both be 3, making log(base 3) 3 = 1 and this Θ(n)? I know that is incorrect, but I am having a tough time understanding master's theorem.
T(N) = 4T(N/2) + O(N)
Will have time complexity Θ(n^2), because log(base 2) 4 = 2. Then, N^(log(base 2) 4) = N^2
T(N) = 2T(N/2) + O(N log(N))
Here I am thinking it is simply O(N log(N)), since log(base 2) of 2 is one.
By the master theorem:
if
T(n) = aT(n/b) + Θ(n^k)
then
if log(a)/log(b) > k then T(n) = O(n^(log(a)/log(b)))
if log(a)/log(b) < k then T(n) = O(n^k)
else T(n) = O(n^k · log n)
1. a = 3, b = 3, k = 1; log(3)/log(3) = 1 = k, hence T(n) = O(n·log n)
2. a = 3, b = 3/2, k = 0; log(3)/log(3/2) ≈ 2.71 > k, hence T(n) = O(n^(log(3)/log(3/2)))
3. a = 4, b = 2, k = 1; log(4)/log(2) = 2 > k, hence T(n) = O(n^2)
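If it helps, here is a rough Python sketch (base cases of 1 and the specific sample sizes are my own choices) that evaluates each recurrence and divides by the claimed bound; the ratios staying bounded by constants, neither blowing up nor vanishing, supports each answer:

import math

def solve(a, shrink, f, n):
    # Evaluate T(n) = a * T(shrink * n) + f(n), with T(n <= 1) = 1.
    return 1.0 if n <= 1 else a * solve(a, shrink, f, shrink * n) + f(n)

p = math.log(3) / math.log(1.5)  # log(3)/log(3/2), about 2.71
checks = [
    ("3T(n/3) + n",  3, 1/3, lambda n: n, lambda n: n * math.log2(n)),
    ("3T(2n/3) + 1", 3, 2/3, lambda n: 1, lambda n: n ** p),
    ("4T(n/2) + n",  4, 1/2, lambda n: n, lambda n: n ** 2),
]
for name, a, s, f, bound in checks:
    ratios = [solve(a, s, f, float(n)) / bound(n) for n in (2**10, 2**14, 2**18)]
    print(name, [round(r, 3) for r in ratios])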
Let's first elaborate on the master theorem and analyze your four cases.
Unrolling the recursion tree for T(n) = aT(n/b) + O(n^d), at each level k there are a^k subproblems of size n/b^k, so the work at that level is:
a^k · O((n/b^k)^d) = O(n^d · (a/b^d)^k)
And we sum up all computations from all levels and get the total:
O(n^d · [1 + (a/b^d) + (a/b^d)^2 + ... + (a/b^d)^(log_b n)])
Then we only need to analyze this total, which is a geometric series determined by the multiplicative factor (or common ratio) a/b^d.
If the common ratio is bigger than one, there will be exponential growth towards the last term, n^d · (a/b^d)^(log_b n) = n^(log_b a), which is the big O when a/b^d > 1, i.e. d < log_b a.
If the common ratio is less than one, there will be an exponential decay starting from the first term n^d, which is then the dominant one, i.e. the big O, when a/b^d < 1, i.e. d > log_b a.
If the common ratio is equal to one, the series will be a constant sequence, and summing its log_b(n) + 1 terms gives O(n^d · log n).
In your case 1, where T(N) = 3T(N/3) + O(N), we first see that the common ratio is a/b^d = 3/3^1 = 1, so summing the log_3(n) + 1 equal terms gives Θ(n log n), not Θ(n).
And for your case 2, T(N) = 3T(2N/3) + O(1), it would be a/b^d = 3/(3/2)^0 = 3 > 1 (where a = 3, b = 3/2 and d = 0), and hence the big O is the last term: O(n^(log_{3/2} 3)) ≈ O(n^2.71).
For your case 3, T(N) = 4T(N/2) + O(N), a would be 4, b would be 2 and d would be 1, so the common ratio is a/b^d = 4/2 = 2 > 1 and the last term dominates: Θ(n^(log_2 4)) = Θ(n^2).
For your fourth case, T(N) = 2T(N/2) + O(N log(N)), f(n) = n log n is not of the form n^d, so the common-ratio argument needs care. With a = 2 and b = 2, the work at level k is 2^k · (n/2^k) · log(n/2^k) = n·(log n - k); the series neither grows nor decays exponentially, and summing over the log n levels gives n · Θ(log² n). So this one is Θ(n log² n), not Θ(n log n): every level contributes.
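To see the level sums for this fourth case directly, here is a small sketch (my own, not from the course) that adds up the per-level work:

import math

def level_sum(n):
    # Sum the work 2^k * (n/2^k) * log2(n/2^k) over levels k = 0 .. log2(n) - 1
    total, k = 0.0, 0
    while n / 2**k > 1:
        m = n / 2**k
        total += 2**k * m * math.log2(m)
        k += 1
    return total

for n in (2**8, 2**12, 2**16):
    print(n, level_sum(n) / (n * math.log2(n) ** 2))  # tends to 1/2, i.e. Θ(n log² n)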
References:
https://www.coursera.org/learn/algorithmic-toolbox
Can we solve the recurrence T(n) = 2T(n/2) + n lg n with the master theorem? I am coming from a link where it is stated that we can't apply the master theorem here because it doesn't satisfy any of the three cases' conditions. On the other hand, he has taken another example,
T(n) = 27T(n/3) + Θ(n^3 lg n), and found the closed-form solution Θ(n^3 lg² n). For solving this he used the second case of the master theorem in the form: if f(n) = Θ(n^(log_b a) · (lg n)^k) for some k >= 0, then T(n) ∈ Θ(n^(log_b a) · (lg n)^(k+1)). Here my confusion arises: why can't we apply that second case to the first recurrence, when it fits it completely?
My thought: a = 2, b = 2; let k = 1. Then f(n) = Θ(n^(log_2 2) · lg n), so by that rule T(n) = Θ(n lg² n). But he has mentioned that we can't apply the master theorem to this one. I am confused why not.
Note: it must be due to f(n), because in T(n) = 2T(n/2) + n lg n we have f(n) = n lg n, and in T(n) = 27T(n/3) + Θ(n^3 lg n) we have f(n) = Θ(n^3 lg n). Please correct me if I am wrong here.
Using case 2 of the master theorem (in the extended form below), I find that
T(n) = Θ(n log²(n)).
Your link states case 2 of the theorem as:
f(n) = Θ(n^(log_b a))
while several other links, like the one from MIT, state the case as:
f(n) = Θ(n^(log_b a) · log^k(n)) for some k >= 0
With the first statement, n lg n ∉ Θ(n^(log_2 2)) = Θ(n), so none of the three cases applies; with the extended statement, k = 1 fits and gives T(n) = Θ(n^(log_2 2) · log²(n)) = Θ(n log²(n)).
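A quick Python check of that result (the base case T(1) = 1 is my own choice); the ratio against n log²(n) flattens out:

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2 T(n/2) + n lg n, with T(1) = 1
    return 1 if n <= 1 else 2 * T(n // 2) + n * math.log2(n)

for k in (8, 12, 16, 20):
    n = 2 ** k
    print(n, T(n) / (n * math.log2(n) ** 2))  # ratio tends to 1/2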