Time complexity and Master's theorem - algorithm

I am trying to get a better understanding of Master's Theorem and time complexity. I found some examples online that I am practicing. Is my work correct?
T(N) = 3T(N/3) + O(N)
Will have time complexity Θ(n), because log(base 3) 3 = 1. Thus, Θ(n^1) + O(N) is simplified to Θ(n).
T(N) = 3T(2N/3) + O(1)
This one I don't understand. I know it is the stooge sort recurrence, but when using the Master theorem, wouldn't a and b both be 3, making log(base 3) 3 = 1, making this Θ(n)? I know that is incorrect but I am having a tough time understanding the Master theorem.
T(N) = 4T(N/2) + O(N)
Will have time complexity Θ(n^2), because log(base 2) 4 = 2. Then, N^(log(base 2) 4) = N^2
T(N) = 2T(N/2) + O(N log(N))
Here I am thinking it is simply O(N log(N)), since log(base 2) of 2 is one.

By the Master theorem:
if
T(n) = a*T(n/b) + Theta(n^k)
if log(a)/log(b) > k then T(n) = O(n^(log(a)/log(b)))
if log(a)/log(b) < k then T(n) = O(n^k)
else T(n) = O(n^k * log n)
1. a = 3, b = 3, k = 1, log(a)/log(b) = 1 = k, hence T(n) = O(n^1 * log n) = O(n log n)
2. a = 3, b = 3/2, k = 0, log(3)/log(3/2) ≈ 2.71 > k, hence T(n) = O(n^(log(3)/log(3/2)))
3. a = 4, b = 2, k = 1, log(4)/log(2) = 2 > 1 = k, hence T(n) = O(n^2)
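To double-check classifications like these, here is a minimal Python sketch of the rule stated above; classify is a hypothetical helper name and the printed exponents are rounded, so this is only an illustration, not part of the original answer.

import math

def classify(a, b, k):
    # Apply the rule above to T(n) = a*T(n/b) + Theta(n^k):
    # compare log(a)/log(b) (= log_b a) with k.
    crit = math.log(a) / math.log(b)
    if crit > k:
        return "O(n^%.3f)" % crit      # the recursive calls dominate
    if crit < k:
        return "O(n^%g)" % k           # the work outside the calls dominates
    return "O(n^%g log n)" % k         # balanced: an extra log factor

print(classify(3, 3, 1))    # case 1 -> O(n^1 log n), i.e. O(n log n)
print(classify(3, 1.5, 0))  # case 2 (stooge sort) -> roughly O(n^2.71)
print(classify(4, 2, 1))    # case 3 -> O(n^2)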

Let's first elaborate on the master theorem and analyze your four cases.
Using the recursion tree for T(n) = a*T(n/b) + O(n^d): at level i there are a^i subproblems, each of size n/b^i, so the work done at level i is
a^i * (n/b^i)^d = n^d * (a/b^d)^i.
Summing the work of all log_b(n) + 1 levels gives the total
n^d * (1 + (a/b^d) + (a/b^d)^2 + ... + (a/b^d)^(log_b n)).
Then we only need to analyze this total, which is a geometric series determined by the multiplicative factor (or common ratio) a/b^d.
If the common ratio is bigger than one, there is exponential growth towards the last term n^d * (a/b^d)^(log_b n) = n^(log_b a), which is the big O when a/b^d > 1, i.e. d < log_b a.
If the common ratio is less than one, there is an exponential decay starting from the first term n^d, which is then the dominant one and hence the big O, when a/b^d < 1, i.e. d > log_b a.
If the common ratio is equal to one, every term equals n^d, and summing all log_b(n) + 1 of them gives Θ(n^d log n).
In your case 1 where T(N) = 3T(N/3) + O(N), we first see the common ratio a/b^d = 3/3^1 = 1, so every level contributes the same Θ(n) work across Θ(log n) levels, and T(N) = Θ(n log n), not Θ(n).
And for your case 2 T(N) = 3T(2N/3) + O(1) it would be a/b^d = 3/(3/2)^0 = 3 > 1 (where a = 3, b = 3/2 and d = 0), and hence the big O would be O(n^(log_b a)) = O(n^(log 3 / log(3/2))) ≈ O(n^2.71).
For your case 3 T(N) = 4T(N/2) + O(N), a would be 4, b would be 2 and d would be 1, so the common ratio a/b^d = 4/2 = 2 > 1, the last term dominates, and the big O is O(n^(log_2 4)) = O(n^2).
For your fourth case T(N) = 2T(N/2) + O(N log(N)), the extra work is not a pure power of N, so a single exponent d does not describe it. Looking at the recursion tree directly, level i does 2^i * (N/2^i) * log(N/2^i) = N*(log N - i) work, so the levels do not form a geometric series; summing the roughly log N levels gives N*(log N + (log N - 1) + ... + 1) = Θ(N log^2 N). So the answer here is Θ(N log^2 N), not O(N log N): the N log N term is only a log factor above N^(log_2 2) = N, which buys an extra log factor rather than letting one level dominate.
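As a rough numeric illustration of the geometric-series picture above (my sketch, not part of the original answer; level_work is a made-up helper and the level count is approximated with a floating-point log):

import math

def level_work(n, a, b, d):
    # Work n^d * (a/b^d)^i done at level i of the recursion tree of
    # T(n) = a*T(n/b) + O(n^d), for i = 0 .. ~log_b(n).
    levels = int(math.log(n, b))
    return [n**d * (a / b**d)**i for i in range(levels + 1)]

n = 2**20
w1 = level_work(n, a=3, b=3, d=1)   # case 1: common ratio 1
print(w1[0], w1[-1], sum(w1))       # every level costs n; total ~ n * log_3(n)

w3 = level_work(n, a=4, b=2, d=1)   # case 3: common ratio 2 > 1
print(w3[-1] / sum(w3))             # the last level is a constant fraction of the
                                    # total, so the sum is Theta of the last term, n^2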
References:
https://www.coursera.org/learn/algorithmic-toolbox

Related

Determine the time complexity using master theorem

How would I determine the time complexity using the Master Theorem for the given problem?
T(n) = aT(n/b) + O(n^d)
T(n) = 4 T(n/2) + n*log(n)
a = 4, b = 2, d = 1
1. O(n^d) if d > logb(a)
2. O(n^d logn) if d = logb(a)
3. O(n^logb(a)) if d < logb(a)
In my case:
log2(4) = 2 --> d < log2(4)
T(n) = O(n^logb(a))
= O(n^2)
Is this correct?
edit:
I have now made a different approach, following my professor's instructions; it results in the same answer as before.
T(n) = a*T(n/b) + f(n)
T(n) = 4 T(n/2) + n*log(n)
T(n) = 1. Θ(n^(log_b a)),           if f(n) ∈ O(n^(log_b a - ε))
       2. Θ(n^(log_b a) * log(n)),  if f(n) ∈ Θ(n^(log_b a))
       3. Θ(f(n)),                  if f(n) ∈ Ω(n^(log_b a + ε))
First I look at the 2nd case
f(n) = n*log(n)
log_b(a) = log_2(4) = 2
n^(log_2 4) = n^2
The second case does not apply because:
n*log(n) ∉ Θ(n^(log_b a)) --> f(n) grows more slowly, because n*log(n) < n^2 asymptotically
Look at the first case, since f(n) grows more slowly than n^2:
1. Ө(n^(logba)) , if --> f(n) є O(n^(logba-ε))
O(n^(log_b a - ε)) = O(n^(log_2 4 - ε)) = O(n^(2 - ε))
n*log(n) ∈ O(n^(2 - ε)), e.g. with ε = 0.5 --> works
f(n) has O(n^(2 - ε)) as an upper bound, so the first case of the Master theorem applies:
T(n) = Θ(n^(log_2 4)) = Θ(n^2)
T(n) = Θ(n^2)
The version of the master theorem that you've stated here specifically assumes that the additive term has the form O(n^d). In this case, the additive term is of the form n log n, which is not O(n^1). This means that you cannot apply the master theorem as you have above.
A useful technique when working with recurrences like these is to sandwich the recurrence between two others. Notice, for example, that T(n) is bounded by these recurrences:
S(n) = 4S(n / 2) + n
R(n) = 4R(n / 2) + n^(1+ε), where ε is some small positive number.
Try solving each of these recurrences using the master theorem and see what you get, keeping in mind that your recurrence solves to a value sandwiched between them.
Hope this helps!
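As a numeric sanity check of the sandwich argument (my addition, with an assumed base case T(1) = 1 and power-of-two arguments, neither of which the question specifies):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 4*T(n/2) + n*log2(n), assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n * math.log2(n)

for k in (8, 12, 16, 20):
    n = 2**k
    # T(n)/n^2 levels off at a constant, matching the Theta(n^2) bound
    # that both S(n) and R(n) (for small epsilon) predict
    print(n, T(n) / n**2)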

Calculating the Recurrence Relation T(n)=T(n / [(log n)^2]) + Θ(1)

I tried to solve this problem for many hours and I think the solution is O(log n/[log (log n)^2]), but I'm not sure. Is this solution correct?
Expand the equation:
T(n) = T(n/(log^2(n) * log^2(n/log^2(n)))) + Theta(1) + Theta(1)
     = T(n/(log^2(n) * (log(n) - 2*log(log(n)))^2)) + 2 * Theta(1)
We know n/(log^2(n) * (log(n) - 2*log(log(n)))^2) is greater than n/log^4(n) asymptotically. As you can see, each time the current value is divided by at most log^2(n). Hence, if we compute the height of repeatedly dividing n by log^2(n) until reaching 1, it will be a lower bound for T(n).
Hence, the height of the expansion tree will be k such that
n = (log^2(n))^k = log^(2k)(n) => (take a log)
log(n) = 2k log(log(n)) => k = log(n)/(2 * log(log(n)))
Therefore, T(n) = Omega(log(n)/log(log(n))).
For the upper bound, since the value at the i-th step of the expansion is less than n/log^i(n) (as if each step divided by log(n) instead of log^2(n)), the height of repeatedly dividing n by log(n) will be an upper bound for T(n). Hence, as:
n = log^k(n) => log(n) = k log(log(n)) => k = log(n) / log(log(n))
we can say T(n) = O(log(n) / log(log(n))).
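A small numeric check of these two bounds (my addition; the stopping threshold is an assumption, since the recurrence does not specify a base case):

import math

def height(n, stop=8.0):
    # Count the expansion steps n -> n / (log n)^2 until n falls below `stop`
    steps = 0
    while n > stop:
        n = n / (math.log(n) ** 2)
        steps += 1
    return steps

for p in (20, 50, 100, 300):
    n = 10.0 ** p
    L, LL = math.log(n), math.log(math.log(n))
    # the measured height lies between the lower bound log(n)/(2*loglog(n))
    # and the upper bound log(n)/loglog(n) derived above
    print(p, height(n), round(L / (2 * LL)), round(L / LL))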

Big-O for T(N) = 2T(N − 1) + N, T(1) = 2

How to get big-O for this?
T(N) = 2T(N − 1) + N, T(1) = 2
I got two candidate answers, O(2^N) and O(N^2), but I am not sure how to solve it correctly.
Divide T(N) by 2^N and name the result:
S(N) = T(N)/2^N
From the definition of T(N) we get
S(N) = S(N-1) + N/2^N (eq.1)
meaning that S(N) increases, but quickly converges to a constant (since N/2^N -> 0). So,
T(N)/2^N -> constant
or
T(N) = O(2^N)
Detailed proof
In the comment below Paul Hankin suggests how to complete the proof. Take eq.1 and sum from N=2 to N=M
sum_{N=2}^M S(N) = sum_{N=2}^M S(N-1) + sum_{N=2}^M N/2^N
                 = sum_{N=1}^{M-1} S(N) + sum_{N=2}^M N/2^N
thus, after canceling the terms with indexes N = 2, 3, ..., M-1 on both sides, we get
S(M) = S(1) + sum_{N=2}^M N/2^N
and since the series on the right converges as M grows (its terms are eventually bounded by 1/N^2, whose sum is known to converge), S(M) converges to a finite constant.
It's a math problem and Leandro Caniglia is right.
let b(n) = T(n) / 2^n
thus b(n) = b(n-1) + n / 2^n = b(n-2) + n / 2^n + (n-1) / 2^(n-1) ....
i / 2^i decays geometrically, so the sum of these terms converges and is bounded by some constant.
thus b(n) < C for some constant C.
thus T(n) < 2^n * C.
It is obvious that T(n) >= 2^n.
So T(n) is O(2^n)
Check by plugging the answer in the equation.
2^N = 2*2^(N-1) + N = 2^N + N
or
N^2 = 2 (N-1)^2 + N
Keeping only the dominant terms, you have
2^N ~ 2^N
or
N^2 ~ 2 N^2.
Conclude.
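For a concrete check (my addition), computing T(N) exactly from the definition shows which candidate matches:

def T(N):
    # exact T(N) = 2*T(N-1) + N with T(1) = 2, computed iteratively
    t = 2
    for i in range(2, N + 1):
        t = 2 * t + i
    return t

for N in (5, 10, 20, 40):
    # T(N)/2^N settles to a constant (5/2), while T(N)/N^2 keeps growing,
    # so T(N) = Theta(2^N) and not O(N^2)
    print(N, T(N) / 2**N, T(N) / N**2)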

Finding these three algorithm's run time

Hi I am having a tough time showing the run time of these three algorithms for T(n). Assumptions include T(0)=0.
1) This one I know is close to Fibonacci, so I know it's close to O(n) time, but I'm having trouble showing that:
T(n) = T(n-1) + T(n-2) +1
2) This one I am stumped on, but I think it's roughly O(log log n):
T(n) = T([sqrt(n)]) + n, for n >= 1, where [sqrt(n)] denotes sqrt(n) rounded down.
3) I believe this one is roughly O(n*log log n):
T(n) = 2T(n/2) + (n/(log n)) + n.
Thanks for the help in advance.
T(n) = T(n-1) + T(n-2) + 1
Assuming T(0) = 0 and T(1) = a, for some constant a, we notice that T(n) - T(n-1) = T(n-2) + 1. That is, the growth rate of the function is given by the function itself, which suggests this function has exponential growth.
Let T'(n) = T(n) + 1. Then T'(n) = T'(n-1) + T'(n-2), by the above recurrence relation, and we have eliminated the troublesome constant term. T(n) and T'(n) differ only by an additive constant of 1, so assuming they are both non-decreasing (they are), they have the same asymptotic complexity, albeit possibly for different constants n0.
To show T'(n) has asymptotic growth of O(b^n), we would need some base cases, then the hypothesis that the bound c*b^n holds for all n up to, say, k - 1, and then we'd need to show it for k, that is, c*b^(k-2) + c*b^(k-1) <= c*b^k. We can divide through by c*b^(k-2) to simplify this to 1 + b <= b^2. Rearranging, we get b^2 - b - 1 >= 0; the roots of b^2 - b - 1 = 0 are (1 +- sqrt(5))/2, and we must discard the negative one since we cannot use a negative number as the base for our exponent. So for b >= (1+sqrt(5))/2, T'(n) may be O(b^n). A similar thought experiment will show that for b <= (1+sqrt(5))/2, T'(n) may be Omega(b^n). Thus, for b = (1+sqrt(5))/2 only, T'(n) may be Theta(b^n).
Completing the proof by induction that T(n) = O(b^n) is left as an exercise.
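A quick numeric illustration of this (my addition; the base cases T(0) = 0 and T(1) = 1 are assumptions, since only T(0) = 0 is given): the ratio of consecutive values approaches the base b = (1+sqrt(5))/2 found above.

import math

def T_values(n, t0=0, t1=1):
    # values of T(i) = T(i-1) + T(i-2) + 1 for i = 0..n
    vals = [t0, t1]
    for _ in range(2, n + 1):
        vals.append(vals[-1] + vals[-2] + 1)
    return vals

vals = T_values(60)
print(vals[-1] / vals[-2])       # ratio of consecutive terms
print((1 + math.sqrt(5)) / 2)    # the golden ratio, matching the base b above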
T(n) = T([sqrt(n)]) + n
Obviously, T(n) is at least linear, assuming the boundary conditions require T(n) be nonnegative. We might guess that T(n) is Theta(n) and try to prove it. Base case: let T(0) = a and T(1) = b. Then T(2) = b + 2 and T(4) = b + 6. In both cases, a choice of c >= 1.5 will work to make T(n) < cn. Suppose that whatever our fixed value of c is works for all n up to and including k. We must show that T([sqrt(k+1)]) + (k+1) <= c*(k+1). We know that T([sqrt(k+1)]) <= c*sqrt(k+1) from the induction hypothesis. So T([sqrt(k+1)]) + (k+1) <= c*sqrt(k+1) + (k+1), and c*sqrt(k+1) + (k+1) <= c*(k+1) can be rewritten as cx + x^2 <= cx^2 (with x = sqrt(k+1)); dividing through by x (OK since k > 1) we get c + x <= cx, and solving this for c we get c >= x/(x-1) = sqrt(k+1)/(sqrt(k+1)-1). This eventually approaches 1, so for large enough n, any constant c > 1 will work.
Making this proof totally rigorous by fixing the following points is left as an exercise:
making sure enough base cases are proven so that all assumptions hold
distinguishing the cases where (a) k + 1 is a perfect square (hence [sqrt(k+1)] = sqrt(k+1)) and (b) k + 1 is not a perfect square (hence sqrt(k+1) - 1 < [sqrt(k+1)] < sqrt(k+1)).
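A numeric spot check of the Theta(n) claim above (my addition, with an assumed base case T(1) = 1):

import math

def T(n):
    # T(n) = T(floor(sqrt(n))) + n, assumed base case T(1) = 1
    if n <= 1:
        return 1
    return T(math.isqrt(n)) + n

for n in (10**3, 10**6, 10**9, 10**12):
    print(n, T(n) / n)   # the ratio stays just above 1, consistent with Theta(n)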
T(n) = 2T(n/2) + (n/(log n)) + n
This T(n) is greater than 2T(n/2) + n, which we know is the recurrence for the runtime of Mergesort, which by the Master theorem is Theta(n log n), so we know our complexity is no less than that.
Indeed, by the master theorem: T(n) = 2T(n/2) + (n/(log n)) + n = 2T(n/2) + n(1 + 1/(log n)), so
a = 2
b = 2
f(n) = n(1 + 1/(log n)) is Theta(n) (for n > 2, it is always between n and 2n)
f(n) = Theta(n) = Theta(n^(log_2 2) * log^0 n)
We're in case 2 of the Master Theorem still, so the asymptotic bound is the same as for Mergesort, Theta(n log n).
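A numeric spot check (my addition; the base case and the restriction to power-of-two arguments are assumptions):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + n/log(n) + n, assumed base case for n <= 2
    if n <= 2:
        return 1
    return 2 * T(n // 2) + n / math.log2(n) + n

for k in (8, 16, 24, 32):
    n = 2**k
    print(n, T(n) / (n * math.log2(n)))   # stays bounded near 1 -> Theta(n log n)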

The Recurrence T(n)= 2T(n/2) + (n-1)

I have this recurrence:
T(n)= 2T(n/2) + (n-1)
My try is as follow:
the tree is like this:
T(n) = 2T(n/2) + (n-1)
T(n/2) = 2T(n/4) + ((n/2)-1)
T(n/4) = 2T(n/8) + ((n/4)-1)
...
the height of the tree: (n/2^h) - 1 = 1 ⇒ h = lg n - 1 = lg n - lg 2
the cost of the last level: 2^h = 2^(lg n - lg 2) = (1/2) n
the cost of all levels until level h-1: Σ_{i=0,...,lg(2n)} (n - (2^i - 1)), which is a geometric series and equals (1/2)((1/2)n - 1)
So, T(n) = Θ(n lg n)
my question is: Is that right?
No, it isn't. You have the cost of the last level wrong, so what you derived from that is also wrong.
(I'm assuming you want to find the complexity yourself, so no more hints unless you ask.)
Edit: Some hints, as requested
To find the complexity, one usually helpful method is to recursively apply the equation and insert the result into the first,
T(n) = 2*T(n/2) + (n-1)
= 2*(2*T(n/4) + (n/2-1)) + (n-1)
= 4*T(n/4) + (n-2) + (n-1)
= 4*T(n/4) + 2*n - 3
= 4*(2*T(n/8) + (n/4-1)) + 2*n - 3
= ...
That often leads to a closed formula you can prove via induction (you don't need to carry out the proof if you have enough experience, then you see the correctness without writing down the proof).
Spoiler: You can look up the complexity in almost any resource dealing with the Master Theorem.
This can be easily solved with the Master theorem.
You have a = 2, b = 2, f(n) = n - 1 = Θ(n) and therefore c = log2(2) = 1. Since f(n) = Θ(n^c), this falls into the second case of the Master theorem, which means that the complexity is Θ(n^c * log n) = Θ(n log n).
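A short numeric confirmation (my addition, assuming a base case T(1) = 0 and power-of-two inputs):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + (n - 1), assumed base case T(1) = 0
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n - 1

for k in (4, 8, 16, 24):
    n = 2**k
    print(n, T(n) / (n * math.log2(n)))   # tends to 1, i.e. Theta(n log n)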
