Solving recurrence relation: T(n) = 3T(n/5) + lg n * lg n

Consider the following recurrence
T(n) = 3T(n/5) + lgn * lgn
What is the value of T(n)?
(A) Theta(n^(log_5 3))
(B) Theta(n^(log_3 5))
(C) Theta(n log n)
(D) Theta(log n)
The answer is (A).
My approach:
lg n * lg n = Theta(n), since c2·lg n < 2·lg lg n < c1·lg n for n > n0
The above inequality is shown in this picture for c2 = 0.1 and c1 = 1.
log_5{3} < 1,
Hence, by the Master Theorem, the answer has to be Theta(n), and none of the answers match. How do I solve this problem?

Your claim that lg n * lg n = Θ(n) is false. Notice that the limit of (lg n)^2 / n tends toward 0 as n goes to infinity. You can see this using l'Hôpital's rule:
lim_{n → ∞} (lg n)^2 / n
= lim_{n → ∞} 2 lg n / n
= lim_{n → ∞} 2 / n
= 0
More generally, using similar reasoning, you can prove that lg n = o(n^ε) for any ε > 0.
Let's try to solve this recurrence using the master theorem. We see that there are three subproblems of size n/5 each, so we should look at the value of log_5 3. Since (lg n)^2 = o(n^(log_5 3)), we see that the recursion is bottom-heavy and can conclude that the recurrence solves to Θ(n^(log_5 3)), which is answer (A) in your list up above.
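As a quick sanity check (a sketch of my own, not part of the original answer; it assumes the base case T(n) = 1 for n < 5), you can evaluate the recurrence numerically and watch T(n)/n^(log_5 3) settle toward a constant:

    import math

    def T(n):
        # T(n) = 3*T(n/5) + (lg n)^2, with T(n) = 1 for n < 5 (assumed base case)
        if n < 5:
            return 1.0
        return 3 * T(n / 5) + math.log2(n) ** 2

    for n in (10**3, 10**6, 10**9, 10**12):
        # the ratio flattens out, consistent with Theta(n^(log_5 3))
        print(n, T(n) / n ** math.log(3, 5))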
Hope this helps!

To apply the Master Theorem we should check the relation between
n^(log_5 3) ~= n^0.682 and (lg(n))^2
Unfortunately (lg(n))^2 != 2*lg(n): it is lg(n^2) that is equal to 2*lg(n).
Also, there is a big difference, in the Master Theorem, if f(n) is O(n^(log_b(a) - ε)) or instead Θ(n^(log_b a)): if the former holds we can apply case 1, if the latter holds case 2 of the theorem.
With just a glance, it looks highly unlikely that (lg(n))^2 = Ω(n^0.682), so let's try to prove that (lg(n))^2 = O(n^0.682), i.e.:
∃ n0, c ∈ N+, such that for n > n0, (lg(n))^2 < c * n^0.682
Let's take the square root of both sides (for n > 1 both sides are positive, so the inequality is preserved):
lg(n) < c1 * n^0.341, where c1 = sqrt(c)
Now we can assume that lg(n) = log_2(n) (a different base only changes a multiplicative factor, which can be absorbed by our constant; constant factors don't matter in asymptotic analysis) and exponentiate both sides:
2^(lg(n)) < 2^(c1 * n^0.341) <=> n < (2^(n^0.341))^c1
which holds with c1 = 1 for all n beyond some n0: taking lg of both sides it becomes lg(n) < n^0.341, and lg(n) = o(n^ε) for every ε > 0 (as shown in the previous answer).
Therefore, it does hold true that f(n) = O(n^(log_b(a) - ε)), and we can apply case 1 of the Master Theorem and conclude that:
T(n) = Θ(n^(log_5 3))
Same result, a bit more formally.
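For extra intuition, here is a tiny numeric comparison (my own sketch) showing that (lg(n))^2 / n^0.682 shrinks toward 0, which is exactly what f(n) = O(n^(log_b(a) - ε)) predicts here:

    import math

    # (lg n)^2 / n^0.682 should decrease toward 0 as n grows
    for n in (10**2, 10**4, 10**8, 10**16):
        print(n, math.log2(n) ** 2 / n ** 0.682)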


Finding these three algorithms' run times

Hi, I am having a tough time showing the run time of these three algorithms for T(n). Assumptions include T(0) = 0.
1) This one I know is close to Fibonacci, so I know it's close to O(n) time, but I'm having trouble showing that:
T(n) = T(n-1) + T(n-2) + 1
2) This one I am stumped on, but I think it's roughly O(log log n):
T(n) = T([sqrt(n)]) + n, for n >= 1, where [sqrt(n)] denotes sqrt(n) rounded down (the floor).
3) I believe this one is roughly O(n*log log n):
T(n) = 2T(n/2) + (n/(log n)) + n.
Thanks for the help in advance.
T(n) = T(n-1) + T(n-2) + 1
Assuming T(0) = 0 and T(1) = a, for some constant a, we notice that T(n) - T(n-1) = T(n-2) + 1. That is, the growth rate of the function is given by the function itself, which suggests this function has exponential growth.
Let T'(n) = T(n) + 1. Then T'(n) = T'(n-1) + T'(n-2), by the above recurrence relation, and we have eliminated the troublesome constant term. T(n) and T'(n) differ by an additive constant of 1, so assuming they are both non-decreasing (they are), they will have the same asymptotic complexity, albeit possibly for different constants n0.
To show T'(n) has asymptotic growth of O(b^n), we would need some base cases, then the hypothesis that the condition holds for all n up to, say, k - 1, and then we'd need to show it for k, that is, c*b^(k-2) + c*b^(k-1) <= c*b^k. We can divide through by c*b^(k-2) to simplify this to 1 + b <= b^2. Rearranging, we get b^2 - b - 1 >= 0; the roots are (1 ± sqrt(5))/2, and we must discard the negative one since we cannot use a negative number as the base for our exponent. So for b >= (1+sqrt(5))/2, T'(n) may be O(b^n). A similar thought experiment will show that for b <= (1+sqrt(5))/2, T'(n) may be Omega(b^n). Thus, for b = (1+sqrt(5))/2 only, T'(n) may be Theta(b^n).
Completing the proof by induction that T(n) = O(b^n) is left as an exercise.
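A small numeric check (my own sketch; it assumes T(1) = 1 as the value of the constant a) is consistent with growth like b^n for b = (1+sqrt(5))/2:

    from functools import lru_cache

    PHI = (1 + 5 ** 0.5) / 2  # the base b = (1+sqrt(5))/2 found above

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = T(n-1) + T(n-2) + 1, with T(0) = 0 and T(1) = 1 (assumed value for a)
        if n < 2:
            return n
        return T(n - 1) + T(n - 2) + 1

    for n in (10, 20, 30, 40):
        print(n, T(n) / PHI ** n)  # the ratio settles near a constant (about 1.17)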
T(n) = T([sqrt(n)]) + n
Obviously, T(n) is at least linear, assuming the boundary conditions require T(n) to be nonnegative. We might guess that T(n) is Theta(n) and try to prove it.
Base case: let T(0) = a and T(1) = b. Then T(2) = b + 2 and T(4) = b + 6. In both cases, a choice of c >= 1.5 will work to make T(n) < c*n.
Inductive step: suppose that whatever our fixed value of c is works for all n up to and including k. We must show that T([sqrt(k+1)]) + (k+1) <= c*(k+1). We know that T([sqrt(k+1)]) <= c*sqrt(k+1) from the induction hypothesis. So T([sqrt(k+1)]) + (k+1) <= c*sqrt(k+1) + (k+1), and c*sqrt(k+1) + (k+1) <= c*(k+1) can be rewritten as c*x + x^2 <= c*x^2 (with x = sqrt(k+1)); dividing through by x (OK since k > 1) we get c + x <= c*x, and solving this for c we get c >= x/(x-1) = sqrt(k+1)/(sqrt(k+1)-1). This eventually approaches 1, so for large enough n, any constant c > 1 will work.
Making this proof totally rigorous by fixing the following points is left as an exercise:
making sure enough base cases are proven so that all assumptions hold
distinguishing the cases where (a) k + 1 is a perfect square (hence [sqrt(k+1)] = sqrt(k+1)) and (b) k + 1 is not a perfect square (hence sqrt(k+1) - 1 < [sqrt(k+1)] < sqrt(k+1)).
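For what it's worth, a numeric spot check (my own sketch, assuming T(1) = 1 as the base case) agrees with the Theta(n) guess:

    import math

    def T(n):
        # T(n) = T(floor(sqrt(n))) + n, with T(1) = 1 (assumed base case)
        if n <= 1:
            return 1
        return T(math.isqrt(n)) + n

    for n in (10, 10**3, 10**6, 10**9):
        print(n, T(n) / n)  # stays close to 1, consistent with Theta(n)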
T(n) = 2T(n/2) + (n/(log n)) + n
This T(n) > 2T(n/2) + n, which we recognize as the recurrence for the running time of Mergesort, which by the Master Theorem is Θ(n log n), so we know our complexity is no less than that.
Indeed, by the master theorem: T(n) = 2T(n/2) + (n/(log n)) + n = 2T(n/2) + n(1 + 1/(log n)), so
a = 2
b = 2
f(n) = n(1 + 1/(log n)) is Θ(n) (for n > 2, it is always between n and 2n)
f(n) = Θ(n) = Θ(n^(log_2 2) * log^0 n)
We're in case 2 of the Master Theorem, so the asymptotic bound is the same as for Mergesort, Θ(n log n).
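A numeric spot check (my own sketch; the base case T(n) = 1 for n < 2 is assumed) agrees with the Theta(n log n) bound:

    from functools import lru_cache
    import math

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 2*T(n/2) + n/lg(n) + n, with T(n) = 1 for n < 2 (assumed base case)
        if n < 2:
            return 1.0
        return 2 * T(n / 2) + n / math.log2(n) + n

    for n in (2**10, 2**16, 2**22):
        print(n, T(n) / (n * math.log2(n)))  # the ratio stays bounded, consistent with Theta(n log n)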

Solution to the equation using the generalization of the Master Theorem

I'm asking for help in explaining how the proof works. I've seen examples of it, but I have trouble understanding it.
Prove the following
The solution to the equation
T(n) = a*T(n/b) + Θ(n^k log^p n), where a >= 1, b > 1, p >= 0, is
T(n) = O(n^(log_b a))        if a > b^k
T(n) = O(n^k log^(p+1) n)    if a = b^k
T(n) = O(n^k log^p n)        if a < b^k
This is a generalization of the Master Theorem.
For x = log(n)/log(b) one has n = b^x. Divide the equation by a^x:
T(b^x)/a^x = T(b^(x-1))/a^(x-1) + Θ((b^k/a)^x · x^p · log^p b)
The summation of terms m^p·q^m for m <= x is
bounded by a constant for q < 1
growing like x^(p+1) for q = 1
dominated by the last term x^p·q^x for q > 1
Recognizing q = b^k/a and substituting back gives the result
for a > b^k: T(b^x) = O(a^x), or T(n) = O(n^(log_b a))
for a = b^k: T(b^x) = O(x^(p+1)·a^x), or T(n) = O(n^(log_b a)·log^(p+1) n)
for a < b^k: T(b^x) = O(x^p·b^(k·x)), or T(n) = O(n^k·log^p n)
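To see the middle case in action, here is a small numeric illustration (my own sketch, with sample parameters a = b = 2, k = 1, p = 1, so a = b^k): the recurrence T(n) = 2T(n/2) + n·lg(n) should grow like n·lg^(p+1)(n) = n·lg^2(n).

    from functools import lru_cache
    import math

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 2*T(n/2) + n*lg(n), with T(n) = 1 for n < 2 (assumed base case)
        if n < 2:
            return 1.0
        return 2 * T(n / 2) + n * math.log2(n)

    for n in (2**10, 2**16, 2**22):
        print(n, T(n) / (n * math.log2(n) ** 2))  # the ratio flattens out (near 1/2)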

Difference between solving T(n) = 2T(n/2) + n/log n and T(n) = 4T(n/2) + n/log n using Master Method

I recently stumbled upon a resource where recurrences of the 2T(n/2) + n/log n type were declared unsolvable by the Master Method (MM).
I accepted it as a lemma, until today, when another resource turned out to contradict it (in some sense).
As per the resource (link below): Q7 and Q18 in it are recurrences 1 and 2, respectively, of this question. The answer to Q7 says it can't be solved, giving the reason 'Polynomial difference b/w f(n) and n^(log a base b)'.
On the contrary, answer 18 solves the second recurrence (in the question here) using case 1.
http://www.csd.uwo.ca/~moreno/CS433-CS9624/Resources/master.pdf
Can somebody please clear the confusion?
If you try to apply the master theorem to
T(n) = 2T(n/2) + n/log n
You consider a = 2, b = 2, which means log_b(a) = 1.
Can you apply case 1? 0 < c < log_b(a) = 1. Is n/log n = O(n^c)? No, because n/log n grows infinitely faster than n^c.
Can you apply case 2? No. With c = 1 you would need to find some k >= 0 such that n/log n = Theta(n log^k n), and there is none.
Can you apply case 3? With c > 1, is n/log n = Omega(n^c)? No, because it is not even Omega(n).
If you try to apply the master theorem to
T(n) = 4T(n/2) + n/log n
You consider a = 4, b = 2, which means log_b(a) = 2.
Can you apply case 1? We need c < log_b(a) = 2: is n/log n = O(n^0) or n/log n = O(n^1)? Yes, indeed n/log n = O(n). Thus we have
T(n) = Theta(n^2)
Note: explanation of the condition 0 < c < 1 in case 1.
Case 1 here is really a question of analysis.
f(x) = x/log(x), g(x) = x^c, 0 < c < 1
f(x) is O(g(x)) if f(x) < M·g(x) after some x0, for some finite M, so
f(x) is O(g(x)) if f(x)/g(x) < M eventually, since we know both are positive.
This isn't true here. Set y = log x:
f2(y) = e^y / y, g2(y) = e^(c·y), 0 < c < 1
f2(y)/g2(y) = (e^y / y) / e^(c·y) = e^((1-c)·y) / y, 0 < c < 1
lim_{y→∞} f2(y)/g2(y) = +∞, hence
lim_{x→∞} f(x)/g(x) = +∞, so f(x) is not O(g(x)).
In Q18, by contrast, we have a = 4 and b = 2, thus n^(log_b a) = n^2, which has an exponent strictly bigger than the exponent of the polynomial part of n/log(n).
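A numeric spot check of the second recurrence (my own sketch, assuming T(n) = 1 for n < 2 as the base case) agrees with the Theta(n^2) result from case 1:

    from functools import lru_cache
    import math

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 4*T(n/2) + n/lg(n), with T(n) = 1 for n < 2 (assumed base case)
        if n < 2:
            return 1.0
        return 4 * T(n / 2) + n / math.log2(n)

    for n in (2**8, 2**14, 2**20):
        print(n, T(n) / n ** 2)  # the ratio approaches a constant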

How to Prove Asymptotic Notations

I want to prove the following statement
2^(⌊lg n⌋+⌈lg n⌉)∕n ∈ Θ(n)
I know that to prove it, we have to find constants c1 > 0, c2 > 0, and n0 > 0 such that
c1·g(n) <= f(n) <= c2·g(n) for all n >= n0
In other words, we have to prove both f(n) <= c2·g(n) and f(n) >= c1·g(n).
The problem is how to prove the left-hand side, i.e. the lower bound, for f(n) = 2^(⌊lg n⌋+⌈lg n⌉)/n.
Thank you
You can start by expanding the exponential: it is equal to n1·n2/n, where n1 = 2^⌊lg n⌋ and n2 = 2^⌈lg n⌉, so n1 <= n <= n2, 2·n1 > n, and 2·n > n2. The rest should be easy.
Here's a derivation for the upper bound:
2^(⌊lg n⌋+⌈lg n⌉)/n
<= 2^(2⌊lg n⌋ + 1)/n
<= 2^(2 lg n + 1)/n
= 2^(2 lg n) · 2^1 / n
= 2 n^2 / n
= 2 n
= O(n)
So we know your function can be bounded above by 2*n. Now we do the lower bound:
2^(⌊lg n⌋+⌈lg n⌉)/n
>= 2^(2⌈lg n⌉ - 1)/n
>= 2^(2 lg n - 1)/n
= 2^(2 lg n) · 2^(-1) / n
= (1/2) n^2 / n
= n/2
= Ω(n)
We now know that your function can be bounded below by n/2.
Checked on gnuplot; these bounds look good and tight. This is a purely algebraic solution using the definitions of the floor() and ceiling() functions.
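Here is a small numeric check (my own sketch) that the two bounds hold, i.e. that n/2 <= 2^(⌊lg n⌋+⌈lg n⌉)/n <= 2n:

    import math

    def f(n):
        # 2^(floor(lg n) + ceil(lg n)) / n
        return 2 ** (math.floor(math.log2(n)) + math.ceil(math.log2(n))) / n

    for n in (3, 7, 10, 1000, 12345, 2**20):
        print(n, f(n) / n)  # always lies between 0.5 and 2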

What is the recurrence if the base case is O(n)?

We have to create an algorithm and then find and solve its recurrence. Finding the recurrence has me stumped.
foo(A, C)
    if (C.Length = 0)
        Sum(A)
    else
        t = C.Pop()
        A.Push(t)
        foo(A, C)
        foo(A, C)
Initially A is empty and C.Length = n. I can't give the real algorithm because that's not allowed.
My instructor told me that I might try to use 2 variables. This is what I came up with:
T(n, i) = { n                  if i = 0
          { 2*T(n, i-1) + C    if i != 0
I couldn't solve it, so I also tried to solve a recurrence with just one variable:
T(n) = { n0               if n = 0
       { 2*T(n-1) + C     if n != 0
Where n0 is the initial value of n.
How do you form a recurrence from an algorithm where the complexity of the base case is O(n)?
Let f(n) be the complexity when C is of size n. Let N be the original size of C.
Then f(0) = N and f(n) = 2 * f(n - 1) + c.
This has the solution f(n) = N * 2^n + (2^n - 1) * c, and so f(N) = O(N * 2^N).
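A quick check of that closed form against the recurrence (my own sketch, with arbitrary sample values for N and c):

    def f_rec(n, N, c):
        # f(0) = N, f(n) = 2*f(n-1) + c
        return N if n == 0 else 2 * f_rec(n - 1, N, c) + c

    def f_closed(n, N, c):
        # claimed closed form: N*2^n + (2^n - 1)*c
        return N * 2 ** n + (2 ** n - 1) * c

    N, c = 10, 3
    print(all(f_rec(n, N, c) == f_closed(n, N, c) for n in range(15)))  # prints True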
