What is the recurrence if the base case is O(n)? - algorithm

We have to create an algorithm and find and solve its recurrence. Finding the recurrence has me stumped.
foo(A, C)
    if (C.Length = 0)
        Sum(A)
    else
        t = C.Pop()
        A.Push(t)
        foo(A, C)
        foo(A, C)
Initially A is empty and C.Length = n. I can't give the real algorithm because that's not allowed.
My instructor told me that I might try to use 2 variables. This is what I came up with:
T(n, i) = { n                 if i = 0
          { 2*T(n, i-1) + C   if i != 0
I couldn't solve it, so I also tried to solve a recurrence with just one variable:
T(n) = { n0             if n = 0
       { 2*T(n-1) + C   if n != 0
Where n0 is the initial value of n.
How do you form a recurrence from an algorithm where the complexity of the base case is O(n)?

Let f(n) be the complexity when C is of size n, and let N be the original size of C.
Then f(0) = N, because in the base case A holds all N elements and Sum(A) costs Θ(N); and f(n) = 2 * f(n - 1) + c, because each non-base call does constant work and then makes two recursive calls with C one element smaller.
This has the solution f(n) = N * 2^n + (2^n - 1) * c, and so f(N) = O(N * 2^N).
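As a quick sanity check of the closed form, here is a minimal C sketch (the constant c = 3 and the cost model, which simply evaluates the recurrence itself, are illustrative assumptions):

#include <stdio.h>

/* f(n) = 2*f(n-1) + c with f(0) = N, evaluated directly and
 * compared against the closed form N*2^n + (2^n - 1)*c. */
long long f(int n, long long N, long long c) {
    if (n == 0)
        return N;
    return 2 * f(n - 1, N, c) + c;
}

int main(void) {
    const long long c = 3;
    for (int N = 1; N <= 10; N++) {
        long long direct = f(N, N, c);
        long long closed = N * (1LL << N) + ((1LL << N) - 1) * c;
        printf("N=%2d  direct=%6lld  closed=%6lld\n", N, direct, closed);
    }
    return 0;
}

Both columns agree, and the growth is dominated by the N * 2^N term.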

Related

How can I solve recurrence relation T(n)=T(√n)+O(n)? [duplicate]

I know how to solve the recurrence relations using Master Method.
Also I'm aware of how to solve the recurrences below:
T(n) = sqrt(n)*T(sqrt(n)) + n
T(n) = 2*T(sqrt(n)) + lg(n)
In the above two recurrences there is the same amount of work at each level of the recursion tree, and there are a total of log log n levels in the recursion tree.
I'm having trouble in solving this one:
T(n) = 4*T(sqrt(n)) + n
EDIT:
Here n is a power of 2
Suppose that n = 2^k. We have T(2^k) = 4*T(2^(k/2)) + 2^k. Let S(k) = T(2^k). We have S(k) = 4*S(k/2) + 2^k. By the Master Theorem (case 3: 2^k grows faster than k^(log_2 4) = k^2, and the regularity condition 4*2^(k/2) <= c*2^k holds for large k), we get S(k) = O(2^k). Since S(k) = O(2^k) and S(k) = T(2^k), T(2^k) = O(2^k), which implies T(n) = O(n).
I'm having trouble in solving this one: T(n) = 4*T(sqrt(n)) + n
EDIT: Here n is a power of 2
This edit is important. So let's say that the recursion stops once n drops to 2.
So the question now is how deep the recursion tree is. Well, that is the number of times that you can take the square root of n before n gets sufficiently small (say, less than 2). If we write
n = 2^(lg n)
then on each recursive call n will have its square root taken. This is equivalent to halving the above exponent, so after k iterations we have that
n^(1/2^k) = 2^((lg n)/2^k)
We want to stop when this is less than 2, giving
2^((lg n)/2^k) = 2
(lg n)/2^k = 1
lg n = 2^k
lg lg n = k
So after lg lg n iterations of square rooting the recursion stops. (source)
For each recursion we will have 4 new branches, so the total number of leaves is 4^(depth of the tree) = 4^(lg lg n) = (lg n)^2. Each leaf does only constant work, so the leaves contribute Θ((lg n)^2) in total; the n term at the root dominates, which is consistent with the Θ(n) bound above.
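To check the lg lg n depth numerically, here is a small C sketch (the inputs are repeated squares of 2 so that the square-root chain stays exact in double precision):

#include <stdio.h>
#include <math.h>

/* Count how many square roots bring n down to 2,
 * and compare against lg lg n. */
int main(void) {
    for (double n = 16; n <= 1e18; n *= n) {  /* 16, 256, 65536, ... */
        double x = n;
        int depth = 0;
        while (x > 2.0) {
            x = sqrt(x);
            depth++;
        }
        printf("n=%.0f  depth=%d  lg lg n=%.0f\n", n, depth, log2(log2(n)));
    }
    return 0;
}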
EDIT:
Source
T(n) = 4*T(sqrt(n)) + n
     = 4*[ 4*T(n^(1/4)) + sqrt(n) ] + n = 16*T(n^(1/4)) + 4*sqrt(n) + n
     = 4^k * T(n^(1/2^k)) + sum over 0 <= i < k of 4^i * n^(1/2^i)
Since n is a power of 2, let n = 2^L and unroll until n^(1/2^k) = 2, i.e. k = lg L = lg lg n:
     = 4^(lg lg n) * T(2) + sum over 0 <= i < lg lg n of 4^i * 2^(L/2^i)
     = c * (lg n)^2 + Theta(n)
Note that level i costs 4^i * n^(1/2^i), not n: every term with i >= 1 is O(sqrt(n)), and there are only lg lg n of them, so all levels below the root together contribute O(sqrt(n) * lg lg n) = o(n). Bounding each level by n would give only the looser O(n lg lg n); the root's n term dominates, so
T(n) = Theta(n)
T(n) = 4T(√n) + n
suppose that n = 2^m. So we have:
T(2^m) = 4T(2^(m/2)) + 2^m
now let's name T(2^m) as S(m):
S(m) = 4S(m/2) + 2^m. The added term 2^m grows faster than m^(log_2 4) = m^2, so case 3 of the Master Theorem applies (the regularity condition 4*2^(m/2) <= c*2^m holds for large m), and the answer is:
S(m) = Θ(2^m)
now we step back to T(2^m):
T(2^m) = Θ(2^m)
now we need m in terms of n, and we can get it from the substitution in the second line:
n = 2^m => m = lg n
and the problem is solved:
T(n) = Θ(2^(lg n)) = Θ(n)
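To convince yourself numerically, here is a minimal C sketch of S(m) = T(2^m) (the base case S(1) = 1 is an assumption; any constant gives the same asymptotics):

#include <stdio.h>

/* S(m) = 4*S(m/2) + 2^m with S(1) = 1, where S(m) = T(2^m).
 * If T(n) = Theta(n), then S(m)/2^m should stay bounded. */
double S(int m) {
    if (m <= 1)
        return 1.0;
    return 4.0 * S(m / 2) + (double)(1ULL << m);
}

int main(void) {
    for (int m = 2; m <= 32; m *= 2)
        printf("m=%2d  S(m)/2^m = %.4f\n", m, S(m) / (double)(1ULL << m));
    return 0;
}

The ratio stays bounded (and tends to 1), consistent with T(n) = Θ(n).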

Solution to the equation using the generalization of the Master Theorem

I'm asking for help with how this proof works. I've seen examples of it, but I have trouble understanding it.
Prove the following
The solution to the equation
T(n) = a*T(n/b) + Θ(n^k * log^p n), where a ≥ 1, b > 1, p ≥ 0
T(n) = O(n^(log_b a))         if a > b^k
T(n) = O(n^k * log^(p+1) n)   if a = b^k
T(n) = O(n^k * log^p n)       if a < b^k
This is a generalization of the Master Theorem.
For x = log(n)/log(b) one has n = b^x. Divide the equation by a^x:
T(b^x)/a^x = T(b^(x-1))/a^(x-1) + Θ((b^k/a)^x * x^p * log^p b)
Unrolling this telescopes into a sum: T(b^x)/a^x = T(1) + Θ(log^p b * sum over m ≤ x of (b^k/a)^m * m^p).
The summation of terms m^p * q^m for m ≤ x is
bounded by a constant for q < 1
growing like x^(p+1) for q = 1
dominated by the last term x^p * q^x for q > 1
Recognizing q = b^k/a and substituting back gives the result
for a > b^k (q < 1): T(b^x) = O(a^x), or T(n) = O(n^(log_b a))
for a = b^k (q = 1): T(b^x) = O(x^(p+1) * a^x), or T(n) = O(n^(log_b a) * log^(p+1) n)
for a < b^k (q > 1): T(b^x) = O(x^p * b^(k*x)), or T(n) = O(n^k * log^p n)
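To see the three cases in action, here is a small numeric check (a sketch only: the parameter triples are arbitrary illustrative choices, not from the question, and the base case T(n < 2) = 1 is an assumption):

#include <stdio.h>
#include <math.h>

/* T(n) = a*T(n/b) + n^k * (log2 n)^p, with T(n < 2) = 1.
 * Each printed ratio should level off to a constant:
 *   a=4, b=2, k=1, p=1 (a > b^k): T(n) = O(n^2)
 *   a=2, b=2, k=1, p=1 (a = b^k): T(n) = O(n * log^2 n)
 *   a=2, b=2, k=2, p=1 (a < b^k): T(n) = O(n^2 * log n) */
double T(double n, double a, double b, double k, double p) {
    if (n < 2)
        return 1.0;
    return a * T(n / b, a, b, k, p) + pow(n, k) * pow(log2(n), p);
}

int main(void) {
    for (double n = 1024; n <= 1048576; n *= 32) {
        printf("n=%8.0f  %.3f  %.3f  %.3f\n", n,
               T(n, 4, 2, 1, 1) / (n * n),
               T(n, 2, 2, 1, 1) / (n * pow(log2(n), 2)),
               T(n, 2, 2, 2, 1) / (n * n * log2(n)));
    }
    return 0;
}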

complexity algorithm recurrence relation

int function(int n) {
    if (n <= 1)
        return 1;
    else
        return 2 * function(n / 2);
}
What is the recurrence relation T(n) for the running time, and why?
The complexity function of this algorithm is
T(n) = T(n / 2) + 1
T(1) = 1
Applying the Master Theorem, we get
a = 1
b = 2
c = 0 (the +1 term is Θ(1) = Θ(n^0))
log_b(a) = log_2(1) = 0 = c, thus case 2
and the result is O(log n).
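As a quick check that the number of recursive calls really is logarithmic, one can instrument the function with a call counter (an illustrative sketch):

#include <stdio.h>

static int calls;  /* number of recursive invocations */

int function(int n) {
    calls++;
    if (n <= 1)
        return 1;
    return 2 * function(n / 2);
}

int main(void) {
    for (int n = 1; n <= 1 << 20; n <<= 4) {
        calls = 0;
        function(n);
        printf("n=%8d  calls=%2d\n", n, calls);  /* calls = floor(log2 n) + 1 */
    }
    return 0;
}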
As @guillaume already correctly stated, this can be solved much more easily with a direct calculation.
You can calculate the result directly: it is the largest power of two that is less than or equal to n.
You calculate L = floor(log2(n)), and the function returns 2^L.
The complexity is O(log2 N): one recursive call per halving of n, so about log2 N operations.
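For reference, a direct, non-recursive version of the same computation (a minimal sketch of the closed form described above):

#include <stdio.h>

/* Largest power of two <= n: the same value the recursive
 * function computes, found by repeated doubling. */
int floor_pow2(int n) {
    int p = 1;
    while (p <= n / 2)
        p *= 2;
    return p;
}

int main(void) {
    for (int n = 1; n <= 9; n++)
        printf("floor_pow2(%d) = %d\n", n, floor_pow2(n));
    return 0;
}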

Solving Recurrence relation: T(n) = 3T(n/5) + lgn * lgn

Consider the following recurrence
T(n) = 3T(n/5) + lgn * lgn
What is the value of T(n)?
(A) Theta(n ^ log_5{3})
(B) Theta(n ^ log_3{5})
(C) Theta(n log n)
(D) Theta( Log n )
Answer is (A)
My approach:
lg n * lg n = Θ(n), since c2*n < lg n * lg n < c1*n for some n > n0.
(I plotted this inequality for c2 = 0.1 and c1 = 1.)
Since log_5(3) < 1, by the Master Theorem the answer would have to be Θ(n), and none of the answers match. How do I solve this problem?
Your claim that lg n * lg n = Θ(n) is false. Notice that (lg n)^2 / n tends to 0 as n goes to infinity. You can see this using l'Hopital's rule (ignoring constant factors):
lim_(n→∞) (lg n)^2 / n
= lim_(n→∞) 2 lg n / n
= lim_(n→∞) 2 / n
= 0
More generally, using similar reasoning, you can prove that lg n = o(n^ε) for any ε > 0.
Let's try to solve this recurrence using the Master Theorem. We see that there are three subproblems of size n/5 each, so we should look at the value of log_5 3. Since (lg n)^2 = o(n^(log_5 3)), we see that the recursion is bottom-heavy and can conclude that the recurrence solves to O(n^(log_5 3)), which is answer (A) in your list up above.
Hope this helps!
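A quick numeric illustration of why the original claim fails (a minimal C sketch):

#include <stdio.h>
#include <math.h>

/* (lg n)^2 / n tends to 0, so (lg n)^2 = o(n); the second column
 * shows it is even o(n^(log_5 3)). */
int main(void) {
    double e = log(3) / log(5);  /* log_5 3, about 0.683 */
    for (double n = 1e2; n <= 1e14; n *= 1e3) {
        double lg2 = pow(log2(n), 2);
        printf("n=%.0e  (lg n)^2/n = %.2e  (lg n)^2/n^%.3f = %.3f\n",
               n, lg2 / n, e, lg2 / pow(n, e));
    }
    return 0;
}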
To apply the Master Theorem we should check the relation between
n^(log_5 3) ≈ n^0.682 and (lg n)^2
Unfortunately (lg n)^2 != 2*lg(n): it is lg(n^2) that is equal to 2*lg(n).
Also, there is a big difference, in the Master Theorem, between f(n) = O(n^(log_b(a) - ε)) and f(n) = Θ(n^(log_b a)): if the former holds we can apply case 1, if the latter holds case 2 of the theorem.
With just a glance, it looks highly unlikely that (lg n)^2 = Ω(n^0.682), so let's try to prove that (lg n)^2 = O(n^0.682), i.e.:
∃ n0, c ∈ N+, such that for n > n0: (lg n)^2 < c * n^0.682
Let's take the square root of both sides (for n > 1 both sides are positive, so the inequality is preserved):
lg n < c1 * n^0.341, (where c1 = sqrt(c))
Now we can assume that lg n = log2 n (any other base only changes the constant factor, and constant factors don't matter in asymptotic analysis). Substituting n = 2^t, so that lg n = t, the inequality becomes:
t < c1 * 2^(0.341*t)
which holds, for example, with c1 = 1 for all t ≥ 10, because the exponential on the right eventually outgrows the linear left side; that is, with c = 1 and n0 = 2^10.
The same argument works with any positive exponent in place of 0.341, so in fact (lg n)^2 = O(n^(log_5(3) - ε)) for, say, ε = 0.3. Therefore it does hold true that f(n) = O(n^(log_b(a) - ε)), and we can apply case 1 of the Master Theorem and conclude that:
T(n) = Θ(n^(log_5 3))
Same result, a bit more formally.
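And a numeric sanity check of that conclusion (a minimal C sketch; the base case T(n < 5) = 1 is an assumption):

#include <stdio.h>
#include <math.h>

/* T(n) = 3*T(n/5) + (lg n)^2, with T(n < 5) = 1.
 * If T(n) = Theta(n^(log_5 3)), the ratio below should level off
 * to a constant (convergence is slow). */
double T(double n) {
    if (n < 5)
        return 1.0;
    double lg = log2(n);
    return 3.0 * T(n / 5.0) + lg * lg;
}

int main(void) {
    double e = log(3) / log(5);  /* log_5 3 */
    for (double n = 1e3; n <= 1e15; n *= 1e3)
        printf("n=%.0e  T(n)/n^%.3f = %.3f\n", n, e, T(n) / pow(n, e));
    return 0;
}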

