Recurrence relation T(n) = 3T(n-1) + n - big-o

I'm trying to solve the recurrence relation T(n) = 3T(n-1) + n and I think the answer is O(n^3) because each new node spawns three child nodes in the recurrence tree. Is this correct? And, in terms of the recurrence tree, is there a more mathematical way to approach it?

Recurrence equations are just math: Unfold, sum up and simplify.
T(n) = 3T(n-1) + n = 3 (3 T(n - 2) + (n-1)) + n
= 3^2 T(n - 2) + 3(n-1) + n
= 3^3 T(n - 3) + 3^2(n-2) + 3(n-1) + n
= ...
= 3^i T(n - i) + Σ_(j=0..i-1) 3^j * (n-j)
= Σ_(j=0..n-1) 3^j * (n-j) // assuming T(0) = 0
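As a quick sanity check, here is a small Python sketch (assuming T(0) = 0, as above) that verifies the unfolded sum really matches the recurrence:
def T_direct(n):
    # Evaluate T(n) = 3*T(n-1) + n iteratively, starting from T(0) = 0.
    t = 0
    for k in range(1, n + 1):
        t = 3 * t + k
    return t

def T_sum(n):
    # Evaluate the unfolded form: sum over j = 0..n-1 of 3^j * (n - j).
    return sum(3**j * (n - j) for j in range(n))

for n in range(1, 15):
    assert T_direct(n) == T_sum(n)
print("unfolded sum matches the recurrence for n = 1..14")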
Now we can find different upper bounds depending on how much we want to think about it:
T(n) = Σ_(j=0..n-1) 3^j * (n-j)
< n * Σ_(j=0..n-1) 3^j
= n * (3^n - 1) / 2
So T(n) = O(n*3^n).
You can also get a tighter bound by splitting the sum up into two parts:
T(n) = Σ_(j=0..n-1) 3^j * (n-j)
< n * Σ_(j=0..n-x-1) 3^j + x * Σ_(j=n-x..n-1) 3^j // n-j <= n in the first part, n-j <= x in the second
= n * (3^(n-x) - 1) / 2 + x * (3^n - 3^(n-x)) / 2
Using x = log_3(n), you get T(n) = O(3^n * log(n)).
You can also approximate the sum Σ_(j=0..n-1) 3^j * (n-j) more precisely (for example with an integral) and get the tight bound Θ(3^n).
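If you want numeric evidence for that last claim, a short sketch (again assuming T(0) = 0) shows T(n)/3^n settling towards a constant:
# Ratio T(n) / 3^n, assuming T(0) = 0; it approaches a constant (3/4),
# which is consistent with T(n) = Theta(3^n).
t = 0
for n in range(1, 31):
    t = 3 * t + n
    if n % 5 == 0:
        print(n, t / 3**n)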

Solving recurrence using the iteration method

I need help in solving T(n) = T(n/4) + T(n/3) + 2n using the iteration method (recursion tree). I am thinking it would be either Θ(2n) or Θ(n)?
It's straightforward. Because T(n/4) ≤ T(n/3) (T is non-decreasing), we have the following two inequalities:
T(n) > 2T(n/4) + 2n
and
T(n) < 2T(n/3) + 2n
Now, find an upper bound and a lower bound by expansion. From the two cases, you will find that T(n) = Theta(n).
For example, for T'(n) = 2T'(n/3) + 2n we have the following expansion:
T'(n) = 2T'(n/3) + 2n = 2^2 T'(n/3^2) + (1 + 2/3) * 2n
By induction we can show that:
T'(n) = 2^(log_3 n) T'(1) + (1 + 2/3 + 2^2/3^2 + ...) * 2n
< n + 6 * n = 7n (taking T'(1) = 1)
because 2^(log_3 n) < 2^(log_2 n) = n, and (1 + 2/3 + 2^2/3^2 + ...) is a geometric series with ratio 2/3, so it is at most 1/(1 - 2/3) = 3.
You can do the same analysis for the lower bound of T(n).
Therefore, as c_1 * n <= T(n) <= c_2 * n, we can conclude that T(n) is in Theta(n).
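If you want to see this numerically, here is a small sketch; it assumes a constant base case (T(n) = 1 for n <= 1) and rounds n/4 and n/3 down, which does not affect the asymptotics:
from functools import lru_cache

# Sketch only: constant base case T(n) = 1 for n <= 1, and n/4, n/3 rounded down.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return T(n // 4) + T(n // 3) + 2 * n

for n in (10, 100, 1000, 10_000, 100_000):
    print(n, T(n) / n)   # stays between constants, consistent with Theta(n)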

How to solve this task T(n) = 4T(n/4) + n√n?

I'm new to Divide and Conquer. I would like to know how to find the running time of this recurrence. What exactly do I have to pay attention to, and how do I proceed?
n = 1: running time = O(1)
So let's see the calculation for this:
T(n) = 4T(n/4) + n * sqrt(n)
Expanding for k steps gives
T(n) = 4^k * T(n/4^k) + n * sqrt(n) * (1 + sqrt(1/4) + sqrt(1/16) + ... + sqrt(1/4^(k-1)))
Here (1 + sqrt(1/4) + sqrt(1/16) + ...) is a geometric progression with ratio 1/2.
If we take k = log_4(n) (the base here is 4), then 4^k = n and sqrt(1/4^k) = 1/sqrt(n), so
T(n) = n * T(1) + n * sqrt(n) * 2 * (1 - 1/sqrt(n))
     = n * T(1) + 2 * n * sqrt(n) - 2n
You can also use the
Master theorem
T(n) = aT(n/b) + f(n).
If f(n) = Θ(n^d), where d ≥ 0, then
T(n) = Θ(n^d) if a < b^d,
T(n) = Θ(n^d * log n) if a = b^d,
T(n) = Θ(n^(log_b a)) if a > b^d.
Here a = 4, b = 4, d = 3/2, so a = 4 < b^d = 8, and yes, the answer is Θ(n^(3/2)).
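For completeness, here is a small sketch that plugs in the Master theorem parameters and checks the result numerically; it assumes T(1) = 1 and rounds n/4 down:
import math
from functools import lru_cache

# Master theorem parameters for T(n) = 4T(n/4) + n*sqrt(n):
a, b, d = 4, 4, 1.5
print("a =", a, " b^d =", b**d)   # a = 4 < b^d = 8, so T(n) = Theta(n^1.5)

# Numeric check, assuming T(1) = 1 and rounding n/4 down.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return 4 * T(n // 4) + n * math.sqrt(n)

for k in range(2, 10):
    n = 4**k
    print(n, T(n) / n**1.5)   # ratio approaches a constant (about 2)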

Time Complexity Solution

Help me solve this recurrence relation
T(n) = 8T(n/2) + qn , n > 1
= p , n = 1
The answer is: n^3.
Please solve it by the back-substitution method.
The following is my attempt for the question.
You are on the right track:
T(n) = 8T(n/2) + qn
=> T(n) = 8(8T(n/4) + q(n/2)) + qn = 8^2 T(n/2^2) + q(n + 8 * n/2)
=> T(n) = 8^2‌(8T(n/2^3) + q(n/2^2)) + q(n + 8 * n/2)
= 8^3 T(n/2^3) + q ( n + 8 * n/2 + 8^2 * n/2^2)
Now by induction (supposing n = 2^k and T(1) = p), after k expansions we get:
T(n) = 8^k * T(1) + q * sum_{i=0}^{k-1} 8^i * n / 2^i
Now, simplify: since 8^i / 2^i = 4^i, and by factoring n out of the summation, we have:
T(n) = p * n^3 + q * n * sum_{i=0}^{k-1} 4^i = p * n^3 + q * n * (4^k - 1) / 3
As 4^k = 2^(2k) = (2^k)^2 = n^2, we can write:
T(n) = p * n^3 + q * n * (n^2 - 1) / 3 = Theta(n^3)
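You can confirm the closed form with a quick back-substitution check; this sketch assumes n is a power of two and uses arbitrary example constants p = 1 and q = 3:
# Back-substitution check: assumes n is a power of two, T(1) = p,
# and example constants p = 1, q = 3 (any positive values work).
p, q = 1, 3

def T(n):
    return p if n == 1 else 8 * T(n // 2) + q * n

def closed_form(n):
    # p * n^3 + q * n * (n^2 - 1) / 3, as derived above (exact for integer n)
    return p * n**3 + q * n * (n * n - 1) // 3

for k in range(11):
    n = 2**k
    assert T(n) == closed_form(n)
print("closed form matches for n = 1, 2, 4, ..., 1024")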

Solving the Recurrence Relation T(n)=T(n-1)*T(n-2)+c where T(0)=1 and T(1)=2

I need to solve the recurrence relation for
T(n) = T(n - 1) * T(n - 2) + c where T(0) = 1 and T(1) = 2.
The algorithm computes 2^f(n), where f(n) is the nth Fibonacci number. I assume that T(n - 2) is approximately equal to T(n - 1) and then I rewrite the relation as
T(n) = T(n - 1)^2 + c
and finally I reach a complexity of O(n). This is the algorithm:
Power(n):
    if n == 0 then x = 1;
    else if n == 1 then x = 2;
    else x = Power(n - 1) * Power(n - 2);
    return x;
Is it true that the recurrence relation T(n) = T(n - 1) * T(n - 2) + c has O(n) complexity?
You are confusing multiplication in the algorithm with multiplication in the recurrence relation. You calculate Power(n - 1) and Power(n - 2) and multiply them, but this does not take T(n - 1) * T(n - 2) time. These values are computed separately, and then multiplied, taking T(n - 1) + T(n - 2) time, assuming constant-time multiplication.
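To make that concrete, here is a small sketch with a hypothetical helper power_with_count that counts the multiplications Power(n) performs (the count plays the role of the running time, assuming constant-time multiplication) and checks that it satisfies M(n) = M(n-1) + M(n-2) + 1:
# Hypothetical helper: returns (Power(n), number of multiplications performed),
# assuming each multiplication takes constant time.
def power_with_count(n):
    if n == 0:
        return 1, 0
    if n == 1:
        return 2, 0
    a, ma = power_with_count(n - 1)
    b, mb = power_with_count(n - 2)
    return a * b, ma + mb + 1   # work(n) = work(n-1) + work(n-2) + 1

for n in range(2, 15):
    _, m = power_with_count(n)
    assert m == power_with_count(n - 1)[1] + power_with_count(n - 2)[1] + 1
print("multiplication count follows M(n) = M(n-1) + M(n-2) + 1")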
The recurrence then becomes T(n) = T(n - 1) + T(n - 2) + c. We'll solve this by substitution. I'll skip choosing an n0; our induction hypothesis will be T(n) <= d * (2^n - 1) for some constant d. (We need the extra -d to later cancel the c; this is called strengthening the induction hypothesis, see e.g. here.)
T(n) = T(n - 1) + T(n - 2) + c
<= d * (2^(n - 1) - 1) + d * (2^(n - 2) - 1) + c (substitute by IH)
= 3d * 2^(n - 2) - 2d + c
<= d * 2^n - 2d + c (3 < 2^2)
<= d * (2^n - 1) (when d >= c)
Conclusion: T(n) ∈ O(2^n).
Bonus fact: the tight bound is actually Θ(φ^n), where φ is the golden ratio. See this answer for an explanation.
First, the recurrence that models the running time is T(n) = T(n-1) + T(n-2) + C for T(1) = T(0) = 1, for some C>0. This is because you make two recursive calls, one on n-1 and the other on n-2.
The recurrence you have is precisely the Fibonacci recurrence if we forget the additional constant (which only contributes a factor depending on C in the end). The exact solution of the Fibonacci recurrence is Binet's formula (as given on Wikipedia):
F(n) = (φ^n - ψ^n) / sqrt(5)
where φ = (1 + sqrt(5)) / 2 ≈ 1.618
and ψ = (1 - sqrt(5)) / 2 ≈ -0.618.
You could say that T(n) = Theta(1.618... ^ n).
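For numeric support, here is a short sketch (assuming T(0) = T(1) = 1 and C = 1) showing the ratio T(n)/T(n-1) approaching the golden ratio:
# T(n) = T(n-1) + T(n-2) + C, assuming T(0) = T(1) = 1 and C = 1.
C = 1
prev, cur = 1, 1
for n in range(2, 40):
    prev, cur = cur, cur + prev + C
print("T(n)/T(n-1):", cur / prev)          # about 1.618
print("golden ratio:", (1 + 5**0.5) / 2)   # 1.6180...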

How to solve: T(n) = T(n - 1) + n

I have the following worked out:
T(n) = T(n - 1) + n = O(n^2)
Now when I work this out I find that the bound is very loose. Have I done something wrong or is it just that way?
You need also a base case for your recurrence relation.
T(1) = c
T(n) = T(n-1) + n
To solve this, you can first guess a solution and then prove it works using induction.
T(n) = (n + 1) * n / 2 + c - 1
First the base case. When n = 1 this gives c as required.
For other n:
T(n)
= (n + 1) * n / 2 + c - 1
= ((n - 1) + 2) * n / 2 + c - 1
= ((n - 1) * n / 2) + (2 * n / 2) + c - 1
= (n * (n - 1) / 2 + c - 1) + (2 * n / 2)
= T(n - 1) + n
So the solution works.
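Here is a short check of the closed form against the recurrence, using an arbitrary example value c = 5:
# Check T(n) = (n + 1) * n / 2 + c - 1 against T(1) = c, T(n) = T(n-1) + n.
c = 5   # arbitrary example base value

def T(n):
    return c if n == 1 else T(n - 1) + n

for n in range(1, 200):
    assert T(n) == (n + 1) * n // 2 + c - 1
print("closed form matches the recurrence for n = 1..199")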
To get the guess in the first place, notice that your recurrence relation generates the triangular numbers when c = 1:
T(1) = 1:
*
T(2) = 3:
*
**
T(3) = 6:
*
**
***
T(4) = 10:
*
**
***
****
etc..
Intuitively a triangle is roughly half of a square, and in Big-O notation the constants can be ignored so O(n^2) is the expected result.
Think of it this way:
In each "iteration" of the recursion you do O(n) work.
Each iteration recurses on a problem of size n - 1, until n reaches the base case.
Therefore, assuming the base case takes constant time independent of n, there are O(n) iterations of the recursion.
If you have n iterations of O(n) work each, O(n)*O(n) = O(n^2).
Your analysis is correct. If you'd like more info on this way of solving recurrences, look into Recursion Trees. They are very intuitive compared to the other methods.
The solution is pretty easy for this one. You have to unroll the recursion:
T(n) = T(n-1) + n = T(n-2) + (n - 1) + n =
= T(n-3) + (n-2) + (n-1) + n = ... =
= T(0) + 1 + 2 + ... + (n-1) + n
You have an arithmetic progression here, and the sum is n * (n + 1) / 2. Technically you are missing the boundary condition here, but with any constant boundary condition you see that the recursion is O(n^2).
Looks about right, but it will depend on the base case T(1). You do n steps to get from T(n) down to T(0), and each time the added term is anywhere between 0 and n, for an average of about n/2, so the total is roughly n * n/2 = n^2/2 = O(n^2).
