Recursive Tribonacci Sequence Time Complexity - algorithm

How do you calculate the time complexity of the recursive tribonacci function F(n) = F(n-1) + F(n-2) + F(n-3) with base cases F(0) = 0, F(1) = F(2) = 1?

It's easier to use induction to prove it's O(1.84^(n-1)).
T(n) = 1 when n <= 2 and T(n) = T(n-1) + T(n-2) + T(n-3) when n > 2.
Base case:
T(3) = 1 + 1 + 1 = 3
1.84^(3-1) = 1.84^2 ≈ 3.39 ≥ 3
T(3) = O(1.84^(n-1))
Inductive case: Assume T(m) ≤ 1.84^(m-1) for all m < n. Then,
T(n) = T(n-1) + T(n-2) + T(n-3)
T(n) ≤ 1.84^(n-2) + 1.84^(n-3) + 1.84^(n-4) = 1.84^(n-4) * (1.84^2 + 1.84 + 1)
T(n) ≈ 1.84^(n-4) * 1.84^3 = 1.84^(n-1), since 1.84^3 ≈ 6.23 ≈ 1.84^2 + 1.84 + 1
T(n) = O(1.84^(n-1))
If you want the exact base, use the tribonacci constant (≈ 1.8393, the real root of x^3 = x^2 + x + 1) instead; the argument is the same, but showing the equalities is tedious. However, I can edit this to show it if you want.
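A quick numeric sanity check of this bound (a sketch; the class and method names are mine, and `steps(n)` evaluates the T(n) recurrence from above iteratively rather than recursively so large n stays cheap):

```java
// Sketch: check that T(n) = T(n-1) + T(n-2) + T(n-3), T(n) = 1 for n <= 2,
// grows roughly like 1.84^n (1.84 approximates the tribonacci constant).
public class TribSteps {
    static long steps(int n) {
        if (n <= 2) return 1;
        long a = 1, b = 1, c = 1; // T(0), T(1), T(2)
        for (int i = 3; i <= n; i++) {
            long next = a + b + c;
            a = b; b = c; c = next;
        }
        return c;
    }

    public static void main(String[] args) {
        // Consecutive ratios T(n)/T(n-1) approach the tribonacci constant ~1.8393.
        double ratio = (double) steps(30) / steps(29);
        System.out.println(steps(3) + " " + steps(4) + " " + ratio);
    }
}
```

The ratio of consecutive values converges to ≈ 1.8393, which is why 1.84 works as the base in the induction.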

Related

The Recurrence T(n)= 2T(n-1) + (2^n)

Can someone please help me with this?
Use iteration method to solve it. T(n)= 2T(n-1) + (2^n) , T(0) = 1
Explanation of steps would be greatly appreciated.
I tried to solve the recursion as follows
T(n)= 2T(n-1) + 2^n
T(n-1) = 2T(n-2) + 2^(n-1)
T(n-2) = 2T(n-3) + 2^(n-2)
T(n-3) = 2T(n-4) + 2^(n-3)
...
T(0) = 1
Then:
T(n) = 2^k * T(n-k) + ... Here's where I get stuck.
Well, let's compute some values for small n:
T(0) = 1
T(1) = 4
T(2) = 12
T(3) = 32
T(4) = 80
T(5) = 192
The function seems to be exponential; the recurrence has a 2^n term, so let's check whether
T(n) = f(n) * 2^n
where f(n) is some unknown function. If we divide by 2^n we have f(n) = T(n) / 2^n
T(0) / 2^0 = 1
T(1) / 2^1 = 2
T(2) / 2^2 = 3
T(3) / 2^3 = 4
Looks quite clear that f(n) = n + 1 and
T(n) = (n + 1) * 2^n
Now let's prove it by induction.
Base: it holds for n = 0: (0 + 1) * 2^0 = 1
Step: from T(n - 1) = n * 2^(n - 1) we have
T(n) = 2 * T(n - 1) + 2^n =
= 2 * n * 2^(n - 1) + 2^n =
= n * 2^n + 2^n =
= (n + 1) * 2^n
So, if T(n - 1) holds, T(n) holds as well.
Q.E.D.
Closed formula for
T(n) = 2T(n-1) + (2^n)
T(0) = 1
Is
T(n) = (n + 1) * 2^n
Cheating way: try OEIS (the On-Line Encyclopedia of Integer Sequences) and you'll find A001787
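The closed formula can be checked directly against the recurrence (a sketch; class and method names are mine):

```java
// Sketch: verify T(n) = 2T(n-1) + 2^n, T(0) = 1 against the
// closed form (n + 1) * 2^n for small n.
public class Unroll {
    static long byRecurrence(int n) {
        if (n == 0) return 1;
        return 2 * byRecurrence(n - 1) + (1L << n); // 2^n
    }

    static long closedForm(int n) {
        return (long) (n + 1) << n; // (n + 1) * 2^n
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 20; n++) {
            if (byRecurrence(n) != closedForm(n))
                throw new AssertionError("mismatch at n = " + n);
        }
        System.out.println("closed form matches for n = 0..20");
    }
}
```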

Recursive Time Complexity With Fibonacci Numbers?

public static int recurC(int n) {
    if (n <= 1)   // base case must cover n <= 1; with n == 1 alone,
        return 1; // recurC(n-2) recurses past 0 and never terminates
    return n + recurC(n-1) + recurC(n-2);
}
So I need to find the formal equation for T(n). I set it up as a recurrence relation with T(n) = C + T(n-1) + T(n-2). However, when I tried to evaluate it out, I got nowhere. The relation with each subsequent recursive call isn't entirely clear to me. Any help would be appreciated, thanks!
Let's analyze the recurrence relation below:
T(n) = T(n-1) + T(n-2) + C
T(0) = T(1) = 1
Notice that T(n-1) ≈ T(n-2), that is to say:
Number of operations performed with input size n - 1 is approximately equal to the number of operations performed with input size n - 2.
Therefore, we can show that:
T(n) = T(n-1) + T(n-2) + C
T(n) ≈ 2T(n-2) + C
T(n) ≈ 4T(n-4) + 3C
T(n) ≈ 8T(n-6) + 7C
...
T(n) = 2^k*T(n-2k) + (2^k-1)*C
n-2k = 0 --> k = n/2
T(n) = 2^(n/2) + (2^(n/2)-1)*C
T(n) = (1 + C)*2^(n/2) - C
Therefore T(n) grows exponentially, at least as fast as 2^(n/2) ≈ 1.414^n. (Since T(n-1) ≥ T(n-2), replacing T(n-1) by T(n-2) underestimates the count, so this is a lower bound; the tight growth rate is the golden ratio, T(n) ∈ Θ(φ^n) with φ ≈ 1.618.)
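To see the growth concretely, we can count calls directly (a sketch; the class and the `countCalls` helper are mine, and the base case is widened to n <= 1, matching T(0) = T(1) = 1, so the recursion terminates):

```java
// Sketch: count the actual number of calls recurC makes and compare
// against the exponential estimates derived above.
public class RecurCount {
    static long calls;

    static int recurC(int n) {
        calls++;
        if (n <= 1) return 1;
        return n + recurC(n - 1) + recurC(n - 2);
    }

    static long countCalls(int n) {
        calls = 0;
        recurC(n);
        return calls;
    }

    public static void main(String[] args) {
        // The call count sits between 2^(n/2) and 2^n, growing like phi^n ~ 1.618^n.
        System.out.println(countCalls(20) + " vs lower bound " + (1L << 10));
    }
}
```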

Is T(n)= T(n-1) + n always n(n+1)/2 or O(n^2)

I watched a video where they prove T(n)= T(n-1) + n is O(n^2)
I have the following expressions which are:
T(1) = 4
T(N) = T(N – 1) + N + 3, N > 1
My question is, is the expression above solved the same way, even though there is a +3 after N.
The question is a bit messed up, but I hope you get the point. If there are questions I will try to explain better.
In a word is T(N) = T(N – 1) + N + 3 = O(n^2)
T(n) = T(n-1) + (n-1) + 4 => the given equation, rewriting n + 3 as (n - 1) + 4
T(n) = T(n-1) + n-1 + T(1) ...(1)
Now, T(1) = constant.
Therefore, from eq(1),
T(n) = T(n-1) + (n-1) ...(2)
Eq(2) reduces to T(n) = T(n-k) + n*k - k*(k+1)/2 ...(3)
Upon substituting (n-k)=1 or k=(n-1) in eq(3),
we get,
T(n) = T(1) + n*(n-1) - (n-1)(n)/2
T(n) = n*(n-1)/2 => O(n^2)
PS: If we don't neglect T(1) in eq(1), the final equation we get is T(n) = n*(n-1)/2 + T(1) + 4*k with k = n-1, i.e. T(n) = n*(n-1)/2 + 4 + 4*(n-1), which still gives O(n^2) as the final answer.
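The exact closed form from the PS can be checked against the recurrence (a sketch; class and method names are mine):

```java
// Sketch: check the closed form n(n-1)/2 + 4 + 4(n-1) against the
// recurrence T(1) = 4, T(n) = T(n-1) + n + 3.
public class QuadCheck {
    static long byRecurrence(int n) {
        long t = 4; // T(1)
        for (int i = 2; i <= n; i++) t += i + 3;
        return t;
    }

    static long closedForm(int n) {
        return (long) n * (n - 1) / 2 + 4 + 4L * (n - 1);
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 100; n++)
            if (byRecurrence(n) != closedForm(n))
                throw new AssertionError("mismatch at n = " + n);
        System.out.println("T(n) = n(n-1)/2 + 4(n-1) + 4, which is O(n^2)");
    }
}
```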

Time complexity of recursion-tree

What will be the time complexity of a function Fn(n) which recursively calls Fn(1), Fn(2), Fn(3), ..., Fn(n-1) to solve Fn(n)? Fn(1) = 1 is given as the base condition. Will it be O(n^n) or less? I think it should be less than O(n^n), but I am not able to find a way to get the correct complexity of this recursion.
recursion tree for Fn(4) would be something like this
          Fn(4)
        /   |   \
   Fn(3)  Fn(2)  Fn(1)
   /   \    |
Fn(2) Fn(1) Fn(1)
  |
Fn(1)
The recurrence would look something like this:
T(1) = 1
T(n) = Σ T(i), from i = 1 to n-1
Not particularly helpful at first glance, huh? So let's break this down into subproblems and see what they look like:
T(5) = T(4) + T(3) + T(2) + T(1)
=> T(5) = T(4) + T(3) + T(2) + 1
// The sub problems
T(4) = T(3) + T(2) + 1
T(3) = T(2) + 1
T(2) = 1
Now let's substitute some of these sub problems back into our original problem:
T(5) = T(4) + T(3) + T(2) + 1
=> T(5) = T(4) + T(4)
=> T(5) = 2T(4)
So we can derive that the recurrence really looks like:
T(n) = 2T(n-1)
T(n-1) = 2T(n-2)
So we can rewrite our recurrence as
T(n) = 2[ 2T(n-2) ]
T(n) = 2[ 2 [ 2T(n-3) ] ]
...
T(n) = 2^k [ T(n-k) ]
Since our base case described earlier gives
T(2) = T(1) = 1
and the doubling step T(n) = 2T(n-1) only holds from n = 3 upwards, we stop unrolling when
n - k = 2
k = n - 2
Now we can substitute into our recurrence:
T(n) = 2^(n-2) [ T(2) ]
T(n) = 2^(n-2) [ O(1) ]
=> T(n) = 2^(n-2)
Therefore, your recurrence is O(2^n)
T(F(i)) = T(F(i-1)) + T(F(i-1)) + O(1), so it looks like O(2^n).
(Take a look at your recursion tree: T(F(4)) = T(F(3)) + T(F(2)) + T(F(1)) + O(1); if you substitute T(F(3)) for T(F(2)) + T(F(1)) + O(1), you get T(F(4)) = T(F(3)) + T(F(3)).)
We can prove by induction that it is Θ(2^n).
First we prove that F(n) = 2^(n-2) * F(1) for n ≥ 2.
It is true for n = 2, since F(2) = F(1) = 2^0 * F(1), so let's assume it holds up to n and prove it for n + 1:
F(n+1) = F(n) + F(n-1) + ... + F(1)
       = F(n) + (F(n-1) + ... + F(1))
       = F(n) + F(n)       (the recurrence itself says F(n) = F(n-1) + ... + F(1))
       = 2 * F(n) = 2 * 2^(n-2) * F(1) = 2^(n-1) * F(1)
So we have the formula, and from it the complexity O(2^n) is easy to get.
If you memoize, it will be O(n) time; you use O(n) memory to compensate for it.
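The memoized O(n) computation and the 2^(n-2) formula can be checked together (a sketch; class and method names are mine):

```java
// Sketch: compute T(n) = T(n-1) + ... + T(1), T(1) = 1 bottom-up
// (the memoized O(n) approach) and observe T(n) = 2^(n-2) for n >= 2.
public class SumRecurrence {
    static long t(int n) {
        long[] memo = new long[n + 1];
        memo[1] = 1;
        long sum = 1; // running sum of memo[1..i-1]
        for (int i = 2; i <= n; i++) {
            memo[i] = sum; // T(i) = T(i-1) + ... + T(1)
            sum += memo[i];
        }
        return memo[n];
    }

    public static void main(String[] args) {
        // T(2) = 1, T(3) = 2, T(4) = 4, T(5) = 8, ... doubling each step.
        System.out.println(t(5) + " " + t(10));
    }
}
```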

How to solve for this recurrence T(n) = T(n − 1) + lg(1 + 1/n), T(1) = 1?

I got stuck in this recurrence:
T(n) = T(n − 1) + lg(1 + 1/n), T(1) = 1?
for a while and it seems the master method cannot be applied on this one.
We have:
lg(1 + 1/n) = lg((n + 1) / n) = lg(n+1) - lg(n)
Hence:
T(n) - T(n - 1) = lg(n + 1) - lg(n)
T(n-1) - T(n - 2) = lg(n) - lg(n - 1)
...
T(3) - T(2) = lg(4) - lg(3)
T(2) - T(1) = lg(3) - lg(2)
Adding and eliminating, we get:
T(n) - T(1) = lg(n + 1) - lg(2)
or T(n) = 1 + lg((n + 1)/2)
Hence T(n) = O(lg(n))
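The telescoped closed form can be checked numerically against the recurrence (a sketch; class and method names are mine, and `lg` is log base 2):

```java
// Sketch: iterate T(n) = T(n-1) + lg(1 + 1/n), T(1) = 1 and compare
// with the closed form 1 + lg((n + 1)/2) from the telescoping sum.
public class LogRecurrence {
    static double lg(double x) { return Math.log(x) / Math.log(2); }

    static double byRecurrence(int n) {
        double t = 1.0; // T(1)
        for (int i = 2; i <= n; i++) t += lg(1.0 + 1.0 / i);
        return t;
    }

    static double closedForm(int n) {
        return 1.0 + lg((n + 1) / 2.0);
    }

    public static void main(String[] args) {
        System.out.println(byRecurrence(1000) + " vs " + closedForm(1000));
    }
}
```

The two agree up to floating-point error, confirming the O(lg n) growth.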
Same answer as the other correct answer here, just proved differently.
All the following equations are created from the given recurrence:
T(n) = T(n-1) + Log((n+1)/n)
T(n-1) = T(n-2) + Log(n/(n-1))
.
.
.
T(2) = T(1) + Log(3/2)
Summing all RHS and LHS in the above equations results in:
T(n) = T(1) + Log(3/2) + Log(4/3) + ... + Log((n+1)/n)
Since Log(a) + Log(b) = Log(ab),
T(n) = 1 + Log((n+1)/2)
Assuming the base is 10 and using 1 = Log(10):
T(n) = Log(10) + Log((n+1)/2) = Log(10 * (n+1)/2) = Log(5n + 5)
Therefore T(n) = O(Log(5n + 5)) = O(Log(n))
It is not linear as some people claim. It is O(log(n)). Here is mathematical analysis:
If you start unrolling the recursion you will get:
T(n) = T(n-1) + lg(1 + 1/n) = T(n-2) + lg(1 + 1/(n-1)) + lg(1 + 1/n) = ...
If you do this till the end you will have:
T(n) = T(1) + lg(1 + 1/2) + lg(1 + 1/3) + ... + lg(1 + 1/n)
or in a short form:
T(n) = 1 + Σ lg(1 + 1/k), summed over k = 2 to n
Once you approximate the sum with an integral, you will get:
T(n) ≈ 1 + ∫ lg(1 + 1/x) dx = 1 + [ x·lg(1 + 1/x) + lg(x + 1) ], evaluated from 1 to n
Finally, if you take the limit x -> infinity:
you will see that the first part, x·lg(1 + 1/x) = lg((1 + 1/x)^x), tends to the constant lg(e).
Which gives you a final solution lg(x + 1), which is O(log(n)).
