Time complexity of recursion-tree - algorithm

What will be the time complexity of a function Fn(n) that recursively calls Fn(1), Fn(2), Fn(3), ..., Fn(n-1) to solve Fn(n)? Fn(1) = 1 is given as the base condition. Will it be O(n^n) or less? I think it should be less than O(n^n), but I am not able to find a way to get the correct complexity of this recursion.
The recursion tree for Fn(4) would look something like this:

              Fn(4)
           /    |    \
       Fn(3)  Fn(2)  Fn(1)
       /   \    |
   Fn(2) Fn(1) Fn(1)
     |
   Fn(1)

The recurrence would look something like this:
T(1) = 1
T(n) = Σ T(i), from i = 1 to n-1
Not particularly helpful at first glance, huh? So let's break this down into subproblems and see what they look like:
T(5) = T(4) + T(3) + T(2) + T(1)
=> T(5) = T(4) + T(3) + T(2) + 1
// The subproblems
T(4) = T(3) + T(2) + 1
T(3) = T(2) + 1
T(2) = 1
Now let's substitute some of these subproblems back into our original problem:
T(5) = T(4) + (T(3) + T(2) + 1)
=> T(5) = T(4) + T(4)    // since T(4) = T(3) + T(2) + 1
=> T(5) = 2T(4)
So we can derive that the recurrence really looks like:
T(n) = 2T(n-1)
T(n-1) = 2T(n-2)
So we can rewrite our recurrence as
T(n) = 2[ 2T(n-2) ]
T(n) = 2[ 2 [ 2T(n-3) ] ]
...
T(n) = 2^k [ T(n-k) ]
Since the base case we described earlier is
T(1) = 1
we stop unrolling when the argument reaches 1:
n - k = 1
k = n - 1
Now we can substitute that back into our recurrence:
T(n) = 2^(n-1) [ T(1) ]
T(n) = 2^(n-1) [ O(1) ]
=> T(n) = 2^(n-1)
Therefore, your recurrence is O(2^n)
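As a quick sanity check (my own addition, not part of the original answer), here is a minimal Python sketch that counts how many calls the naive recursion makes; the function name count_calls is made up for illustration. The count comes out as 2^(n-1), matching the derivation above:

def count_calls(n):
    # Total number of calls made by Fn(n), which recursively
    # calls Fn(1), Fn(2), ..., Fn(n-1); includes this call itself.
    if n == 1:
        return 1
    return 1 + sum(count_calls(i) for i in range(1, n))

for n in range(1, 8):
    print(n, count_calls(n))   # prints 1, 2, 4, 8, 16, 32, 64 = 2^(n-1)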

T(F(i)) = T(F(i - 1)) + T(F(i - 1)) + O(1), so it looks like O(2^n).
(Take a look at your recursion tree: T(F(4)) = T(F(3)) + T(F(2)) + T(F(1)) + O(1); if you substitute T(F(3)) for T(F(2)) + T(F(1)) + O(1), you get T(F(4)) = T(F(3)) + T(F(3)).)

We can prove by induction that it is O(2^n).
First we prove that F(n) = 2^(n-2) * F(1) for all n >= 2.
It is true for n = 2 (since F(2) = F(1)), so let's prove it for n + 1:
F(n+1) = F(n) + F(n-1) + ... + F(2) + F(1)
       = (2^(n-2) + 2^(n-3) + ... + 2^0) * F(1) + F(1)
       = ((2^(n-1) - 1) + 1) * F(1)    // geometric series
       = 2^(n-1) * F(1)
So we have the formula, and from it the complexity is easy to get.
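To double-check that formula numerically (a sketch of my own, with F(1) = 1), evaluate the sum recurrence directly:

def f(n):
    # F(1) = 1; F(n) = F(n-1) + ... + F(1) for n >= 2
    if n == 1:
        return 1
    return sum(f(i) for i in range(1, n))

for n in range(2, 9):
    print(n, f(n), 2 ** (n - 2))   # both columns agree: F(n) = 2^(n-2) * F(1)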

If you memoize, it will be O(n) time; you spend O(n) memory to compensate for it.
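A minimal bottom-up sketch of that idea (my own, assuming Fn(n) is simply the sum of the smaller values, which the question leaves unspecified): keeping a running sum makes each step O(1), so filling the whole table costs O(n) time and O(n) memory.

def fn(n):
    # memo[i] holds Fn(i); running holds Fn(1) + ... + Fn(i-1),
    # so each new entry is computed in O(1) time.
    memo = [0, 1]                 # memo[1] = Fn(1) = 1
    running = 1
    for i in range(2, n + 1):
        memo.append(running)      # Fn(i) = Fn(1) + ... + Fn(i-1)
        running += memo[i]
    return memo[n]

print([fn(i) for i in range(1, 8)])   # 1, 1, 2, 4, 8, 16, 32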

Related

The Recurrence T(n)= 2T(n-1) + (2^n)

Can someone please help me with this?
Use the iteration method to solve it: T(n) = 2T(n-1) + 2^n, T(0) = 1.
An explanation of the steps would be greatly appreciated.
I tried to solve the recursion as follows
T(n)= 2T(n-1) + 2^n
T(n-1) = 2T(n-2) + 2^(n-1)
T(n-2) = 2T(n-3) + 2^(n-2)
T(n-3) = 2T(n-4) + 2^(n-3)
...
T(0) = 1
Then:
T(n) = 2^k * T(n-k) + ... Here's where I get stuck.
Well, let's compute some values for small n:
T(0) = 1
T(1) = 4
T(2) = 12
T(3) = 32
T(4) = 80
T(5) = 192
The function seems to be exponential; since the recurrence contains a 2^n term, let's check whether
T(n) = f(n) * 2^n
for some unknown function f(n). Dividing by 2^n gives f(n) = T(n) / 2^n:
T(0) / 2^0 = 1
T(1) / 2^1 = 2
T(2) / 2^2 = 3
T(3) / 2^3 = 4
Looks quite clear that f(n) = n + 1 and
T(n) = (n + 1) * 2^n
Now let's prove it by induction.
Base: it holds for n = 0: (0 + 1) * 2^0 = 1
Step: from T(n - 1) = n * 2^(n - 1) we have
T(n) = 2 * T(n - 1) + 2^n =
= 2 * n * 2^(n - 1) + 2^n =
= n * 2^n + 2^n =
= (n + 1) * 2^n
So, if T(n - 1) holds, T(n) holds as well.
Q.E.D.
The closed formula for
T(n) = 2T(n-1) + 2^n, T(0) = 1
is
T(n) = (n + 1) * 2^n
Cheating way: try OEIS (the On-Line Encyclopedia of Integer Sequences) and you'll find A001787.
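For what it's worth, a short Python check of the closed formula against the recurrence (my own addition):

T = {0: 1}
for n in range(1, 11):
    T[n] = 2 * T[n - 1] + 2 ** n                  # the given recurrence
assert all(T[n] == (n + 1) * 2 ** n for n in T)   # closed form holds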

Solving the recurrence relation for rod cutting problem (without DP) using iteration method

I'm going through the dynamic-programming chapter in the CLRS book. In the rod cutting problem, the recurrence T(n) = 1 + Σ T(j), summed over j = 0 to n-1, with base case T(0) = 1, is obtained when we don't use dynamic programming. The solution is given directly as T(n) = 2^n.
I can verify that the solution is correct using induction, but I can't figure out how to arrive at it step by step from the given recurrence using the iteration (plug and chug) method. I would really appreciate some help on this matter.
T(0) = 1
T(1) = 1 + T(0)
     = 2
T(2) = (1 + T(0)) + T(1)               // the parenthesized part equals T(1)
     = T(1) + T(1)
     = 2*T(1)
     = 4
T(3) = (1 + T(0) + T(1)) + T(2)        // the parenthesized part equals T(2)
     = T(2) + T(2)
     = 2*T(2)
     = 8
T(4) = (1 + T(0) + T(1) + T(2)) + T(3)
     = T(3) + T(3)
     = 2*T(3)
     = 16
...
T(n) = 2*T(n-1) = 2^n
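A quick numeric check of that pattern (my own addition, not from the original answer):

T = {0: 1}
for n in range(1, 11):
    T[n] = 1 + sum(T[j] for j in range(n))   # T(n) = 1 + T(0) + ... + T(n-1)
assert all(T[n] == 2 ** n for n in T)        # T(n) = 2^n, as claimed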

Recursive Tribonacci Sequence Time Complexity

How do you calculate the time complexity of the recursive tribonacci function F(n) = F(n-1) + F(n-2) + F(n-3) with base cases F(0) = 0, F(1) = F(2) = 1?
It's easier to use induction to prove it's O(1.84^(n-1)).
T(n) = 1 when n <= 2 and T(n) = T(n-1) + T(n-2) + T(n-3) when n > 2.
Base case:
T(3) = 1 + 1 + 1 = 3, and 1.84^(3-1) = 1.84^2 ≈ 3.4 >= 3, so T(3) = O(1.84^(n-1)) holds.
Inductive case: assume T(k) <= 1.84^(k-1) for all k < n. Then
T(n) = T(n-1) + T(n-2) + T(n-3)
     <= 1.84^(n-2) + 1.84^(n-3) + 1.84^(n-4)
     = 1.84^(n-4) * (1.84^2 + 1.84 + 1)
     ≈ 1.84^(n-4) * 1.84^3          // since 1.84^2 + 1.84 + 1 ≈ 6.23 ≈ 1.84^3
     = 1.84^(n-1)
T(n) = O(1.84^(n-1))
If you want it to be exact, use the tribonacci constant (the real root of x^3 = x^2 + x + 1, roughly 1.839) instead, but it's tedious to show the equality. However, I can edit this to show it if you want.
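To see the growth rate empirically (my own addition), count the calls made by the naive recursion and look at successive ratios, which approach the tribonacci constant ≈ 1.839:

# t[n] = number of calls made by the naive recursive tribonacci,
# computed iteratively so the check itself stays fast.
t = [1, 1, 1]                      # T(0), T(1), T(2)
for n in range(3, 30):
    t.append(1 + t[n - 1] + t[n - 2] + t[n - 3])
for n in range(25, 30):
    print(n, t[n] / t[n - 1])      # ratio tends to ~1.839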

How to solve for this recurrence T(n) = T(n − 1) + lg(1 + 1/n), T(1) = 1?

I've been stuck on this recurrence for a while:
T(n) = T(n − 1) + lg(1 + 1/n), T(1) = 1
and it seems the master method cannot be applied to this one.
We have:
lg(1 + 1/n) = lg((n + 1) / n) = lg(n+1) - lg(n)
Hence:
T(n) - T(n - 1) = lg(n + 1) - lg(n)
T(n-1) - T(n - 2) = lg(n) - lg(n - 1)
...
T(3) - T(2) = lg(4) - lg(3)
T(2) - T(1) = lg(3) - lg(2)
Adding and telescoping, we get:
T(n) - T(1) = lg(n + 1) - lg(2)
or T(n) = 1 + lg((n + 1)/2)
Hence T(n) = O(lg(n))
Same answer as the other correct answer here, just proved differently.
All the following equations are created from the given recurrence:
T(n) = T(n-1) + Log((n+1)/n)
T(n-1) = T(n-2) + Log(n/(n-1))
...
T(2) = T(1) + Log(3/2)
Summing all RHS and LHS in the above equations results in:
T(n) = T(1) + Log(3/2) + Log(4/3) + ... + Log((n+1)/n)
Since Log(a) + Log(b) = Log(ab),
T(n) = 1 + Log((n+1)/2)
T(n) = Log(10) + Log((n+1)/2) = Log(5n + 5), assuming the base is 10 and using 1 = Log(10) in base 10.
Therefore T(n) = O(Log(5n + 5)) = O(Log(n))
It is not linear, as some people claim; it is O(log(n)). Here is the mathematical analysis.
If you start unrolling the recursion you will get:
T(n) = T(n-1) + lg(1 + 1/n) = T(n-2) + lg(1 + 1/(n-1)) + lg(1 + 1/n) = ...
If you do this till the end you will have
T(n) = T(1) + lg(1 + 1/2) + lg(1 + 1/3) + ... + lg(1 + 1/n)
or in a short form:
T(n) = 1 + Σ lg(1 + 1/k), summed over k = 2 to n
Once you approximate the sum with an integral of lg(1 + 1/x), whose antiderivative is
(x + 1)*lg(x + 1) - x*lg(x) = x*lg(1 + 1/x) + lg(x + 1)
and finally take the limit x -> infinity, you will see that the first part, x*lg(1 + 1/x), tends to the constant lg(e).
That leaves a final solution of lg(x + 1), which is O(log(n)).
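Since all three answers land on the same closed form, here is a short numeric confirmation (my own addition):

from math import log2

T = {1: 1.0}
for n in range(2, 1001):
    T[n] = T[n - 1] + log2(1 + 1 / n)        # the given recurrence (lg = log base 2)
print(T[1000], 1 + log2((1000 + 1) / 2))     # both print ~9.9672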

Complexity of Recursion inside for loop

I'm trying to analyze the complexity of this algorithm. I predicted that it's t(n) = n*t(n) + 1 and solved t(n) via the master theorem, which gives n^log n; however, I'm not sure about that, and I'm stuck with it.
Algorithm CheckPath(m[0..n-1][0..n-1], v1, v2)
    if v1 == v2
        return 0
    cost = 0
    for i = 0 to n-1
        if m[v1][i] != 0                          // an edge from v1 to i exists
            cost2 = m[v1][i] + CheckPath(m, i, v2)
            if cost2 < cost OR cost == 0
                cost = cost2
    return cost
EDIT: I corrected costtmp to cost2. It does not go into an infinite loop, because I check if v1 == v2 and return 0.
I think your approach is wrong. You have an algorithm that runs over a graph, so try to find its complexity in terms of the underlying graph, not in terms of the recursion you run.
But to answer your original question: your recursion has exponential complexity (and probably does not terminate) unless your graph is a DAG (directed acyclic graph). The reason is that you don't mark already-reached vertices as visited, so the recursion can loop forever between two vertices u and v such that the edges (u, v) and (v, u) are both in E (or, if your graph is undirected, any edge will cause this). A sketch of the fix follows below.
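For illustration (my own sketch, not code from the question): carry a set of visited vertices so the recursion never revisits a vertex, which guarantees termination on any graph.

def check_path(m, v1, v2, visited=None):
    # Minimum-cost path search over adjacency matrix m that refuses to
    # revisit vertices; returns 0 when v1 == v2, None when no path exists.
    if v1 == v2:
        return 0
    visited = (visited or set()) | {v1}
    best = None
    for i in range(len(m)):
        if m[v1][i] != 0 and i not in visited:
            sub = check_path(m, i, v2, visited)
            if sub is not None and (best is None or m[v1][i] + sub < best):
                best = m[v1][i] + sub
    return best

This still explores exponentially many simple paths in the worst case, but it always terminates; Dijkstra or Bellman-Ford would be the standard polynomial-time replacements.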
For me it seems like:
t(n) = 1 + t(1) + t(2) + ... + t(n-3) + t(n-2) + t(n-1)
t(n-1) = 1 + t(1) + t(2) + ... + t(n-3) + t(n-2)
t(n-2) = 1 + t(1) + t(2) + ... + t(n-3)
...
t(4) = 1 + t(1) + t(2) + t(3) = 8
t(3) = 1 + t(1) + t(2) = 4
t(2) = 1 + t(1) = 2
t(1) = 1
Looking at the first few members of the sequence, it looks like the closed form is t(n) = 2^(n-1).
Can we prove this by induction?
For n == 1 we have t(1) = 2^(1-1) = 1
Suppose we have t(k) = 2^(k-1) for each k < n.
Then:
t(n) = 1 + t(1) + t(2) + ... + t(n-1) = 1 + (2^0 + 2^1 + ... + 2^(n-2)) = 1 + (2^(n-1) - 1) = 2^(n-1)
Hence, t(n) = O(2^n)
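A final sanity check of that closed form (my own addition):

t = {1: 1}
for n in range(2, 12):
    t[n] = 1 + sum(t[i] for i in range(1, n))   # t(n) = 1 + t(1) + ... + t(n-1)
assert all(t[n] == 2 ** (n - 1) for n in t)     # closed form t(n) = 2^(n-1)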
