Complexity of Recursion inside for loop - algorithm

I'm trying to analyze the complexity of this algorithm. I predicted that it is t(n) = n*t(n) + 1 and solved t(n) via the master theorem, which gives n^logn. However, I'm not sure about this and am stuck with it.
Algorithm CheckPath(m[0..n-1][0..n-1], v1, v2)
    if v1 == v2
        return 0
    cost = 0
    for i = 0 to n-1
        if m[v1][i] != 0              // any edge from v1 to i exists
            cost2 = m[v1][i] + CheckPath(m, i, v2)
            if cost2 < cost OR cost == 0
                cost = cost2
    return cost
EDIT: I corrected costtmp to cost2; it does not go into an infinite loop because I check if v1 == v2 and return 0.

I think your approach is wrong. You have an algorithm which runs over some graph, so try to find its complexity in terms of the underlying graph rather than in terms of the recursion alone.
But to answer your original question: your recursion has exponential complexity (and probably does not terminate at all) unless your graph is a DAG (directed acyclic graph). The reason is that you don't mark already-reached vertices as such. Therefore the recursion can loop infinitely between two vertices u and v such that both edges (u, v) and (v, u) are in E (and if your graph is undirected, any edge will cause this).
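To illustrate the fix described above, here is a rough Python sketch (not the asker's pseudocode, just an assumed adjacency-matrix version): it marks the vertices on the current path so the recursion always terminates, although it still enumerates simple paths and therefore remains exponential in the worst case.

import math

def check_path(m, v1, v2, visited=None):
    # Sketch of the pseudocode above with two small changes: vertices on the
    # current path are marked as visited (so the recursion cannot cycle
    # forever), and "no path" is represented by math.inf instead of 0.
    if v1 == v2:
        return 0
    visited = set() if visited is None else visited
    visited.add(v1)
    cost = math.inf
    for i in range(len(m)):
        if m[v1][i] != 0 and i not in visited:   # follow each edge out of v1 at most once
            cost = min(cost, m[v1][i] + check_path(m, i, v2, visited))
    visited.discard(v1)                          # backtrack so other branches may reuse v1
    return cost

# Tiny example graph (0 -> 1 -> 2, plus a back edge 1 -> 0 that would make
# the unmarked version recurse forever):
m = [[0, 1, 0],
     [1, 0, 2],
     [0, 0, 0]]
print(check_path(m, 0, 2))   # 3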

For me it seems like:
t(n) = 1 + t(1) + t(2) + ... + t(n-3) + t(n-2) + t(n-1)
t(n-1) = 1 + t(1) + t(2) + ... + t(n-3) + t(n-2)
t(n-2) = 1 + t(1) + t(2) + ... + t(n-3)
...
t(4) = 1 + t(1) + t(2) + t(3) = 8
t(3) = 1 + t(1) + t(2) = 4
t(2) = 1 + t(1) = 2
t(1) = 1
Looking at the first few members of the sequence, it looks like the closed form is t(n) = 2^(n-1).
Can we prove this by induction?
For n == 1 we have t(1) = 2^(1-1) = 1
Suppose we have t(k) = 2^(k-1) for each k < n.
Then:
t(n) = 1 + t(1) + t(2) + ... + t(n-1) = 1 + (2^0 + 2^1 + ... + 2^(n-2)) = 1 + (2^(n-1) - 1) = 2^(n-1)
Hence, T(n) = O(2^n)
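As a quick sanity check of that closed form, here is a small Python sketch (assuming t(1) = 1 and one unit of work per call):

# t(n) = 1 + t(1) + t(2) + ... + t(n-1), compared against the closed form 2^(n-1)
t = {1: 1}
for n in range(2, 16):
    t[n] = 1 + sum(t[i] for i in range(1, n))
    assert t[n] == 2 ** (n - 1)   # matches the closed form derived above
print(t[15])                      # 16384 = 2^14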

Related

What is the time complexity of the function T(n) = 2T(n/4) + O(1)? (Without the master theorem)

Can anybody please explain the time complexity of T(n) = 2T(n/4) + O(1) using a recurrence tree? I saw somewhere that it is O(n^(1/2)).
Just expand the equation for a few iterations, and use mathematical induction to prove the observed pattern:
T(n) = 2T(n/4) + 1 = 2(2T(n/4^2) + 1) + 1 = 2^2 T(n/4^2) + 2 + 1
Expanding down to the base case, where n/4^k = 1 and T(1) is treated as 1, hence:
T(n) = 1 + 2 + 2^2 + ... + 2^k = 2^(k+1) - 1, which is O(2^(k+1))
What is k? From the expansion, 4^k = n, so k = (1/2) log(n). Thus, T(n) is O(2^((1/2) log(n) + 1)) = O(sqrt(n)). Note that 2^(log(n)) = n.
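The same conclusion can be checked numerically; a small Python sketch, assuming T(1) = 1 and n an exact power of 4:

from math import sqrt

# T(n) = 2*T(n/4) + 1, tabulated at n = 4^k and compared with sqrt(n)
T = {1: 1}
n = 1
for _ in range(10):
    n *= 4
    T[n] = 2 * T[n // 4] + 1
    print(n, T[n], sqrt(n))   # T(n) = 2*sqrt(n) - 1, i.e. Theta(sqrt(n))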

Solving the recurrence relation for rod cutting problem (without DP) using iteration method

I'm going through the Dynamic Programming chapter in the CLRS book. In the rod cutting problem, the recurrence T(n) = 1 + T(0) + T(1) + ... + T(n-1) is obtained when we don't use dynamic programming (with base case T(0) = 1). The solution is directly given as T(n) = 2^n.
I can verify that the solution is correct using induction. But I can't seem to figure out how to arrive at this solution step-by-step from the given recurrence using iteration (plug and chug) method. I would really appreciate some help on this matter.
T(0) = 1
T(1) = 1 + T(0)
     = 2
T(2) = 1 + T(0) + T(1)                  (1 + T(0) is just T(1))
     = T(1) + T(1)
     = 2*T(1)
     = 4
T(3) = 1 + T(0) + T(1) + T(2)           (1 + T(0) + T(1) is just T(2))
     = T(2) + T(2)
     = 2*T(2)
     = 8
T(4) = 1 + T(0) + T(1) + T(2) + T(3)    (1 + T(0) + T(1) + T(2) is just T(3))
     = T(3) + T(3)
     = 2*T(3)
     = 16
  :
T(n) = 2*T(n-1) = 2^n
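The same count falls out of instrumenting the naive (non-DP) recursive rod-cutting procedure. A rough Python sketch, using a sample price table and counting how many times the function is entered:

calls = 0

def cut_rod(p, n):
    # Naive recursive rod cutting, no memoization: T(n) = 1 + T(0) + ... + T(n-1)
    global calls
    calls += 1
    if n == 0:
        return 0
    best = float("-inf")
    for i in range(1, n + 1):
        best = max(best, p[i] + cut_rod(p, n - i))
    return best

prices = [0, 1, 5, 8, 9, 10, 17, 17, 20, 24, 30]   # p[i] = price of a rod of length i
for n in range(1, 11):
    calls = 0
    cut_rod(prices, n)
    print(n, calls)   # 2, 4, 8, ..., 1024 -- exactly 2^n calls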

Recursive Tribonacci Sequence Time Complexity

How do you calculate the time complexity of the recursive tribonacci function F(n) = F(n-1) + F(n-2) + F(n-3) with base cases F(0) = 0, F(1) = F(2) = 1?
It's easier to use induction to prove it's O(1.84^(n-1)).
T(n) = 1 when n <= 2 and T(n) = T(n-1) + T(n-2) + T(n-3) when n > 2.
Base case:
T(3) = 1 + 1 + 1 = 3
1.84^(3-1) = 1.84^2 ≈ 3
So T(3) ≈ 1.84^(3-1), i.e. the claim holds for n = 3.
Inductive case: Assume T(k) ≈ 1.84^(k-1) for every k < n. Then,
T(n) = T(n-1) + T(n-2) + T(n-3)
T(n) = 1.84^(n-2) + 1.84^(n-3) + 1.84^(n-4)
T(n) ≈ 1.84^(n-1)
T(n) = O(1.84^(n-1))
If you want it to be exact, use the tribonacci constant instead, but it's tedious to show it is equal. However, I can edit this to show it if you want.
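A quick empirical check of that growth rate, as a Python sketch that computes the T(n) defined above and prints consecutive ratios:

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # The recurrence from the answer: T(n) = 1 for n <= 2, else the sum of the previous three
    return 1 if n <= 2 else T(n - 1) + T(n - 2) + T(n - 3)

for n in (10, 20, 30, 40):
    print(n, T(n) / T(n - 1))   # the ratio tends to ~1.839, the tribonacci constant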

Time complexity of recursion-tree

What will be the time complexity of a function Fn(n) which recursively calls Fn(1), Fn(2), Fn(3), ..., Fn(n-1) to solve Fn(n)? Fn(1) = 1 is given as the base condition. Will it be O(n^n) or less? I think it should be less than O(n^n), but I am not able to find a way to get the correct complexity of this recursion.
recursion tree for Fn(4) would be something like this
            Fn(4)
          /   |   \
     Fn(3)  Fn(2)  Fn(1)
     /   \     |
  Fn(2) Fn(1) Fn(1)
    |
  Fn(1)
The recurrence would look something like this:
T(1) = 1
T(n) = Σ T(i), from i = 1 to n-1
Not particularly helpful at first glance, huh? So let's break this down into subproblems and see what they look like:
T(5) = T(4) + T(3) + T(2) + T(1)
=> T(5) = T(4) + T(3) + T(2) + 1
// The sub problems
T(4) = T(3) + T(2) + 1
T(3) = T(2) + 1
T(2) = 1
Now let's substitute some of these sub problems back into our original problem:
T(5) = T(4) + T(3) + T(2) + 1
=> T(5) = T(4) + T(4)      (because T(3) + T(2) + 1 = T(4))
=> T(5) = 2T(4)
So we can derive that the recurrence really looks like:
T(n) = 2T(n-1)
T(n-1) = 2T(n-2)
So we can rewrite our recurrence as
T(n) = 2[ 2T(n-2) ]
T(n) = 2[ 2 [ 2T(n-3) ] ]
...
T(n) = 2^k [ T(n-k) ]
Since our base case, described earlier, is
T(1) = 1
the expansion stops when
n - k = 1
// Therefore
k = n - 1
Now we can substitute into our recurrence:
T(n) = 2^(n-1) [ T(1) ]
T(n) = 2^(n-1) [ O(1) ]
=> T(n) = 2^(n-1)
Therefore, your recurrence is O(2^n)
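Counting actual calls confirms this; a minimal Python sketch (the body of Fn is a dummy here, only the call pattern Fn(1), ..., Fn(n-1) matters):

calls = 0

def Fn(n):
    # Dummy function: recursively calls Fn(1), Fn(2), ..., Fn(n-1)
    global calls
    calls += 1
    if n == 1:
        return 1
    return sum(Fn(i) for i in range(1, n))

for n in range(1, 11):
    calls = 0
    Fn(n)
    print(n, calls)   # 1, 2, 4, 8, ... i.e. 2^(n-1) calls, which is O(2^n)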
T(F(I)) = T(F(I - 1)) + T(F(I - 1)) + O(1), so it looks like O(2^n).
(Take a look at your recursion tree: T(F(4)) = T(F(3)) + T(F(2)) + T(F(1)) + O(1); substituting T(F(2)) + T(F(1)) with T(F(3)) you get T(F(4)) = T(F(3)) + T(F(3)).)
We can prove by induction that it is O(2^n).
First we prove that F(n) ≈ 2^(n-1) * F(1).
It is true for n = 1, so let's prove it for n+1:
F(n+1) = F(n) + F(n-1) + ... + F(1)
       = (2^(n-1) + 2^(n-2) + ... + 2^0) * F(1)
       = ((1 - 2^n)/(1 - 2)) * F(1) = (2^n - 1) * F(1) ≈ 2^n * F(1)
So we have the formula, and from it the complexity is easy to get.
Because you memoize, it will be O(n) time. You use O(n) in memory to compensate for it.

Confused on recurrence and Big O

I know that
T(n) = T(n/2) + θ(1) resolves to O(Log N),
and my book said this is a case of Binary Search.
But, how do you know that? Is it just by the fact that Binary Search cuts the problem in half each time so it is O(Log N)?
And for T(n) = 2T(n/2) + θ(1),
why is the result O(N) and not O(Log N), when the algorithm divides the problem in half each time as well?
Then T(n) = 2T(n/2) + θ(n)
resolves to O(N Log N)? I see that the O(N) comes from θ(n) and the O(Log N) comes from T(n/2).
I am really confused about how to determine the Big O of an algorithm; I don't even know how to word it properly. I hope my question makes sense.
Thanks in advance!
An intuitive solution for these problems is to see the result of unfolding the recursive formula:
Let's assume Theta(1) is actually 1 and Theta(n) is n, for simplicity
T(n) = T(n/2) + 1 = T(n/4) + 1 + 1 = T(n/8) + 1 + 1 + 1 = ... =
= T(1) + 1 + ... + 1 [logn times] = logn
T'(n) = 2T'(n/2) + 1 = 2(2T'(n/4) + 1) + 1 = 4T'(n/4) + 2 + 1 =
= 8T'(n/8) + 4 + 2 + 1 = ... = 2^(logn) + 2^(logn-1) + ... + 1 = n + n/2 + ... + 1 =
= 2n-1
T''(n) = 2T''(n/2) + n = 2(2T''(n/4) + n/2) + n = 4T''(n/4) + 2*(n/2) + n =
= 8T''(n/8) + 4*(n/4) + 2*(n/2) + n = ... = n + n + ... + n [logn times] = nlogn
To formally prove these equations, you should use induction: assume the bound holds for T(n/2), and use it to prove that it holds for T(n), as expected.
For example, for the first formula [T(n) = T(n/2) + 1], assume the base case is T(1) = 0.
The base case trivially holds for n = 1.
Assume T(k) <= log(k) for every k <= n-1, and prove it for k = n:
T(n) = T(n/2) + 1 <= (induction hypothesis) log(n/2) + 1 = log(n/2) + log(2) = log(n/2*2) = log(n)
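To connect this to the book's remark about binary search: each call does a constant amount of work and then recurses on one half of the array, which is exactly T(n) = T(n/2) + θ(1). A minimal recursive sketch:

def binary_search(a, target, lo=0, hi=None):
    # Constant work here, then one recursive call on half the range: T(n) = T(n/2) + O(1)
    if hi is None:
        hi = len(a)
    if lo >= hi:
        return -1                # not found
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    if a[mid] < target:
        return binary_search(a, target, mid + 1, hi)
    return binary_search(a, target, lo, mid)

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3, found after O(log n) calls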
I find that an easy way to understand these is to consider the time the algorithm spends on each step of the recurrence, and then to add it all up to find the total time. First, let's consider
T(n) = T(n/2) + O(1)
where n=64. Let's add up how much the algorithm takes at each step:
T(64) = T(32) + 1 ... 1 so far
T(32) = T(16) + 1 ... 2 so far
T(16) = T(08) + 1 ... 3 so far
T(08) = T(04) + 1 ... 4 so far
T(04) = T(02) + 1 ... 5 so far
T(02) = T(01) + 1 ... 6 so far
T(01) = 1 ... 7 total
So, we can see that the algorithm took '1' time at each step. And, since each step divides the input in half, the total work is the number of times the algorithm had to divide the input in two... which is log2 n.
Next, let's consider the case where
T(n) = 2T(n/2) + O(1)
However, to make things simpler, we'll build up from the base case T(1) = 1.
T(01) = 1 ... 1 so far
now we have to do T(01) twice and then add one, so
T(02) = 2T(01) + 1 ... (1*2)+1 = 3
now we have to do T(02) twice, and then add one, so
T(04) = 2T(02) + 1 ... (3*2)+1 = 7
T(08) = 2T(04) + 1 ... (7*2)+1 = 15
T(16) = 2T(08) + 1 ... (15*2)+1 = 31
T(32) = 2T(16) + 1 ... (31*2)+1 = 63
T(64) = 2T(32) + 1 ... (63*2)+1 = 127
So we can see that here the algorithm has done 127 units of work, which is equal to the input multiplied by a constant (2) plus a constant (-1), i.e. O(n). Basically this recursion corresponds to the series n*(1 + 1/2 + 1/4 + 1/8 + ...), which sums to at most 2n.
Try using this method on T(n) = 2T(n/2) + n and see if it makes more sense to you.
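If it helps, the same bookkeeping can be done in a few lines of Python (a sketch taking θ(n) as exactly n, T(1) = 1, and n a power of two):

from math import log2

def T(n):
    # T(n) = 2*T(n/2) + n, with T(1) = 1
    return 1 if n == 1 else 2 * T(n // 2) + n

for n in (8, 64, 512, 4096):
    print(n, T(n), n * log2(n))   # T(n) = n*log2(n) + n, i.e. O(n log n)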
One visual way to find T(n) for a recursive equation is to sketch it as a tree; then:
T(n) = (number of nodes) * (time spent at each node).
In your case T(n) = 2T(n/2) + 1.
I write the 1 in the node itself and expand it into two nodes T(n/2).
Note that T(n/2) = 2T(n/4) + 1, so I do the same for it again:
                    T(n)+1
                  /        \
          T(n/2)+1          T(n/2)+1
          /      \          /      \
   T(n/4)+1   T(n/4)+1   T(n/4)+1   T(n/4)+1
     ...         ...        ...        ...
   T(1)  T(1)  ........................  T(1)
In this tree the number of leaves equals
2^(height of tree) = 2^(log(n)) = n
so the total number of nodes is about 2n. Then T(n) = (number of nodes) * 1 ≈ 2n = O(n)
