Towers of Hanoi Closed Form Solution - algorithm

So I'm trying to find the closed form solution for the Towers of Hanoi problem. I understand that the recurrence relation is T(n) = 2T(n-1) + 1, because it takes T(n-1) moves to shift the top n-1 disks off the base disk and T(n-1) more to move them back on top of it (which is why the term appears twice), and the "+ 1" is the move of the base disk itself. However, I cannot understand why the closed form solution is 2^n - 1.
When I try to solve for the answer using back substitution, I get as far as T(n) = 8T(n-3) + 4 + 2 + 1, which generalizes to T(n) = 2^k T(n - k) + 2^(k-1) + 2^(k-2) + ... + 2^0, where k is the number of substitution steps. I know the last part is also a geometric series, which means it is (2^(n+1) - 1)/(2 - 1). But I just can't understand where the answer comes from.
edit:
Is it because the geometric series part is 2^(k-1) + 2^(k-2) + ... + 2^(k-k), not 2^k + 2^(k-1) + ... + 2^(k-k)? That would mean the geometric series sums to 2^n - 1 rather than 2^(n+1) - 1, once we use T(0) as the base case and set k = n in T(n - k).

You can easily prove it by induction. Let's assume that T(n) = 2^n - 1 is true for a given n. Then:
T(n+1) = 2*T(n) + 1
= 2*(2^n-1) + 1
= 2^(n+1) - 2 + 1
= 2^(n+1) - 1
Since we know that T(0) = 0 = 2^0 - 1, this proves that the equality T(n) = 2^n - 1 holds for every n.
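If you want to see the numbers before trusting the induction, here is a minimal sketch (class and method names are my own, not from the answer) that evaluates the recurrence directly and compares it with 2^n - 1:

public class HanoiCheck {
    // T(0) = 0, T(n) = 2*T(n-1) + 1
    static long moves(int n) {
        return n == 0 ? 0 : 2 * moves(n - 1) + 1;
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 10; n++) {
            long closedForm = (1L << n) - 1;   // 2^n - 1
            System.out.println(n + ": recurrence=" + moves(n) + ", closed form=" + closedForm);
        }
    }
}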

A trick which sometimes works is to find another function for which the relation is simpler.
So we start from T(n)=2*T(n-1)+1.
It looks similar to T(n)=2*T(n-1) which has an obvious solution.
So we should transform the equation so that +1 is inside of 2*(...).
In this case it's T(n)+1=2*T(n-1)+2=2*(T(n-1)+1).
So T(n)+1 = 2^n*(T(0)+1) and T(n) = 2^n*(T(0)+1) - 1. With T(0) = 0 this gives T(n) = 2^n - 1.
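As a quick illustration of the trick, here is a small sketch (names are my own) that watches U(n) = T(n) + 1 double at each step:

public class SubstitutionTrick {
    public static void main(String[] args) {
        long t = 0;                              // T(0) = 0
        for (int n = 1; n <= 10; n++) {
            t = 2 * t + 1;                       // T(n) = 2*T(n-1) + 1
            long u = t + 1;                      // U(n) = T(n) + 1 doubles each step
            System.out.println(n + ": U(n)=" + u + ", 2^n=" + (1L << n));
        }
    }
}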

Related

Runtime of following algorithm?

A divide and conquer algorithm solves a problem of size n by dividing it into 2 subproblems, each of size n-1, and takes O(n) time to combine their solutions. What is the runtime of this algorithm?
I'm not quite sure how to structure this recurrence relation and determine what the runtime is. Is the following relation correct?
T(n) = 2T(n-1) + O(n)
How can I get the runtime from this, if so?
Thank you so much!
Yes, your recurrence relation correctly describes your problem. To make things concrete, let's say the recurrence relation is T(n) = 2T(n-1) + n (that is, +n rather than +O(n)).
Then, telescoping the recurrence relation (and assuming T(0) = 0):
T(n) = n + 2(n-1) + 4(n-2) + 8(n-3) + ... + 2^n(n-n)
= (1 + 2 + 4 + ... + 2^n)n - (0*2^0 + 1*2^1 + ... + n*2^n)
= n*(2^(n+1)-1) - 2(n*2^n-2^n+1)
= 2^(n+1) - n - 2
Checking this is correct:
2T(n-1) + n
= 2(2^n - (n-1) - 2) + n
= (2^(n+1) - 2n + 2 - 4) + n
= 2^(n+1) - n - 2
= T(n)
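A small numeric check of the telescoped result (my own sketch, assuming T(0) = 0 as above):

public class TelescopeCheck {
    public static void main(String[] args) {
        long t = 0;                                      // T(0) = 0
        for (int n = 1; n <= 10; n++) {
            t = 2 * t + n;                               // T(n) = 2*T(n-1) + n
            long closedForm = (1L << (n + 1)) - n - 2;   // 2^(n+1) - n - 2
            System.out.println(n + ": " + t + " == " + closedForm);
        }
    }
}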

Determining the running time for recurrence relation T(n) = T(n-1)+n

How do I determine the running time (in terms of Big-Theta) for the algorithm of input size n that satisfies recurrence relation T(n) = T(n-1)+n where n >= 1 and with initial condition T(1) = 1?
Edit: I was practicing a past exam paper. Got stuck on this question. Need guidance
Look at it this way: T(n) = T(n-1) + n = T(n-2) + (n-1) + n = T(n-3) + (n-2) + (n-1) + n. This means that if you keep expanding down to the base case you get T(n) = 1 + 2 + 3 + ... + n. If you work out this sum you will see that it equals n(n+1)/2. Therefore the running time is Θ(n^2).
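If it helps, here is a tiny sketch (my own, using the stated initial condition T(1) = 1) that unrolls the recurrence and compares it with n(n+1)/2:

public class TriangularCheck {
    public static void main(String[] args) {
        long t = 1;                                  // T(1) = 1
        for (int n = 2; n <= 10; n++) {
            t = t + n;                               // T(n) = T(n-1) + n
            System.out.println(n + ": " + t + " == " + (n * (n + 1) / 2));
        }
    }
}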

Expanding Recurrence Relation and Finding Closed Form

I have a snippet of an algorithm and must find its worst-case recurrence and the recurrence's closed form. So far I have the worst-case recurrence:
T(n)= 2T(n/4) + C for n > 1.
I tried expanding it, and I have this form currently:
T(n) = 2^k T(n/4^k) + Ck
with k = log_4(n), or k = (log_2 n)/2.
I have T(1) = 1000.
I am at a loss on what to do next, or how to find its closed form exactly. I still cannot see a pattern in the algorithm or my expansion of T(n). Any insight would be great, thank you.
What you can get is a closed formula when n = 4^k:
T(4^k) = 2^k x 10^3 + C + 2C + ... + 2^(k-1)C
= 2^k x 10^3 + (2^k - 1)C
Where the last equality comes from the geometric series formula.
For all other n, I think the best you can do is to apply the master theorem.
Your equation falls in case 1 of the theorem (you have a = 2, b = 4, c = 0).
Therefore:
log_b(a) = 1 / 2
and
T(n) = O(sqrt(n))
I'm not sure if it admits a unique closed form.
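To see the closed formula at n = 4^k in action, here is a rough sketch (the concrete value C = 5 and all names are my own assumptions, since C is left symbolic in the question):

public class MasterCheck {
    static final long C = 5;                       // arbitrary constant cost

    static long T(long n) {
        return n <= 1 ? 1000 : 2 * T(n / 4) + C;   // T(1) = 1000, T(n) = 2T(n/4) + C
    }

    public static void main(String[] args) {
        for (int k = 0; k <= 8; k++) {
            long n = 1L << (2 * k);                                    // n = 4^k
            long closedForm = (1L << k) * 1000 + ((1L << k) - 1) * C;  // 2^k * 1000 + (2^k - 1)C
            System.out.println("k=" + k + ": " + T(n) + " == " + closedForm);
        }
    }
}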

Why is the complexity of computing the Fibonacci series 2^n and not n^2?

I am trying to find the complexity of the Fibonacci series using a recursion tree, and concluded that the height of the tree is O(n) in the worst case and the cost of each level is cn, hence the complexity is n*n = n^2.
How come it is O(2^n)?
The complexity of a naive recursive fibonacci is indeed 2ⁿ.
T(n) = T(n-1) + T(n-2) = T(n-2) + T(n-3) + T(n-3) + T(n-4) =
= T(n-3) + T(n-4) + T(n-4) + T(n-5) + T(n-4) + T(n-5) + T(n-5) + T(n-6) = ...
In each step you call T twice, which provides an eventual asymptotic bound of:
T(n) = 2⋅2⋅...⋅2 = 2ⁿ
Bonus: the best theoretical implementation of Fibonacci is actually a closed formula, using the golden ratio:
Fib(n) = (φⁿ – (–φ)⁻ⁿ)/sqrt(5) [where φ is the golden ratio]
(However, it suffers from precision errors in real life due to floating-point arithmetic, which is not exact.)
The recursion tree for fib(n) would be something like:
                     n
                  /     \
               n-1       n-2          --------- maximum 2^1 additions
              /   \     /   \
           n-2   n-3  n-3   n-4       --------- maximum 2^2 additions
          /   \
       n-3   n-4                      --------- maximum 2^3 additions
       ........
                                      --------- maximum 2^(n-1) additions
We use n-1 in 2^(n-1) since, for example, for fib(5) we eventually go all the way down to fib(1).
Number of internal nodes = Number of leaves - 1 = 2^(n-1) - 1
Number of additions = Number of internal nodes + Number of leaves = (2^1 + 2^2 + 2^3 + ...) + 2^(n-1)
We can replace the number of internal nodes with 2^(n-1) - 1, because it will always be less than this value:
= 2^(n-1) - 1 + 2^(n-1)
~ 2^n
Look at it like this. Assume the complexity of calculating F(k), the kth Fibonacci number, by recursion is at most 2^k for k <= n. This is our induction hypothesis. Then the complexity of calculating F(n + 1) by recursion is
F(n + 1) = F(n) + F(n - 1)
which has complexity 2^n + 2^(n - 1). Note that
2^n + 2^(n - 1) = 2 * 2^n / 2 + 2^n / 2 = 3 * 2^n / 2 <= 2 * 2^n = 2^(n + 1).
We have shown by induction that the claim that calculating F(k) by recursion is at most 2^k is correct.
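For reference, the naive recursive implementation analyzed here looks something like this (a standard sketch, not code taken from the question):

public static long fib(int n) {
    if (n <= 1)
        return n;                        // base cases F(0) = 0, F(1) = 1
    return fib(n - 1) + fib(n - 2);      // two recursive calls per invocation
}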
You are correct that the depth of the tree is O(n), but you are not doing O(n) work at each level. At each level, you do O(1) work per recursive call, but each recursive call then contributes two new recursive calls, one at the level below it and one at the level two below it. This means that as you get further and further down the recursion tree, the number of calls per level grows exponentially.
Interestingly, you can actually establish the exact number of calls necessary to compute F(n) as 2F(n + 1) - 1, where F(n) is the nth Fibonacci number. We can prove this inductively. As a base case, to compute F(0) or F(1), we need to make exactly one call to the function, which terminates without making any new calls. Let's say that L(n) is the number of calls necessary to compute F(n). Then we have that
L(0) = 1 = 2*1 - 1 = 2F(1) - 1 = 2F(0 + 1) - 1
L(1) = 1 = 2*1 - 1 = 2F(2) - 1 = 2F(1 + 1) - 1
Now, for the inductive step, assume that for all n' < n, with n ≥ 2, we have L(n') = 2F(n' + 1) - 1. Then to compute F(n), we need to make 1 call to the initial function that computes F(n), which in turn fires off calls to F(n-2) and F(n-1). By the inductive hypothesis we know that F(n-1) and F(n-2) can be computed in L(n-1) and L(n-2) calls. Thus the total number of calls is
1 + L(n - 1) + L(n - 2)
= 1 + 2F((n - 1) + 1) - 1 + 2F((n - 2) + 1) - 1
= 2F(n) + 2F(n - 1) - 1
= 2(F(n) + F(n - 1)) - 1
= 2(F(n + 1)) - 1
= 2F(n + 1) - 1
Which completes the induction.
At this point, you can use Binet's formula to show that
L(n) = 2(1/√5)(((1 + √5) / 2)^n - ((1 - √5) / 2)^n) - 1
And thus L(n) = O(((1 + √5) / 2)^n). If we use the convention that
φ = (1 + √5) / 2 ≈ 1.6
We have that
L(n) = Θ(φ^n)
And since φ < 2, this is o(2^n) (using little-o notation).
Interestingly, I've chosen the name L(n) for this series because it is called the Leonardo numbers. In addition to its use here, it arises in the analysis of the smoothsort algorithm.
Hope this helps!
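If you want to check the 2F(n + 1) - 1 count empirically, here is a small sketch (all names are mine) that counts the calls made by the naive recursion and compares them with the formula:

public class CallCount {
    static long calls;

    static long fib(int n) {
        calls++;
        return n <= 1 ? n : fib(n - 1) + fib(n - 2);
    }

    static long fibIterative(int n) {            // F(n), used only for the comparison
        long a = 0, b = 1;
        for (int i = 0; i < n; i++) {
            long next = a + b;
            a = b;
            b = next;
        }
        return a;
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 15; n++) {
            calls = 0;
            fib(n);
            System.out.println(n + ": calls=" + calls + ", 2*F(n+1)-1=" + (2 * fibIterative(n + 1) - 1));
        }
    }
}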
t(n)=t(n-1)+t(n-2)
which can be solved with the tree method:
     t(n-1)      +      t(n-2)                    2^1 = 2
        |                  |
 t(n-2) + t(n-3)    t(n-3) + t(n-4)               2^2 = 4
        .                  .                      2^3 = 8
        .                  .                        .
        .                  .                        .
similarly for the last level                      2^n
This gives a total time complexity of 2 + 4 + 8 + ... + 2^n.
Summing this geometric progression, we get a time complexity of O(2^n).
The complexity of the Fibonacci series is O(F(k)), where F(k) is the kth Fibonacci number. This can be proved by induction. It is trivial for the base case. Assume that for all k <= n the complexity of computing F(k) is c*F(k) + o(F(k)). Then for k = n+1, the complexity of computing F(n+1) is c*F(n) + o(F(n)) + c*F(n-1) + o(F(n-1)) = c*(F(n) + F(n-1)) + o(F(n)) + o(F(n-1)) = O(F(n+1)).
The complexity of recursive Fibonacci series is 2^n:
This is the recurrence relation for recursive Fibonacci:
T(n) = T(n-1) + T(n-2)                              number of calls: 2
Now, solving this relation with the substitution method (substituting the values of T(n-1) and T(n-2)):
T(n) = T(n-2) + 2T(n-3) + T(n-4)                    number of calls: 4 = 2^2
Substituting the values of the terms above again, we get
T(n) = T(n-3) + 3T(n-4) + 3T(n-5) + T(n-6)          number of calls: 8 = 2^3
Continuing this way, after k substitutions we get
T(n) = {T(n-k) + ... + T(n-2k)}                     number of calls: 2^k      ... eq (3)
This implies that the maximum number of recursive calls at any level is at most 2^n.
Each of the recursive calls in equation (3) does Θ(1) work, so the time complexity is 2^n * Θ(1) = 2^n.
The O(2^n) complexity of Fibonacci number calculation only applies to the naive recursive approach. With a little extra space, you can achieve much better performance: O(n).
public static int fibonacci(int n) throws Exception {
    if (n < 0)
        throw new Exception("Can't be a negative integer");
    if (n <= 1)
        return n;
    // s1 holds F(i-2), s2 holds F(i-1); s becomes F(i) on each iteration
    int s = 0, s1 = 0, s2 = 1;
    for (int i = 2; i <= n; i++) {
        s = s1 + s2;
        s1 = s2;
        s2 = s;
    }
    return s;
}
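A quick usage example (my own addition):

public static void main(String[] args) throws Exception {
    System.out.println(fibonacci(10));   // prints 55
}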
I cannot resist the temptation of connecting a linear-time iterative algorithm for Fib to the exponential-time recursive one. If you read Jon Bentley's wonderful little book "Writing Efficient Programs", I believe it is a simple case of "caching": whenever Fib(k) is calculated, store it in an array FibCached[k]. Whenever Fib(j) is called, first check whether it is cached in FibCached[j]; if yes, return the value; if not, use recursion. (Look at the tree of calls now ...)
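A minimal sketch of the cached version described above (array size and names are my own; it assumes n is small enough for the result to fit in a long):

static long[] fibCached = new long[93];        // F(92) is the largest Fibonacci number that fits in a long

static long fib(int n) {
    if (n <= 1)
        return n;
    if (fibCached[n] != 0)
        return fibCached[n];                   // already computed: reuse the cached value
    return fibCached[n] = fib(n - 1) + fib(n - 2);
}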

solution to towers of Hanoi problem

How do I solve for the running time of the Towers of Hanoi problem? I get a recurrence relation like t(n) = 2t(n-1) + 1. After drawing the recursion tree I get at every level values like 1 + 2 + 4 + 8 + ..., and the height of the tree will be lg(n). How do I calculate the sum of the series? When do I stop?
What you get at each level of the recursion tree is a power of 2. Hence, the sum is: 2^0 + 2^1 + 2^2 + ... + 2^{n-1}.
That's a geometric sum: http://en.wikipedia.org/wiki/Geometric_progression
Let S(n) = 1 + 2 + 4 + ... + 2^{n-1}. Then: S(n) - 2*S(n) = 1 - 2^n
And finally: S(n) = 2^n - 1.
Did you check http://en.wikipedia.org/wiki/Tower_of_Hanoi? You have everything in there.
