Best Case time complexity of Tower of Hanoi algorithm

I want to know the best case complexity of Tower of Hanoi algorithm.
The algorithm that I used is the standard recursive algorithm (originally attached as an image).
I have calculated the time complexity as T(n) = 2^n - 1, and the Big O as O(n).
But what is the best case complexity, and how do I calculate it?

Your closed form T(n) = 2^n - 1 is right, but your Big O is not O(n).
The correct recurrence can be represented by:
T(n) = 2*T(n-1) + 1 .... eq(1)
T(n) = (2^2)*T(n-2) + 1 + 2 .... eq(2)
T(n) = (2^3)*T(n-3) + 1 + 2 + 2^2 ....eq(3)
Therefore,
T(n) = (2^k)*T(n-k) + (2^0) + (2^1) + (2^2) + .... + (2^(k-1)) .... eq(4)
Now, substituting n-k = 1 in eq(4), we get:
T(k+1) = (2^k)*T(1) + ((2^k) - 1)
Substituting T(1) = 1, we get:
T(k+1) = 2^k + 2^k - 1 = 2^(k+1) - 1 .... eq(5)
Finally, substituting k+1 = n in eq(5) gives the Closed Form:
T(n) = 2^n - 1
Now, the answer to your question:
The algorithm performs the same 2^n - 1 moves on every input of size n, and it takes O(2^n) just to print out the steps; therefore the best case time complexity also remains exponential, i.e., O(2^n).
Since any correct solution must output all 2^n - 1 moves, you cannot find any asymptotically better algorithm.
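For reference, here is a minimal Python sketch of the standard recursive algorithm that this recurrence describes (the function and peg names are illustrative): one call on n discs spawns two calls on n-1 discs plus one move, which is exactly T(n) = 2*T(n-1) + 1.

def hanoi(n, source, target, auxiliary):
    # Move n discs from source to target, using auxiliary as scratch space.
    if n == 0:
        return
    hanoi(n - 1, source, auxiliary, target)             # costs T(n-1)
    print(f"Move disc {n} from {source} to {target}")   # the +1 term
    hanoi(n - 1, auxiliary, target, source)             # costs T(n-1)

hanoi(3, 'A', 'C', 'B')  # prints 2^3 - 1 = 7 moves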

Related

What is the time complexity of the function T(n) = 2T(n/4) + O(1)? (Without Master's theorem)

Can anybody please explain the time complexity of T(n) = 2T(n/4) + O(1) using a recurrence tree? I saw somewhere that it is O(n^(1/2)).
Just expand the equation for a few iterations, then use mathematical induction to prove the observed pattern:
T(n) = 2T(n/4) + 1 = 2(2T(n/4^2) + 1) + 1 = 2^2 T(n/4^2) + 2 + 1
Hence:
T(n) = 1 + 2 + 2^2 + ... + 2^k = 2^(k+1) - 1, which is in O(2^(k+1))
What is k? From the expansion, the base case is reached when 4^k = n, so k = (1/2)*log2(n). Thus T(n) is in O(2^((1/2)*log2(n) + 1)) = O(sqrt(n)). Note that 2^(log2(n)) = n, so 2^((1/2)*log2(n)) = sqrt(n).
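As a quick numeric sanity check (assuming the base case T(1) = 1; the function name is illustrative), evaluating the recurrence directly shows that for powers of 4 it works out to exactly 2*sqrt(n) - 1:

def T(n):
    # T(n) = 2*T(n/4) + 1, with assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 2 * T(n // 4) + 1

for n in [4**k for k in range(1, 8)]:
    print(n, T(n), int(2 * n**0.5 - 1))  # the last two columns match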

analyze algorithm of finding maximum number in array with n numbers

def maximum(array):
    max = array[0]
    counter = 0
    for i in array:
        counter += 1
        if i > max:
            max = i
    return max
I need to analyze this algorithm, which finds the maximum number in an array of n numbers. The only thing I want to know is how to get the recursive and general formula for the average case of this algorithm.
I'm not sure what you mean by "recursive and general formula for the average case of this algorithm". Your algorithm is not recursive, so how can there be a "recursive formula" for it?
Recursive way to find maximum in an array:
def findMax(Array, n):
    if n == 1:
        return Array[0]
    return max(Array[n - 1], findMax(Array, n - 1))
I guess you want Recurrence relation.
Let T(n) be the time taken to find the maximum of n elements. So, for the code written above:
T(n) = T(n-1) + 1 .... Equation I
In case you are interested to solve the recurrence relation:
T(n-1) = T((n-1)-1) + 1 = T(n-2) + 1 .... Equation II
If you substitute value of T(n-1) from Equation II into Equation I, you get:
T(n) = (T(n-2) + 1) + 1 = T(n-2) + 2
Similarly,
T(n) = T(n-3) + 3
T(n) = T(n-4) + 4
and so on..
Continuing the above for k times,
T(n) = T(n-k) + k
When n-k = 0, i.e. n = k, the equation becomes
T(n) = T(0) + n = 1 + n (taking T(0) = 1)
Therefore, the recursive algorithm we came up with has time complexity O(n).
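As a quick sanity check (the call counter is my addition, not part of the code above), you can confirm that findMax on n elements makes exactly n calls, i.e. linear time:

calls = 0  # counts invocations of findMax

def findMax(Array, n):
    global calls
    calls += 1
    if n == 1:
        return Array[0]
    return max(Array[n - 1], findMax(Array, n - 1))

data = [3, 1, 4, 1, 5, 9, 2, 6]
print(findMax(data, len(data)), calls)  # prints: 9 8 -- n calls for n = 8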
Hope it helped.

Calculating Big O complexity of Recursive Algorithms

Somehow, I find it much harder to derive Big O complexities for recursive algorithms than for iterative ones. Please provide some insight into how I should go about solving these two questions.
*assume that submethod has linear complexity
def myMethod(n)
  if (n > 0)
    submethod(n)
    myMethod(n/2)
  end
end

def myMethod(k, n)
  if (n > 0)
    submethod(k)
    myMethod(k, n/2)
  end
end
For your first problem, the recurrence will be:
T(n) = n + T(n/2)
T(n/2) = n/2 + T(n/4)
...
...
...
T(2) = 2 + T(1)
T(1) = 1 + T(0) // assuming 1/2 equals 0(integer division)
adding up we get:
T(n) = n + n/2 + n/4 + n/8 + ..... 1 + T(0)
= n(1 + 1/2 + 1/4 + 1/8 .....) + k // assuming k = T(0)
= n*(1/(1 - 1/2)) + k (sum of the geometric series a/(1-r), letting the number of terms tend to infinity)
= 2n + k
Therefore, T(n) = O(n). Remember, I have assumed that n tends to infinity, because this is what we do in asymptotic analysis.
For your second problem, it's easy to see that we perform k primitive operations every time until n becomes 0, and this happens log(n) times. Therefore, T(n) = O(k*log(n)).
All you need to do is count how many times a basic operation is executed. This is true for analysing any kind of algorithm. In your case, we will count the number of times submethod is called.
You can break down the running time of the call myMethod(n) into one call to submethod plus myMethod(n/2), which further breaks down into one more call plus myMethod(n/4). You reach the base case in the log(n)-th step, so submethod is called log(n) times. Since submethod(n) is linear, those calls cost n + n/2 + n/4 + ..., which sums to O(n), agreeing with the derivation above.
The second one is no different: submethod is again called log(n) times, but now each call costs k (constant with respect to n), so the total is O(k*log(n)).
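Here is a small experiment (an assumed Python translation of the pseudocode above, with submethod stubbed out as a linear-cost counter) that matches both bounds:

work = 0  # total primitive operations performed by submethod

def submethod(n):
    global work
    work += n  # submethod is assumed linear: n units of work

def my_method(n):
    if n > 0:
        submethod(n)
        my_method(n // 2)  # integer division, as in the analysis above

def my_method2(k, n):
    if n > 0:
        submethod(k)
        my_method2(k, n // 2)

work = 0; my_method(1024); print(work)        # 2047, roughly 2n
work = 0; my_method2(100, 1024); print(work)  # 1100 = k * (log2(n) + 1)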

Time Complexity of Sequential search

I am trying to find the time complexity for selection sort, which has the following equation: T(n) = T(n-1) + O(n)
First I supposed it's T(n) = T(n-1) + n, since n is easier to work with.
Figured T(n-1) = T(n-2) + (n-1)
and T(n-2) = T(n-3) + (n-2).
This makes T(n) = (T(n-3) + (n-2)) + (n-1) + n, so T(n) = T(n-3) + 3n - 3 (the 3 subtracted is 1 + 2).
With k instead of 3: T(n) = T(n-k) + kn - (1 + 2 + ... + (k-1)) = T(n-k) + kn - k(k-1)/2. And because n-k >= 0, set n-k = 0, so n = k. Back in the equation: T(n) = T(0) + n^2 - n(n-1)/2, which is C + (n^2 + n)/2 where C = T(0). So it's O(n^2). Is what I did right?
Yes, your solution is correct. You are combining O(n) with O(n-1), O(n-2), ... and coming up with O(n^2). You can apply O(n) + O(n-1) = O(n), but only finitely many times; in a series it is different:
T(n) = (i=0 to n)Σ O(n - i)
Ignoring the i inside the O(), your result is O(n^2).
The recurrence relation you gave, T(n) = T(n-1) + O(n), is true for selection sort, which has overall time complexity O(n^2).
In selection sort:
In iteration i, we find the index min of the smallest remaining entry,
and then swap a[i] and a[min].
As such, selection sort uses
(n-1) + (n-2) + .... + 2 + 1 + 0 = n(n-1)/2 = O(n*n) compares
and exactly n exchanges (swaps).
FROM ABOVE
And from the recurrence relation given above:
=> T(n) = T(n-1) + O(n)
=> T(n) = T(n-1) + cn, where c is some positive constant
=> T(n) = cn + c(n-1) + T(n-2)
=> T(n) = cn + c(n-1) + c(n-2) + T(n-3)
And this goes on, and we finally get
=> T(n) = cn + c(n-1) + c(n-2) + ...... + c (n terms in total)
=> T(n) = c(n(n+1)/2) + T(0)
=> T(n) = O(n*n)
EDIT
It's always better to replace Θ(n) with cn, where c is some positive constant; it helps in visualizing the equation more easily.
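To tie the compare and exchange counts to actual code, here is an instrumented selection sort sketch (the counters are my addition):

def selection_sort(a):
    # Selection sort, counting compares and exchanges as in the analysis above.
    compares = swaps = 0
    n = len(a)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):
            compares += 1            # one compare per inner-loop step
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]
        swaps += 1                   # exactly one exchange per iteration
    return compares, swaps

# (45, 10) for n = 10: n(n-1)/2 compares and n exchanges
print(selection_sort([9, 3, 7, 1, 8, 2, 6, 0, 5, 4]))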

The Recurrence T(n)= 2T(n/2) + (n-1)

I have this recurrence:
T(n)= 2T(n/2) + (n-1)
My try is as follow:
the tree is like this:
T(n) = 2T(n/2) + (n-1)
T(n/2) = 2T(n/4) + ((n/2)-1)
T(n/4) = 2T(n/8) + ((n/4)-1)
...
the height of the tree: (n/2^h) - 1 = 1 ⇒ h = lg(n) - 1 = lg(n) - lg(2)
the cost of the last level: 2^h = 2^(lg(n) - lg(2)) = (1/2)n
the cost of all levels up to level h-1: (i=0 to lg(n/2))Σ (n - (2^i - 1)), which is a geometric series and equals (1/2)((1/2)n - 1)
So, T(n) = Θ(n lg n)
my question is: Is that right?
No, it isn't. You have the cost of the last level wrong, so what you derived from that is also wrong.
(I'm assuming you want to find the complexity yourself, so no more hints unless you ask.)
Edit: Some hints, as requested
To find the complexity, one often helpful method is to recursively apply the equation, inserting the result back into the first:
T(n) = 2*T(n/2) + (n-1)
= 2*(2*T(n/4) + (n/2-1)) + (n-1)
= 4*T(n/4) + (n-2) + (n-1)
= 4*T(n/4) + 2*n - 3
= 4*(2*T(n/8) + (n/4-1)) + 2*n - 3
= ...
That often leads to a closed formula you can prove via induction (with enough experience you won't need to carry out the proof; you will see the correctness without writing it down).
Spoiler: You can look up the complexity in almost any resource dealing with the Master Theorem.
This can be easily solved with the Master theorem.
You have a = 2, b = 2, f(n) = n - 1 = Θ(n), and therefore c = log2(2) = 1. Since f(n) = Θ(n^c), this falls into the second case of the Master theorem, which means that the complexity is Θ(n^c * log n) = Θ(n log n).
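For a quick confirmation (assuming the base case T(1) = 0), evaluating the recurrence directly matches the closed form n*lg(n) - n + 1 for powers of 2, which is Θ(n log n):

import math

def T(n):
    # T(n) = 2*T(n/2) + (n - 1), with assumed base case T(1) = 0
    if n <= 1:
        return 0
    return 2 * T(n // 2) + (n - 1)

for n in [2**k for k in range(1, 11)]:
    print(n, T(n), n * int(math.log2(n)) - n + 1)  # the last two columns match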
