how to solve an iterative substitution equation? - algorithm

So I'm struggling with a question: I need to find the time complexity of this recurrence using iterative substitution. I understand how to expand the equation, but I don't know how to derive the time complexity from the expansion. I know the answer is O(n^(log2 3)).
T(n) = 3T(n/2) + O(n)
I've expanded from about here, what do I do next?
3^3T(n/2^3) + 3^2(n/2^2) + 3n/2 + n
What do I do next to reach the answer of O(n^(log2 3))?
Can someone please explain this to me step by step?

It can be solved like this:
T(n) = 3T(n/2) + cn // write cn for the O(n) term
= 3(3T(n/4) + c(n/2)) + cn
= 3^2 T(n/4) + 3c(n/2) + cn
= 3^3 T(n/8) + 3^2 c(n/4) + 3c(n/2) + cn
= ...
let n = 2^k, i.e. k = log(n) // log(n) is log(n) to base 2
After k expansions:
T(n) = 3^k T(1) + cn(1 + 3/2 + (3/2)^2 + ... + (3/2)^(k-1))
The geometric series sums to 2((3/2)^k - 1), so the second term is 2c(3^k - n).
Since 3^k = 3^(log(n)) = n^(log(3)),
T(n) = O( n^log(3) )
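The closed form is easy to sanity-check numerically. A minimal sketch, using c = 1 for the O(n) term and assuming T(1) = 1:

```python
import math

def T(n):
    """T(n) = 3*T(n//2) + n with T(1) = 1."""
    if n <= 1:
        return 1
    return 3 * T(n // 2) + n

# For n = 2^k, n^(log2 3) = 3^k; the ratio T(n) / n^(log2 3)
# settles toward a constant, confirming T(n) = Theta(n^(log2 3)).
for k in (5, 10, 15, 20):
    n = 2 ** k
    print(n, T(n) / n ** math.log2(3))
```

The printed ratios approach a constant (about 3), which is exactly what T(n) = Θ(n^(log2 3)) means.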

Related

Best Case time complexity of Tower of Hanoi algorithm

I want to know the best case complexity of Tower of Hanoi algorithm.
The algorithm that I used is
Algorithm
I have calculated the time complexity and it is T(2^n -1) and Big O is O(n).
But what is the best case complexity and how to calculate it?
You have incorrectly calculated the time complexity.
The correct recurrence can be represented by:
T(n) = 2*T(n-1) + 1 .... eq(1)
T(n) = (2^2)*T(n-2) + 1 + 2 .... eq(2)
T(n) = (2^3)*T(n-3) + 1 + 2 + 2^2 ....eq(3)
Therefore,
T(n) = (2^k)*T(n-k) + (2^0) + (2^1) + (2^2) + .... + (2^(k-1)) .... eq(4)
Now, substituting, n-k = 1 in eq(4), we get,
T(k+1) = (2^k)*T(1) + ((2^k) - 1) // the geometric sum 2^0 + 2^1 + ... + 2^(k-1) = 2^k - 1
Substituting T(1) = 1, we get,
T(k+1) = 2^k + 2^k - 1 = 2^(k+1) - 1 ... eq(5)
Finally, substituting k+1 = n in eq(5) to get the Closed Form:
T(n) = 2^n - 1
Now, the answer to your question:
The algorithm takes O(2^n) just to print out the steps, therefore the best case time complexity also remains exponential, i.e., O(2^n).
Therefore, you cannot find any better algorithm.
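A quick numeric check of the recurrence against the standard closed form 2^n − 1 (a sketch, assuming T(1) = 1):

```python
def hanoi_moves(n):
    """T(n) = 2*T(n-1) + 1 with T(1) = 1: number of moves for n disks."""
    if n == 1:
        return 1
    return 2 * hanoi_moves(n - 1) + 1

# Matches the closed form T(n) = 2^n - 1 for every n checked.
for n in range(1, 15):
    assert hanoi_moves(n) == 2 ** n - 1
print(hanoi_moves(10))  # 1023
```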

Solve the recurrence equation T(n) = T(n/3) + O(1) using iteration or substitution

I realize that solving this with the Master theorem gives the answer Θ(log n). However, I want to know more and find the base of the logarithm. I tried reading more about the Master theorem to find the base, but could not find more information on Wikipedia (https://en.wikipedia.org/wiki/Master_theorem_(analysis_of_algorithms)).
How would I solve this using recursion tree or substitution method for solving recurrences?
You can assume n = 2^K and T(0) = 0.
Don't set n = 2^k but n = 3^k.
Then T(3^k) = T(3^{k-1}) + c.
Writing w_k = T(3^k), the recurrence becomes w_k = w_{k-1} + c.
Assuming T(1) = 1, i.e. w_0 = 1,
the general term is w_k = ck + 1.
You conclude T(n) = c*log_3(n) + 1,
and thus T(n) = O(log_3(n)) = O(log n).
T(n) = T(n/3) + O(1) = T(n/9) + O(1) + O(1) = T(n/27) + O(1) + O(1) + O(1) = …
After log3(n) steps, the term T vanishes and T(n) = O(log(n)).
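Both answers are easy to check numerically; a minimal sketch, using c = 1 for the O(1) cost and assuming T(1) = 1:

```python
def T(n):
    """T(n) = T(n//3) + 1 with T(1) = 1."""
    if n <= 1:
        return 1
    return T(n // 3) + 1

# For n = 3^k the value is exactly k + 1 = log3(n) + 1,
# i.e. the recursion bottoms out after log3(n) steps.
for k in range(8):
    assert T(3 ** k) == k + 1
```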

Trying to understand the complexity of this Big-O notation

T(n) = n(T(n-1) + T(n-1)) + O(1). The answer as per the book is O(n!). I am not able to come to this solution. Can someone give some guidance?
Okay, here's my take on this:
T(n) = n(T(n-1) + T(n-1)) + O(1)
T(n) = n(2T(n-1)) + O(1)
T(n) = nT(n-1) + O(1) // the constant factor 2 and the O(1) term don't change the complexity class; O(n + k) = O(n) for constant k
T(n) = nT(n-1)
This is a factorial complexity.
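Unrolling the simplified recurrence gives n·(n−1)·…·1, i.e. n! exactly; a quick check, assuming T(1) = 1:

```python
import math

def T(n):
    """T(n) = n * T(n-1) with T(1) = 1."""
    if n <= 1:
        return 1
    return n * T(n - 1)

# T(n) equals n! exactly, so the growth is factorial.
for n in range(1, 12):
    assert T(n) == math.factorial(n)
print(T(5))  # 120
```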

Deduct time complexity from this Recurrence formula?

I was reading a time complexity calculation related question on SO but I can't comment there (not enough reps).
What's the time complexity of this algorithm for Palindrome Partitioning?
I have a question regarding going from 1st to 2nd equation here:
Now you can write the same expression for H(n-1), then substitute back
to simplify:
H(n) = 2 H(n-1) + O(n) =========> Eq.1
And this solves to
H(n) = O(n * 2^n) =========> Eq.2
Can someone illustrate how he got Eq.2 from Eq.1? Thank you.
Eq. 1 is a recurrence relation. See the link for a tutorial on how to solve these types of equations, but we can solve it via expansion as below:
H(n) = 2H(n-1) + O(n)
H(n) = 2*2H(n-2) + 2O(n-1) + O(n)
H(n) = 2*2*2H(n-3) + 2*2O(n-2) + 2O(n-1) + O(n)
...
H(n) = 2^(n-1)*H(1) + 2^(n-2)*O(2) + ... + 2O(n-1) + O(n)
since H(1) = O(n) (see the original question)
H(n) = 2^(n-1)*O(n) + 2^(n-2)*O(2) + ... + 2O(n-1) + O(n)
H(n) = O(n * 2^n)
We need to homogenize the equation, in this simple case just by adding a constant to each side. First, write K for the O(n) term to avoid dealing with the O notation at this stage:
H(n) = 2 H(n-1) + K
Then add a K to each side:
H(n) + K = 2 (H(n-1) + K)
Let G(n) = H(n) + K, then
G(n) = 2 G(n-1)
This is a well-known homogeneous 1-st order recurrence, with the solution
G(n) = G(0)×2^n = G(1)×2^(n-1)
Since H(1) = O(n), G(1) = H(1) + K = O(n) + O(n) = O(n),
G(n) = O(n)×2^(n-1) = O(n×2^(n-1)) = O(n×2^n)
and
H(n) = G(n) - K = O(n×2^n) - O(n) = O(n×2^n)
They are wrong.
Let's assume that O refers to a tight bound and substitute O(n) with c * n for some constant c, with b = T(0). Unrolling the recursion you will get:
T(n) = 2T(n-1) + cn
= 2^2 T(n-2) + 2c(n-1) + cn
= ...
= 2^n * b + c * (1*2^(n-1) + 2*2^(n-2) + ... + n*2^0)
Now finding the sum:
sum_{i=1..n} i * 2^(n-i) = 2^(n+1) - n - 2
Summing up you will get:
T(n) = 2^n * b + c * (2^(n+1) - n - 2)
So now it is clear that T(n) is O(2^n) without any n.
For people who are still skeptical about the math:
solution to F(n) = 2F(n-1) + n
solution to F(n) = 2F(n-1) + 99n
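The Θ(2^n) claim can be verified numerically; a sketch with c = 1 standing in for the O(n) constant and F(0) = 0:

```python
def F(n):
    """F(n) = 2*F(n-1) + n with F(0) = 0, computed iteratively."""
    total = 0
    for i in range(1, n + 1):
        total = 2 * total + i
    return total

# Closed form: F(n) = 2^(n+1) - n - 2, so F(n) / 2^n -> 2,
# with no extra factor of n.
for n in range(1, 20):
    assert F(n) == 2 ** (n + 1) - n - 2
print(F(30) / 2 ** 30)  # close to 2
```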

The Recurrence T(n)= 2T(n/2) + (n-1)

I have this recurrence:
T(n)= 2T(n/2) + (n-1)
My try is as follow:
the tree is like this:
T(n) = 2T(n/2) + (n-1)
T(n/2) = 2T(n/4) + ((n/2)-1)
T(n/4) = 2T(n/8) + ((n/4)-1)
...
the height of the tree: (n/2^h) - 1 = 1 ⇒ h = lg n - 1 = lg n - lg 2
the cost of the last level: 2^h = 2^(lg n - lg 2) = (1/2)n
the cost of all levels up to h-1: Σ_{i=0,...,lg(n/2)} (n - (2^i - 1)), which is a geometric series and equals (1/2)((1/2)n - 1)
So, T(n) = Θ(n lg n)
my question is: Is that right?
No, it isn't. You have the cost of the last level wrong, so what you derived from that is also wrong.
(I'm assuming you want to find the complexity yourself, so no more hints unless you ask.)
Edit: Some hints, as requested
To find the complexity, one usually helpful method is to recursively apply the equation and insert the result into the first,
T(n) = 2*T(n/2) + (n-1)
= 2*(2*T(n/4) + (n/2-1)) + (n-1)
= 4*T(n/4) + (n-2) + (n-1)
= 4*T(n/4) + 2*n - 3
= 4*(2*T(n/8) + (n/4-1)) + 2*n - 3
= ...
That often leads to a closed formula you can prove via induction (you don't need to carry out the proof if you have enough experience, then you see the correctness without writing down the proof).
Spoiler: You can look up the complexity in almost any resource dealing with the Master Theorem.
This can be easily solved with the Master theorem.
You have a = 2, b = 2, f(n) = n - 1 = Θ(n), and therefore c = log2(2) = 1. Since f(n) = Θ(n^c), this falls into the second case of the Master theorem, which means that the complexity is Θ(n^c * log n) = Θ(n log n)
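For reference, the recurrence can also be evaluated directly (a sketch assuming T(1) = 0, since a size-1 problem incurs cost n − 1 = 0):

```python
def T(n):
    """T(n) = 2*T(n//2) + (n - 1) with T(1) = 0."""
    if n <= 1:
        return 0
    return 2 * T(n // 2) + (n - 1)

# For n = 2^k the exact solution is n*k - n + 1 = n*lg(n) - n + 1,
# which is Theta(n log n).
for k in range(1, 15):
    n = 2 ** k
    assert T(n) == n * k - n + 1
```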

Resources