Trying to understand the complexity of this Big-O notation recurrence - big-o

T(n) = n(T(n-1) + T(n-1)) + O(1). The answer as per the book is O(n!), but I am not able to arrive at this solution. Can someone give some guidance?

Okay, here's my take on this:
T(n) = n(T(n-1) + T(n-1)) + O(1)
T(n) = n(2T(n-1)) + O(1)
T(n) = nT(n-1) + O(1) // constant factors and lower-order terms are dropped: O(2n) = O(n)
T(n) = nT(n-1)
Unrolling gives T(n) = n × (n-1) × … × 1 × T(0), which is factorial complexity: O(n!).
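A quick numeric check (a sketch, assuming the O(1) base case is T(0) = 1) confirms the unrolled recurrence tracks n!:

```python
from math import factorial

def T(n):
    # T(n) = n * T(n-1), with T(0) = 1 standing in for the O(1) base case
    return 1 if n == 0 else n * T(n - 1)

for n in range(1, 8):
    print(n, T(n), factorial(n))  # the two columns match exactly
```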

Related

how to solve an iterative substitution equation?

So I'm struggling with a question: I need to find the time complexity of this recurrence using iterative substitution. I understand how to expand the equation, but I do not know how to derive the time complexity from the expansion. I know the answer is O(n^(log_2 3)).
T(n) = 3T(n/2) + O(n)
I've expanded from about here, what do I do next?
3^3 T(n/2^3) + 3^2(n/2^2) + 3(n/2) + n
What do I do next to reach the answer of O(n^(log_2 3))?
Can someone please explain this to me step by step?
It can be solved like this (writing the O(n) term as cn for some constant c):
T(n) = 3T(n/2) + cn
= 3(3T(n/4) + c(n/2)) + cn
= 3^2 T(n/2^2) + (3/2)cn + cn
= 3^3 T(n/2^3) + (3/2)^2 cn + (3/2)cn + cn
Note the cost terms grow geometrically by a factor of 3/2 per level, so their sum is dominated by the deepest term, not by the first cn.
Let n = 2^k. After k steps:
T(n) = 3^k T(1) + cn((3/2)^(k-1) + ... + (3/2) + 1)
= 3^k T(1) + 2cn((3/2)^k - 1)
= O(3^k) // since n(3/2)^k = 2^k (3/2)^k = 3^k
= O(3^(log n)) // log(n) is log(n) to base 2
= O(n^(log 3)) // because 3^(log_2 n) = n^(log_2 3)
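This can also be sanity-checked numerically; a small sketch, assuming T(1) = 1 and an exact cost of n per level:

```python
import math

def T(n):
    # T(n) = 3*T(n/2) + n, with T(1) = 1; n restricted to powers of two
    return 1 if n == 1 else 3 * T(n // 2) + n

alpha = math.log2(3)  # ≈ 1.585
for k in range(1, 15):
    n = 2 ** k
    # the ratio T(n) / n^(log_2 3) settles toward a constant (here, 3)
    print(n, T(n) / n ** alpha)
```

The ratio stabilizing at a constant is exactly what T(n) = Theta(n^(log_2 3)) means.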

Solve the recurrence equation T(n) = T(n/3) + O(1) using iteration or substitution

I realize that solving this with the Master theorem gives the answer Big Theta(log n). However, I want to know more and find the base of the logarithm. I tried reading more about the Master theorem to find out about the base, but could not find more information on Wikipedia (https://en.wikipedia.org/wiki/Master_theorem_(analysis_of_algorithms)).
How would I solve this using recursion tree or substitution method for solving recurrences?
You can assume n = 2^K and T(0) = 0.
Don't set n = 2^k but n = 3^k,
thus T(3^k) = T(3^(k-1)) + c.
Writing w_k = T(3^k), the recurrence becomes w_k = w_(k-1) + c.
Assuming T(1) = 1, we have w_0 = 1,
and the general term is w_k = ck + 1.
You conclude T(n) = c·log_3(n) + 1,
and thus T(n) = O(log_3(n)) = O(log n), since changing the base of the logarithm only changes the constant factor.
T(n) = T(n/3) + O(1) = T(n/9) + O(1) + O(1) = T(n/27) + O(1) + O(1) + O(1) = …
After log_3(n) steps, the T term vanishes and T(n) = O(log(n)).
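The level-counting argument can be checked directly; a minimal sketch, assuming one unit of work per level and T(0) = 0:

```python
import math

def T(n):
    # T(n) = T(n//3) + 1, with T(0) = 0 (one unit of work per level)
    return 0 if n == 0 else T(n // 3) + 1

for k in range(1, 8):
    n = 3 ** k
    print(n, T(n), round(math.log(n, 3)))  # T(n) is log_3(n) plus a small constant
```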

How to calculate algorithm time complexity

I am trying to multiply two big integers with the Karatsuba algorithm.
I know that O(n) denotes a complexity class and T(n) the worst-case running time.
Can someone please explain why:
T(n) = 4T(n/2) + O(n) is O(n^2)
And
T(n) = 3T(n/2) + O(n) is O(n^1.59)
T(n) = 4T(n/2) + O(n)
According to the Master theorem:
T(n) is O(n^log_2(4)) = O(n^2)
and
T(n) = 3T(n/2) + O(n)
is
T(n) = O(n^(log_2 3)) ~ O(n^1.5849)
so you can round it to O(n^1.59).
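The three-multiplication structure behind T(n) = 3T(n/2) + O(n) is easy to see in code; a minimal sketch for non-negative integers (the split size m and the helper name are my own choices):

```python
def karatsuba(x, y):
    # Small operands: multiply directly (recursion base case)
    if x < 10 or y < 10:
        return x * y
    # Split both numbers around half of the larger digit count
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10 ** m)
    high_y, low_y = divmod(y, 10 ** m)
    # Three recursive half-size products instead of four:
    # this is what turns 4T(n/2) + O(n) into 3T(n/2) + O(n)
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0

print(karatsuba(1234, 5678))  # 7006652, the same as 1234 * 5678
```

The schoolbook method splits each factor in half and does four half-size products; Karatsuba's observation is that z1 can be recovered from one extra product of the sums, hence only three recursive calls.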

Deduce time complexity from this recurrence formula?

I was reading a time-complexity question on SO, but I can't comment there (not enough reputation).
What's the time complexity of this algorithm for Palindrome Partitioning?
I have a question regarding going from 1st to 2nd equation here:
Now you can write the same expression for H(n-1), then substitute back
to simplify:
H(n) = 2 H(n-1) + O(n) =========> Eq.1
And this solves to
H(n) = O(n * 2^n) =========> Eq.2
Can someone illustrate how he got Eq.2 from Eq.1? Thank you.
Eq. 1 is a recurrence relation. See the link for a tutorial on how to solve these types of equations; here we can solve it via expansion, as below:
H(n) = 2H(n-1) + O(n)
H(n) = 2*2H(n-2) + 2O(n-1) + O(n)
H(n) = 2*2*2H(n-3) + 2*2O(n-2) + 2O(n-1) + O(n)
...
H(n) = 2^(n-1)*H(1) + 2^(n-2)*O(2) + ... + 2O(n-1) + O(n)
since H(1) = O(n) (see the original question)
H(n) = 2^(n-1)*O(n) + 2^(n-2)*O(2) + ... + 2O(n-1) + O(n)
H(n) = O(n * 2^n)
We need to homogenize the equation, in this simple case just by adding a constant to each side. First, designate O(n) = K to avoid dealing with the O notation at this stage:
H(n) = 2 H(n-1) + K
Then add a K to each side:
H(n) + K = 2 (H(n-1) + K)
Let G(n) = H(n) + K, then
G(n) = 2 G(n-1)
This is a well-known homogeneous 1-st order recurrence, with the solution
G(n) = G(0)×2^n = G(1)×2^(n-1)
Since H(1) = O(n), G(1) = H(1) + K = O(n) + O(n) = O(n),
G(n) = O(n)×2^(n-1) = O(n×2^(n-1)) = O(n×2^n)
and
H(n) = G(n) - K = O(n×2^n) - O(n) = O(n×2^n)
They are wrong.
Let's assume that O refers to a tight bound and substitute O(n) with c*n for some constant c. Unrolling the recursion i times you will get:
T(n) = 2^i * T(n-i) + c*(n + 2(n-1) + 2^2(n-2) + ... + 2^(i-1)(n-i+1))
When you finish unrolling the recursion, i = n and the first term becomes 2^n * T(0).
Now finding the sum:
n + 2(n-1) + 2^2(n-2) + ... + 2^(n-1)·1 = 2^(n+1) - n - 2
Summing up you will get:
T(n) = 2^n * T(0) + c*(2^(n+1) - n - 2)
So now it is clear that T(n) is O(2^n), without any n factor.
For people who are still skeptical about the math:
solution to F(n) = 2F(n-1) + n
solution to F(n) = 2F(n-1) + 99n
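A direct numeric check also settles it; a sketch assuming an exact cost of n per step and F(1) = 1:

```python
def F(n):
    # F(n) = 2*F(n-1) + n, with F(1) = 1
    return 1 if n == 1 else 2 * F(n - 1) + n

for n in (10, 20, 30):
    # F(n)/2^n approaches a constant, while F(n)/(n*2^n) keeps shrinking,
    # so the tight bound is Theta(2^n), not Theta(n * 2^n)
    print(n, F(n) / 2 ** n, F(n) / (n * 2 ** n))
```

Note that Theta(2^n) is still inside O(n·2^n), so the earlier answers give a valid upper bound; it is just not tight.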

T(n) = T(n/2) + T(n/4) + O(1), what is T(n)?

How to solve this recurrence: T(n) = T(n/2) + T(n/4) + O(1)
It doesn't seem like Master Method will help, as this is not in the form of T(n) = aT(n/b) + f(n). And I got stuck for quite a while.
Akra–Bazzi is a much more powerful method than the Master method.
Since the 'non-recursive' term is O(1), it amounts to solving the equation
1/2^p + 1/4^p = 1
and the answer you get will be T(n) = Theta(n^p).
Substituting x = 1/2^p turns this into the quadratic x^2 + x - 1 = 0, whose positive root is x = (sqrt(5) - 1)/2 = 1/phi, so p = log_2(phi), where phi is the golden ratio.
Computing that gives us T(n) = Theta(n^0.694...)
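That value of p can be confirmed numerically; a small bisection sketch (the function and variable names are my own, nothing library-specific):

```python
import math

def f(p):
    # f is strictly decreasing in p; the Akra-Bazzi exponent is its root
    return 0.5 ** p + 0.25 ** p - 1

lo, hi = 0.0, 1.0          # f(0) = 1 > 0 and f(1) = -0.25 < 0 bracket the root
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)

phi = (1 + math.sqrt(5)) / 2
print(lo, math.log2(phi))  # both print ≈ 0.6942, agreeing with p = log_2(phi)
```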
