Deduce time complexity from this recurrence formula? - algorithm

I was reading a question about a time-complexity calculation on SO, but I can't comment there (not enough rep):
What's the time complexity of this algorithm for Palindrome Partitioning?
I have a question about going from the first equation to the second one here:
Now you can write the same expression for H(n-1), then substitute back to simplify:
H(n) = 2 H(n-1) + O(n)          (Eq. 1)
And this solves to
H(n) = O(n * 2^n)          (Eq. 2)
Can someone illustrate how he got Eq. 2 from Eq. 1? Thank you.

Eq. 1 is a recurrence relation. See the link for a tutorial on how to solve these types of equations, but we can solve it via expansion as below:
H(n) = 2H(n-1) + O(n)
H(n) = 2*2H(n-2) + 2O(n-1) + O(n)
H(n) = 2*2*2H(n-3) + 2*2O(n-2) + 2O(n-1) + O(n)
...
H(n) = 2^(n-1)*H(1) + 2^(n-2)*O(2) + ... + 2O(n-1) + O(n)
Since H(1) = O(n) (see the original question),
H(n) = 2^(n-1)*O(n) + 2^(n-2)*O(2) + ... + 2O(n-1) + O(n)
H(n) = O(n * 2^n)
(The geometric tail sums to O(2^n), which the leading 2^(n-1)*O(n) term dominates.)
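To see the bound concretely, here is a small Python sketch. It treats the O(n) base case as H(1) = c*N, where N is the input size from the original palindrome question, and c = 2 is an arbitrary illustrative constant (both choices are assumptions for illustration, not part of the answer above):

    # H(k) = 2*H(k-1) + c*k, with the base case H(1) = c*N costing O(n)
    # as in the original question. c = 2 is an arbitrary constant.
    c = 2
    for N in (10, 15, 20, 25):
        H = c * N                     # H(1) = O(n): base case depends on N
        for k in range(2, N + 1):
            H = 2 * H + c * k
        print(N, H / (N * 2**N))      # ratio settles near c/2, i.e. Theta(n * 2^n)

The ratio H(N)/(N * 2^N) flattens out near a constant as N grows, which is exactly the O(n * 2^n) behavior.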

We need to homogenize the equation; in this simple case, just by adding a constant to each side. First, designate O(n) = K to avoid dealing with the O notation at this stage:
H(n) = 2 H(n-1) + K
Then add a K to each side:
H(n) + K = 2 (H(n-1) + K)
Let G(n) = H(n) + K, then
G(n) = 2 G(n-1)
This is a well-known homogeneous first-order recurrence, with the solution
G(n) = G(0)×2^n = G(1)×2^(n-1)
Since H(1) = O(n), G(1) = H(1) + K = O(n) + O(n) = O(n),
G(n) = O(n)×2^(n-1) = O(n×2^(n-1)) = O(n×2^n)
and
H(n) = G(n) - K = O(n×2^n) - O(n) = O(n×2^n)
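As a sanity check of the homogenization trick, here is a minimal Python sketch; the concrete values K = 5 and H(1) = 3 are arbitrary stand-ins (not from the original question) chosen so the identity can be tested exactly:

    # Sanity check: if H(n) = 2*H(n-1) + K, then G(n) = H(n) + K doubles each step.
    K = 5            # arbitrary constant standing in for the O(n) term
    H = {1: 3}       # arbitrary base value
    for n in range(2, 12):
        H[n] = 2 * H[n - 1] + K                  # the inhomogeneous recurrence
    for n in range(2, 12):
        assert H[n] + K == 2 * (H[n - 1] + K)    # G(n) = 2*G(n-1), exactly
    print("homogenization verified for n = 2..11")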

They are wrong.
Let's assume that O refers to a tight bound and substitute O(n) with c * n for some constant c. Unrolling the recursion you will get:
T(n) = 2T(n-1) + c*n = 2^i * T(n-i) + c * Σ_{k=0}^{i-1} 2^k * (n-k)
When you finish unrolling the recursion, i = n and the remaining term is b = T(0):
T(n) = b * 2^n + c * Σ_{k=0}^{n-1} 2^k * (n-k)
Now finding the sum:
Σ_{k=0}^{n-1} 2^k * (n-k) = n*(2^n - 1) - ((n-2)*2^n + 2) = 2^(n+1) - n - 2
Summing up you will get:
T(n) = b * 2^n + c * (2^(n+1) - n - 2)
So now it is clear that T(n) is O(2^n) without any n.
For people who are still skeptical about the math:
solution to F(n) = 2F(n-1) + n
solution to F(n) = 2F(n-1) + 99n
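The closed form is easy to verify mechanically; here is a minimal Python sketch, with arbitrary illustrative constants b = 1 and c = 3:

    # T(n) = 2*T(n-1) + c*n with T(0) = b solves to b*2^n + c*(2^(n+1) - n - 2).
    b, c = 1, 3                                  # arbitrary constants
    T = b
    for n in range(1, 30):
        T = 2 * T + c * n                        # one unrolling step
        assert T == b * 2**n + c * (2**(n + 1) - n - 2)
    print("closed form verified; T(n)/2^n tends to", b + 2 * c)

Since T(n)/2^n converges to the constant b + 2c, the recurrence with a constant base case is Θ(2^n), with no extra factor of n.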

Related

Solve the recurrence equation T(n) = T(n/3) + O(1) using iteration or substitution

I realize that solving this with the Master theorem gives the answer Big Theta(log n). However, I want to know more and find the base of the logarithm. I tried reading more about the Master theorem to find out about the base but could not find more information on Wikipedia (https://en.wikipedia.org/wiki/Master_theorem_(analysis_of_algorithms)).
How would I solve this using recursion tree or substitution method for solving recurrences?
You can assume n = 2^K and T(0) = 0.
Don't set n = 2^k but n = 3^k,
thus T(3^k) = T(3^(k-1)) + c.
Denoting w_k := T(3^k), the recurrence becomes w_k = w_{k-1} + c.
Assuming T(1) = 1, i.e. w_0 = 1,
the general term is w_k = ck + 1,
and you conclude T(n) = c*log_3(n) + 1,
and thus T(n) = O(log_3(n)) = O(log n), since the base of the logarithm only changes the constant factor.
T(n) = T(n/3) + O(1) = T(n/9) + O(1) + O(1) = T(n/27) + O(1) + O(1) + O(1) = …
After log3(n) steps, the term T vanishes and T(n) = O(log(n)).
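A short Python sketch (assuming unit cost per level, T(1) = 1, and integer division since sizes are whole numbers) confirms the logarithmic growth:

    import math

    # T(n) = T(n // 3) + 1 with T(1) = 1: one unit of work per level.
    def T(n: int) -> int:
        cost = 1                  # T(1) = 1
        while n > 1:
            n //= 3               # each level divides the size by 3
            cost += 1             # ... and adds O(1) work
        return cost

    for n in (3**5, 3**8, 3**12):
        print(n, T(n), math.log(n, 3) + 1)   # T(n) tracks log_3(n) + 1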

What is the time complexity of T(n) = 2T(n/2) + O(1)?

I want to know the time complexity of my recursive method:
T(n) = 2T(n/2) + O(1)
I saw a result that says it is O(n), but I don't know why. I solved it like this:
T(n) = 2T(n/2) + 1
T(n-1) = 4T((n-1)/4) + 3
T(n-2) = 8T((n-2)/8) + 7
...
T(n) = 2^(n+1) * T(n/2^(n+1)) + (2^(n+1) - 1)
I think you have got the wrong idea about recursive relations. You can think as follows:
If T(n) represents the value of the function T() at input n, then the relation says that the output is one more than double the value at half of the current input. So for input n-1, the output T(n-1) will be one more than double the value at half of that input, that is T(n-1) = 2*T((n-1)/2) + 1.
The above kind of recurrence relation should be solved as answered by Yves Daoust. For more examples of recurrence relations, you can refer to this.
Consider that n = 2^m, which allows you to write
T(2^m) = 2T(2^(m-1)) + O(1)
or, by denoting S(m) := T(2^m),
S(m) = 2 S(m-1) + O(1).
Dividing both sides by 2^m,
2^(-m) S(m) = 2^(-(m-1)) S(m-1) + 2^(-m) O(1)
and finally, denoting R(m) := 2^(-m) S(m),
R(m) = R(m-1) + 2^(-m) O(1).
Now by induction,
R(m) = R(0) + (1 - 2^(-m)) O(1),
so that
T(n) = S(m) = 2^m R(m) = 2^m T(1) + (2^m - 1) O(1) = n T(1) + (n - 1) O(1) = O(n).
There are a couple of rules that you might need to remember. If you can remember these easy rules, then the Master theorem makes recurrence equations very easy to solve. For a recurrence of the form T(n) = a T(n/b) + f(n), the following are the basic rules which need to be remembered:
case 1) If n^(log_b a) << f(n), then T(n) = f(n)
case 2) If n^(log_b a) = f(n), then T(n) = f(n) * log n
case 3) If n^(log_b a) >> f(n), then T(n) = n^(log_b a)
Now, let's solve the recurrence using the above rules.
a = 2, b = 2, f(n) = O(1)
n^(log_b a) = n^(log_2 2) = n = O(n)
This is case 3) in the above rules. Hence T(n) = n^(log_b a) = O(n).
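Here is a minimal Python sketch of the same recurrence (taking the O(1) term to be exactly 1 and assuming T(1) = 1); for powers of two it evaluates to 2n - 1, matching the O(n) result:

    from functools import lru_cache

    # T(n) = 2*T(n // 2) + 1 with T(1) = 1; the O(1) term is taken as exactly 1.
    @lru_cache(maxsize=None)
    def T(n: int) -> int:
        return 1 if n == 1 else 2 * T(n // 2) + 1

    for k in (8, 12, 16, 20):
        n = 2**k
        print(n, T(n), T(n) / n)   # T(n) = 2n - 1, so the ratio approaches 2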

Time Complexity of Sequential search

I am trying to find the time complexity of selection sort, which has the following equation: T(n) = T(n-1) + O(n).
First I supposed it's T(n) = T(n-1) + n (n is easier to work with).
Figured T(n-1) = T(n-2) + (n-1)
and T(n-2) = T(n-3) + (n-2).
This makes T(n) = (T(n-3) + (n-2)) + (n-1) + n, so T(n) = T(n-3) + 3n - 3.
With k instead of 3: T(n) = T(n-k) + kn - k, and because n-k >= 0 ==> n-k = 0 and n = k. Back in the equation it's T(n) = T(0) (which is a constant C) + n*n - n, which makes it C + n^2 - n, so it's O(n^2). Is what I did right?
Yes, your solution is correct. You are combining O(n) with O(n-1), O(n-2), ... and coming up with O(n^2). You can apply O(n) + O(n-1) = O(n), but only finitely many times; in a series it is different:
T(n) = Σ_{i=0}^{n} O(n - i)
Ignoring the i inside O(), each of the n+1 terms is at most O(n), so your result is O(n^2).
The recurrence relationship you gave T(n)=T(n-1)+O(n) is true for Selection Sort, which has overall time complexity as O(n^2). Check this link to verify
In selection sort:
In iteration i, we find the index min of smallest remaining entry.
And then swap a[i] and a[min].
As such the selection sort uses
(n-1) + (n-2) + ... + 2 + 1 + 0 = n*(n-1)/2 = O(n*n) compares
and exactly n exchanges (swaps).
FROM ABOVE
And from the recurrence relation given above
=> T(n) = T(n-1)+ O(n)
=> T(n) = T(n-1)+ cn, where c is some positive constant
=> T(n) = cn + T(n-2) + c(n-1)
=> T(n) = cn + c(n-1) +T(n-3)+ c(n-2)
And this goes on and we finally get
=> T(n) = cn + c(n-1) + c(n-2) + ... + c (n terms in total)
=> T(n) = c*(n*(n+1)/2)
=> T(n) = O(n*n)
EDIT
It's always better to replace Θ(n) with cn, where c is some constant. It helps in visualizing the equation more easily.
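To connect the recurrence back to the algorithm, here is a short Python sketch of selection sort with an explicit comparison counter (a minimal illustration, not a tuned implementation); the count comes out to exactly n(n-1)/2:

    # Selection sort with an explicit comparison counter.
    def selection_sort(items):
        a = list(items)
        n, compares = len(a), 0
        for i in range(n - 1):
            m = i                          # index of smallest remaining entry
            for j in range(i + 1, n):
                compares += 1
                if a[j] < a[m]:
                    m = j
            a[i], a[m] = a[m], a[i]        # one exchange per iteration
        return a, compares

    _, compares = selection_sort(range(100, 0, -1))
    print(compares, 100 * 99 // 2)         # both print 4950 = n(n-1)/2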

Time-complexity analysis of a function, Big O

What time-complexity will the following code have in respect to the parameter size? Motivate.
// Process(A, N) is O(sqrt(N)).
Function Complex(array[], size){
    if(size == 1) return 1;
    if(rand() / float(RAND_MAX) < 0.1){
        return Process(array, size*size)
             + Complex(array, size/2)
             + Process(array, size*size);
    }
}
I think it is O(N), because if Process(A, N) is O(sqrt(N)), then Process(A, N*N) should be O(N), and Complex(array, size/2) is O(log(N)) because it halves the size every time it runs. So one run takes O(N) + O(log(N)) + O(N) = O(N).
Please correct me and give me some hints on how I should think / proceed an assignment like this.
I appreciate all help and thanks in advance.
The time complexity of the algorithm is O(N) indeed, but for a different reason.
The complexity of the function can be denoted as T(n) where:
T(n) = T(n/2) + 2*n
       ^        ^
       |        2 calls to Process(arr, n*n), each is O(n)
       recursive invocation
This recursion is well known to be O(n):
T(n) = T(n/2) + 2*n =
     = T(n/4) + 2*n/2 + 2*n =
     = T(n/8) + 2*n/4 + 2*n/2 + 2*n
     = ...
     = 2*n/(2^log(n)) + ... + 2*n/2 + 2*n
     < 4*n
which is in O(n).
Let's formally prove it, we will use mathematical induction for it:
Base: T(1) < 4 (check)
Hypothesis: For n, and for every k<n the claim T(k) < 4k holds true.
For n:
T(n) = T(n/2) + 2*n
     < 2*n + 2*n     (*)
     = 4*n
Conclusion: T(n) is in O(n)
(*) From the induction hypothesis
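A quick Python sketch (treating the recursive branch as always taken, as the worst-case analysis above does, and assuming T(1) = 1) confirms the 4n bound numerically:

    # T(n) = T(n // 2) + 2*n with T(1) = 1, checked against the 4n bound.
    def T(n: int) -> int:
        return 1 if n == 1 else T(n // 2) + 2 * n

    for n in (10, 1000, 10**6):
        print(n, T(n), T(n) <= 4 * n)      # always True: T(n) stays below 4n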

The Recurrence T(n)= 2T(n/2) + (n-1)

I have this recurrence:
T(n)= 2T(n/2) + (n-1)
My try is as follow:
the tree is like this:
T(n) = 2T(n/2) + (n-1)
T(n/2) = 2T(n/4) + ((n/2)-1)
T(n/4) = 2T(n/8) + ((n/4)-1)
...
the height of the tree: n/2^h - 1 = 1 ⇒ h = lg n - 1 = lg n - lg 2
the cost of the last level: 2^h = 2^(lg n - lg 2) = (1/2)n
the cost of all levels up to h-1: Σ_{i=0}^{lg(2n)} (n - (2^i - 1)), which is a geometric series and equals (1/2)((1/2)n - 1)
So, T(n) = Θ(n lg n)
my question is: Is that right?
No, it isn't. You have the cost of the last level wrong, so what you derived from that is also wrong.
(I'm assuming you want to find the complexity yourself, so no more hints unless you ask.)
Edit: Some hints, as requested
To find the complexity, one usually helpful method is to recursively apply the equation and insert the result into the first,
T(n) = 2*T(n/2) + (n-1)
= 2*(2*T(n/4) + (n/2-1)) + (n-1)
= 4*T(n/4) + (n-2) + (n-1)
= 4*T(n/4) + 2*n - 3
= 4*(2*T(n/8) + (n/4-1)) + 2*n - 3
= ...
That often leads to a closed formula you can prove via induction (you don't need to carry out the proof if you have enough experience, then you see the correctness without writing down the proof).
Spoiler: You can look up the complexity in almost any resource dealing with the Master Theorem.
This can be easily solved with the Master theorem.
You have a = 2, b = 2, f(n) = n - 1 = Θ(n), and n^(log_b a) = n^(log_2 2) = n. Since f(n) = Θ(n^(log_b a)), this falls into the second case of the Master theorem, which means that the complexity is Θ(n^(log_b a) * log n) = Θ(n log n).
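A small Python sketch (with an assumed base case T(1) = 0, since the question doesn't give one) shows T(n)/(n*log2(n)) approaching 1, consistent with Θ(n log n):

    import math
    from functools import lru_cache

    # T(n) = 2*T(n // 2) + (n - 1), with an assumed base case T(1) = 0.
    @lru_cache(maxsize=None)
    def T(n: int) -> int:
        return 0 if n == 1 else 2 * T(n // 2) + (n - 1)

    for k in (8, 12, 16, 20):
        n = 2**k
        print(n, T(n) / (n * math.log2(n)))   # ratio tends to 1: Theta(n log n)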
