Time complexity of selection sort - algorithm

I am trying to find the time complexity of selection sort, which has the following recurrence: T(n) = T(n-1) + O(n).
First I supposed it's T(n) = T(n-1) + n, since n is easier to work with.
Figured T(n-1) = T(n-2) + (n-1)
and T(n-2) = T(n-3) + (n-2).
This makes T(n) = (T(n-3) + (n-2)) + (n-1) + n, so T(n) = T(n-3) + 3n - (0+1+2) = T(n-3) + 3n - 3.
With k instead of 3: T(n) = T(n-k) + kn - (0 + 1 + ... + (k-1)) = T(n-k) + kn - k(k-1)/2, and because we stop when n-k = 0, we get k = n. Back to the equation: T(n) = T(0) + n^2 - n(n-1)/2 = C + n(n+1)/2, where C is a constant, so it's O(n^2). Is what I did right?

Yes, your solution is correct. You are combining O(n) with O(n-1), O(n-2), ... and arriving at O(n^2). Note that O(n) + O(n-1) = O(n) only holds for a fixed number of terms; in a series whose length grows with n it is different:
T(n) = Σ_{i=0}^{n} O(n - i)
Ignoring the i inside O(), your result is O(n^2).
The recurrence relation you gave, T(n) = T(n-1) + O(n), is indeed the one for selection sort, which has overall time complexity O(n^2).
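The unrolling above can be checked with a quick numeric sketch (the function name `T` and the loop are illustrative only):

```python
# Illustrative check: unrolling T(n) = T(n-1) + n with T(0) = C drops the
# constant and leaves the sum 1 + 2 + ... + n = n(n+1)/2, which is Theta(n^2).
def T(n):
    total = 0
    for k in range(1, n + 1):  # the unrolled terms n, n-1, ..., 1
        total += k
    return total

assert T(100) == 100 * 101 // 2  # matches the closed form n(n+1)/2
```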

In selection sort:
In iteration i, we find the index min of the smallest remaining entry.
And then swap a[i] and a[min].
As such, selection sort uses
(n-1) + (n-2) + ... + 2 + 1 + 0 = n*(n-1)/2 = O(n*n) compares
and exactly n exchanges (swaps).
From the recurrence relation given above:
=> T(n) = T(n-1)+ O(n)
=> T(n) = T(n-1)+ cn, where c is some positive constant
=> T(n) = cn + T(n-2) + c(n-1)
=> T(n) = cn + c(n-1) + T(n-3) + c(n-2)
And this goes on until we finally get
=> T(n) = cn + c(n-1) + c(n-2) + ... + c·1 (n terms in total)
=> T(n) = c·(n(n+1)/2)
=> T(n) = O(n*n)
EDIT
It's always better to replace Theta(n) with cn, where c is some constant. It helps in visualizing the equation more easily.
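The compare and exchange counts above can be checked with a small sketch (the counter-instrumented `selection_sort` below is illustrative, not a canonical implementation):

```python
import random

def selection_sort(a):
    """Selection sort, instrumented with compare and exchange counters."""
    compares = swaps = 0
    n = len(a)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):  # scan the remaining entries for the minimum
            compares += 1
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]  # one exchange per iteration
        swaps += 1
    return compares, swaps

data = list(range(50))
random.shuffle(data)
compares, swaps = selection_sort(data)
assert compares == 50 * 49 // 2  # (n-1) + (n-2) + ... + 1 + 0 = n(n-1)/2
assert swaps == 50               # exactly n exchanges
assert data == sorted(data)
```

The compare count is the same regardless of the input order, which is why selection sort is quadratic even on sorted input.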


Deduce time complexity from this recurrence formula?

I was reading a time complexity calculation related question on SO but I can't comment there (not enough reps).
What's the time complexity of this algorithm for Palindrome Partitioning?
I have a question regarding going from 1st to 2nd equation here:
Now you can write the same expression for H(n-1), then substitute back
to simplify:
H(n) = 2 H(n-1) + O(n) =========> Eq.1
And this solves to
H(n) = O(n * 2^n) =========> Eq.2
Can someone illustrate how he got Eq.2 from Eq.1? Thank you.
Eq 1. is a recurrence relation. See the link for a tutorial on how to solve these types of equations, but we can solve via expansion as below:
H(n) = 2H(n-1) + O(n)
H(n) = 2*2H(n-2) + 2O(n-1) + O(n)
H(n) = 2*2*2H(n-3) + 2*2O(n-2) + 2O(n-1) + O(n)
...
H(n) = 2^(n-1)*H(1) + 2^(n-2)*O(2) + ... + 2O(n-1) + O(n)
since H(1) = O(n) (see the original question)
H(n) = 2^(n-1)*O(n) + 2^(n-2)*O(2) + ... + 2O(n-1) + O(n)
H(n) = O(n * 2^n)
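The expansion can be sanity-checked numerically. Assuming the O(n) term is a fixed c·n, the fully unrolled form is 2^(n-1)·H(1) + Σ_{i=0}^{n-2} 2^i·c·(n-i); the names `H` and `H_unrolled` below are hypothetical:

```python
C = 3  # assumed constant hidden inside O(n)

def H(n, base):
    # direct recursion: H(n) = 2*H(n-1) + C*n, with H(1) = base
    return base if n == 1 else 2 * H(n - 1, base) + C * n

def H_unrolled(n, base):
    # fully unrolled: 2^(n-1)*H(1) plus the weighted sum of the C*(n-i) terms
    return 2 ** (n - 1) * base + sum(2 ** i * C * (n - i) for i in range(n - 1))

for n in range(1, 12):
    assert H(n, base=7) == H_unrolled(n, base=7)
```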
We need to homogenize the equation, in this simple case just by adding a constant to each side. First, designate O(n) = K to avoid dealing with the O notation at this stage:
H(n) = 2 H(n-1) + K
Then add a K to each side:
H(n) + K = 2 (H(n-1) + K)
Let G(n) = H(n) + K, then
G(n) = 2 G(n-1)
This is a well-known homogeneous 1-st order recurrence, with the solution
G(n) = G(0)×2^n = G(1)×2^(n-1)
Since H(1) = O(n), G(1) = H(1) + K = O(n) + O(n) = O(n),
G(n) = O(n)×2^(n-1) = O(n×2^(n-1)) = O(n×2^n)
and
H(n) = G(n) - K = O(n×2^n) - O(n) = O(n×2^n)
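The homogenization step can be verified numerically: treating the additive term as a constant K, the shifted sequence G(n) = H(n) + K really does double at each step (the values of K and H(1) below are arbitrary):

```python
K = 5  # stands in for the additive term, treated as a constant here

def H(n, h1):
    # H(n) = 2*H(n-1) + K, with H(1) = h1
    return h1 if n == 1 else 2 * H(n - 1, h1) + K

# G(n) = H(n) + K should satisfy the homogeneous recurrence G(n) = 2*G(n-1)
G = [H(n, h1=3) + K for n in range(1, 10)]
for prev, cur in zip(G, G[1:]):
    assert cur == 2 * prev
```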
They are wrong.
Let's assume that O refers to a tight bound and substitute O(n) with c*n for some constant c. Unrolling the recursion i times you will get:
T(n) = 2^i * T(n-i) + c * Σ_{j=0}^{i-1} 2^j * (n-j)
When you finish unrolling the recursion, i = n and b = T(0):
T(n) = 2^n * b + c * Σ_{j=0}^{n-1} 2^j * (n-j)
Now finding the sum: Σ_{j=0}^{n-1} 2^j * (n-j) = 2^(n+1) - n - 2.
Summing up you will get:
T(n) = 2^n * b + c * (2^(n+1) - n - 2)
So now it is clear that T(n) is O(2^n), without any n factor.
For people who are still skeptical about the math:
solution to F(n) = 2F(n-1) + n
solution to F(n) = 2F(n-1) + 99n
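Under the same assumption (O(n) = c·n, with a constant base case T(0) = b), the unrolled sum has the closed form T(n) = b·2^n + c·(2^(n+1) - n - 2); a small check, with the constants chosen arbitrarily:

```python
c, b = 4, 1  # assumed constants: the O(n) term is c*n, and T(0) = b

def T(n):
    # direct recursion: T(n) = 2*T(n-1) + c*n
    return b if n == 0 else 2 * T(n - 1) + c * n

def closed(n):
    # claimed closed form: b*2^n + c*(2^(n+1) - n - 2), which is O(2^n)
    return b * 2 ** n + c * (2 ** (n + 1) - n - 2)

for n in range(12):
    assert T(n) == closed(n)
```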

What is the time complexity of recurrence 2T(n-1)+O(n)?

What is the asymptotic complexity of T(n) = 2T(n-1) + O(n)? I guess it's solved using the substitution method. How do I solve this recurrence? You can assume that the algorithm stops when it reaches T(1).
The master theorem can't be used here because the size shrinkage factor b (= n/(n-1)) is basically equal to 1.
However, if you calculate the first few terms, then you can easily see that the time complexity is O(2**n):
T(1) = 1 = 1
T(2) = 2*1 + O(n) = 2 + O(n)
T(3) = 2*(2*1 + O(n)) + O(n) = 2**2 + 3O(n)
T(4) = 2*(2*(2*1 + O(n)) + O(n)) + O(n) = 2**3 + 7O(n)
: : : : : : : : :
T(n) = 2**(n-1) + O(n) * (2**(n-1)-1) ≈ O(2**n)
There's a trick to the substitution method. If you try the straightforward approach, you'll get
T(n) <=? 2^n
T(n) = 2 T(n-1) + cn
<= 2^(n-1) + 2^(n-1) + cn
= 2^n + cn,
which is not less than or equal to 2^n. The solution is not intuitive: subtract a low-order term. Omitting some fiddling to get the right one,
T(n) <=? d 2^n - cn - 2c
T(n) = 2 T(n-1) + cn
<= 2 (d 2^(n-1) - c (n-1) - 2c) + cn
= d 2^n - cn - 2c,
and set d to cover the base case. (Given that you want big-O, you don't even need to guess the proper term very accurately.)
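The strengthened inductive bound can be checked numerically; taking T(1) = c, the choice d = 2c covers the base case, and the bound d·2^n - cn - 2c then holds for all n (here it is in fact tight):

```python
c = 1  # assumed constant in the cn term, with T(1) = c

def T(n):
    # T(n) = 2*T(n-1) + c*n
    return c if n == 1 else 2 * T(n - 1) + c * n

d = 2 * c  # chosen so the base case T(1) <= d*2 - c - 2c holds
for n in range(1, 16):
    assert T(n) <= d * 2 ** n - c * n - 2 * c  # the subtracted low-order terms
```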

Analyzing the QuickSort algorithm

I am following an MIT lecture on YouTube about Quick Sort. I got most of the idea but I am stuck of what he said Arithmetic Series in the following point:
Worst-Case:
T(n) = T(n-1) + Theta(n)
He asked, "What is that equals to?"
And then he said it is equal to Theta(n^2)
Why is it equal to Theta(n^2) and not equal to Theta(n)??
It is a sum of arithmetic progression T(n) = T(n-1) + n = n + n-1 + n-2 + ... + 1 = n(n+1)/2 which is in Theta(n^2)
You can also get it with induction, assuming Theta(n) stands for n (for simplicity, can be modified using the same approach):
Hypothesis: T(n) = n(n+1)/2
Base: T(1) = 1*2/2 = 1
Step:
T(n) = T(n-1) + n = (*) (n-1)*n/2 + n =
= (n^2 -n)/2 + 2n/2 = (n^2-n + 2n)/2 =
= (n^2 +n) /2 = n(n+1)/2
(*) induction hypothesis
This shows us T(n) = n(n+1)/2 which is indeed in Theta(n^2).
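The worst case can also be seen directly by counting partition comparisons on already-sorted input with a first-element pivot; `quicksort_compares` below is an illustrative sketch that counts n-1 key comparisons per partition, as in the analysis:

```python
def quicksort_compares(a):
    """Return the key-comparison count of a first-element-pivot quicksort."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    less = [x for x in a[1:] if x < pivot]
    rest = [x for x in a[1:] if x >= pivot]
    # count n-1 comparisons for the partition; on sorted input `less` is always
    # empty, so the recursion follows T(n) = T(n-1) + (n-1)
    return (len(a) - 1) + quicksort_compares(less) + quicksort_compares(rest)

n = 30
assert quicksort_compares(list(range(n))) == n * (n - 1) // 2  # arithmetic series
```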

Recurrence relation - what sum is it?

I need to solve: T(n) = T(n-1) + O(1)
When I find the general form T(n) = T(n-k) + k·O(1),
what sum is it? I mean, when I reach the base case: n-k = 1, so k = n-1.
Is it Σ k for k = 1 to n-1? But the result of that sum is n(n-1)/2, and I know that the result is O(n).
So I know that I don't need that sum for this relation, but what sum is correct for this recurrence relation?
Thanks
If we make the (reasonable) assumption that T(0) = 0 (or T(1) = O(1)), then we can apply your
T(n) = T(n - k) + k⋅O(1) to k = n and obtain
T(n) = T(n - n) + n⋅O(1) = 0 + n⋅O(1) = O(n).
Edit: if you insist on representing the recurrence as a sum, here it is:
T(n) = T(n - 1) + O(1) = T(n - 2) + O(1) + O(1) = ... = Σ_{k=1}^{n} O(1) = n⋅O(1) = O(n)
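For contrast with the quadratic cases above, a one-line check that n steps of O(1) work sum to O(n) (with the hidden constant taken as 1):

```python
def T(n):
    # T(n) = T(n-1) + 1 (one unit of O(1) work per step), with T(0) = 0
    return 0 if n == 0 else T(n - 1) + 1

assert T(500) == 500  # n * O(1) = O(n): the sum is n copies of the same constant
```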

The Recurrence T(n)= 2T(n/2) + (n-1)

I have this recurrence:
T(n)= 2T(n/2) + (n-1)
My try is as follow:
the tree is like this:
T(n) = 2T(n/2) + (n-1)
T(n/2) = 2T(n/4) + ((n/2)-1)
T(n/4) = 2T(n/8) + ((n/4)-1)
...
the height of the tree: (n/2^h) - 1 = 1 ⇒ h = lg n - 1 = lg n - lg 2
the cost of the last level: 2^h = 2^(lg n - lg 2) = (1/2)n
the cost of all levels until level h-1: Σ_{i=0,...,lg(2n)} (n - (2^i - 1)), which is a geometric series and equals (1/2)((1/2)n - 1)
So, T(n) = Θ(n lg n)
my question is: Is that right?
No, it isn't. You have the cost of the last level wrong, so what you derived from that is also wrong.
(I'm assuming you want to find the complexity yourself, so no more hints unless you ask.)
Edit: Some hints, as requested
To find the complexity, one usually helpful method is to recursively apply the equation and insert the result into the first,
T(n) = 2*T(n/2) + (n-1)
= 2*(2*T(n/4) + (n/2-1)) + (n-1)
= 4*T(n/4) + (n-2) + (n-1)
= 4*T(n/4) + 2*n - 3
= 4*(2*T(n/8) + (n/4-1)) + 2*n - 3
= ...
That often leads to a closed formula you can prove via induction (you don't need to carry out the proof if you have enough experience, then you see the correctness without writing down the proof).
Spoiler: You can look up the complexity in almost any resource dealing with the Master Theorem.
This can be easily solved with the Master theorem.
You have a = 2, b = 2, f(n) = n - 1 = Θ(n), and therefore c = log2(2) = 1. Since f(n) = Θ(n^c), this falls into the second case of the Master theorem, which means that the complexity is Θ(n^c log n) = Θ(n log n).
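For powers of two the recurrence has an exact solution, T(n) = n·lg n - n + 1 with T(1) = 0, which confirms the Θ(n log n) bound; a quick check:

```python
def T(n):
    # T(n) = 2*T(n/2) + (n - 1), with T(1) = 0; n assumed a power of two
    return 0 if n == 1 else 2 * T(n // 2) + (n - 1)

# exact solution for n = 2^k: T(n) = n*lg(n) - n + 1, i.e. Theta(n log n)
for k in range(1, 12):
    n = 2 ** k
    assert T(n) == n * k - n + 1
```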
