What time complexity will the following code have with respect to the parameter size? Motivate your answer.
// Process(A, N) is O(sqrt(N)).
Function Complex(array[], size) {
    if (size == 1) return 1;
    if (rand() / float(RAND_MAX) < 0.1) {
        return Process(array, size*size)
             + Complex(array, size/2)
             + Process(array, size*size);
    }
}
I think it is O(N), because if Process(A, N) is O(sqrt(N)), then Process(A, N*N) should be O(N), and Complex(array, size/2) is O(log(N)) because it halves the size every time it runs. So one run takes O(N) + O(log(N)) + O(N) = O(N).
Please correct me and give me some hints on how I should think about / proceed with an assignment like this.
I appreciate all help and thanks in advance.
The time complexity of the algorithm is O(N) indeed, but for a different reason.
The complexity of the function can be denoted as T(n) where:
T(n) = T(n/2) + 2*n

where T(n/2) is the recursive invocation and 2*n accounts for the two calls to Process(arr, n*n), each of which is O(sqrt(n*n)) = O(n).
This recurrence is well known to solve to O(n):
T(n) = T(n/2) + 2*n =
     = T(n/4) + 2*n/2 + 2*n =
     = T(n/8) + 2*n/4 + 2*n/2 + 2*n
     = ...
     = 2*n/2^log(n) + ... + 2*n/2 + 2*n
     < 4*n
The added terms form a geometric series whose sum is less than 2 * (2*n) = 4*n, so T(n) is in O(n).
Let's prove it formally, using mathematical induction:
Base: T(1) < 4 (holds).
Hypothesis: the claim T(k) < 4*k holds for every k < n.
Step: for n,
T(n) = T(n/2) + 2*n
     < 4*(n/2) + 2*n    (by the induction hypothesis)
     = 2*n + 2*n
     = 4*n
Conclusion: T(n) is in O(n).
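If you want to sanity-check the bound empirically, here is a minimal Python sketch (assuming a base value of T(1) = 1; any constant works) that evaluates the recurrence and watches the ratio T(n)/n stay below the proven bound of 4:

def T(n):
    if n <= 1:
        return 1            # assumed base case T(1) = 1
    return T(n // 2) + 2 * n

for n in [10, 100, 10**4, 10**6]:
    print(n, T(n), T(n) / n)   # the ratio never reaches 4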
Related
How to get big-O for this?
T(N) = 2T(N − 1) + N, T(1) = 2
I came up with two candidate answers, O(2^N) and O(N^2), but I am not sure how to solve it correctly.
Divide T(N) by 2^N and name the result:
S(N) = T(N)/2^N
From the definition of T(N) we get
S(N) = S(N-1) + N/2^N (eq.1)
meaning that S(N) increases, but quickly converges to a constant (since the series with terms N/2^N converges). So,
T(N)/2^N -> constant
or
T(N) = O(2^N)
Detailed proof
In the comment below, Paul Hankin suggests how to complete the proof. Take eq. 1 and sum from N = 2 to N = M:

sum_{N=2}^{M} S(N) = sum_{N=2}^{M} S(N-1) + sum_{N=2}^{M} N/2^N
                   = sum_{N=1}^{M-1} S(N) + sum_{N=2}^{M} N/2^N

Thus, after canceling the terms with indexes N = 2, 3, ..., M-1 on both sides, we get

S(M) = S(1) + sum_{N=2}^{M} N/2^N

and since the series on the right converges (its terms are eventually bounded by 1/N^2, and sum 1/N^2 is known to converge), S(M) converges to a finite constant.
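For this particular recurrence the limit can even be computed explicitly (a small worked addition, using the known identity sum_{N=1}^{inf} N/2^N = 2): since S(1) = T(1)/2 = 1,

S(M) -> 1 + (2 - 1/2) = 5/2

so T(N) -> (5/2)*2^N. Indeed, one can check that the exact solution of T(N) = 2T(N-1) + N with T(1) = 2 is T(N) = (5/2)*2^N - N - 2.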
It's a math problem and Leandro Caniglia is right.
Let b(n) = T(n)/2^n. Then

b(n) = b(n-1) + n/2^n = b(n-2) + n/2^n + (n-1)/2^(n-1) = ...

The series with terms i/2^i converges (to 2), so its partial sums are bounded by some constant.
Thus b(n) < C,
and thus T(n) < 2^n * C.
It is also clear that T(n) >= 2^n (since T(1) = 2 and T(n) > 2*T(n-1)).
So T(n) is O(2^n) (in fact Θ(2^n)).
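A quick numerical check of this argument (a Python sketch; the starting value T(1) = 2 is taken from the question):

# T(N) = 2*T(N-1) + N with T(1) = 2; the ratio T(N)/2^N converges.
T = 2
for N in range(2, 41):
    T = 2 * T + N
    if N % 10 == 0:
        print(N, T / 2**N)   # approaches 2.5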
Check by plugging each candidate answer into the equation. For T(N) = 2^N:

2^N = 2*2^(N-1) + N = 2^N + N

and for T(N) = N^2:

N^2 = 2*(N-1)^2 + N

Keeping only the dominant terms, you have

2^N ~ 2^N

or

N^2 ~ 2*N^2.

Conclude.
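The same check can be done numerically (a Python sketch, not part of the original answer): plug each candidate into the right-hand side of the recurrence and compare it with the candidate itself.

# Candidate 2^N: the ratio tends to 1, so it is consistent.
# Candidate N^2: the ratio tends to 2, so it fails.
for N in (10, 20, 30):
    print(N, (2 * 2**(N - 1) + N) / 2**N, (2 * (N - 1)**2 + N) / N**2)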
I was reading a question about a time-complexity calculation on SO, but I can't comment there (not enough reputation).
What's the time complexity of this algorithm for Palindrome Partitioning?
I have a question regarding going from 1st to 2nd equation here:
Now you can write the same expression for H(n-1), then substitute back
to simplify:
H(n) = 2 H(n-1) + O(n) =========> Eq.1
And this solves to
H(n) = O(n * 2^n) =========> Eq.2
Can someone illustrate how he got Eq.2 from Eq.1? Thank you.
Eq. 1 is a recurrence relation. See the link for a tutorial on how to solve these types of equations, but we can solve it via expansion as below:
H(n) = 2H(n-1) + O(n)
H(n) = 2*2H(n-2) + 2*O(n-1) + O(n)
H(n) = 2*2*2H(n-3) + 2*2*O(n-2) + 2*O(n-1) + O(n)
...
H(n) = 2^(n-1)*H(1) + 2^(n-2)*O(2) + ... + 2*O(n-1) + O(n)
since H(1) = O(n) (see the original question)
H(n) = 2^(n-1)*O(n) + 2^(n-2)*O(2) + ... + 2*O(n-1) + O(n)
H(n) = O(n * 2^n)
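To spell out the last step (an added detail, not in the original answer): each O(n-i) term is at most c*n for some constant c, so

H(n) <= c*n*2^(n-1) + c*n*(2^(n-2) + ... + 2 + 1)
     = c*n*(2^(n-1) + 2^(n-1) - 1)
     < c*n*2^n

which is O(n * 2^n).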
We need to homogenize the equation, in this simple case just by adding a constant to each side. First, designate O(n) = K to avoid dealing with the O notation at this stage:
H(n) = 2 H(n-1) + K
Then add a K to each side:
H(n) + K = 2 (H(n-1) + K)
Let G(n) = H(n) + K, then
G(n) = 2 G(n-1)
This is a well-known homogeneous first-order recurrence, with the solution

G(n) = G(0)*2^n = G(1)*2^(n-1)

Since H(1) = O(n), G(1) = H(1) + K = O(n) + O(n) = O(n), so

G(n) = O(n)*2^(n-1) = O(n*2^(n-1)) = O(n*2^n)

and

H(n) = G(n) - K = O(n*2^n) - O(n) = O(n*2^n)
They are wrong.
Let's assume that O refers to a tight bound and substitute O(n) with c*n for some constant c. Unrolling the recursion you will get:

T(n) = 2*T(n-1) + c*n
     = 2^i * T(n-i) + c * (n + 2*(n-1) + 4*(n-2) + ... + 2^(i-1)*(n-i+1))

When you finish unrolling the recursion, i = n and the leading term is 2^n * b, where b = T(0). Now finding the sum:

sum_{j=0}^{n-1} 2^j * (n-j) = 2^n * sum_{k=1}^{n} k/2^k < 2 * 2^n

Summing up you will get T(n) < b*2^n + 2*c*2^n. So now it is clear that T(n) is O(2^n), without any n factor.
For people who are still skeptical about the math:
solution to F(n) = 2F(n-1) + n
solution to F(n) = 2F(n-1) + 99n
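If you prefer to see it numerically, here is a Python sketch (taking H(1) = 1 and c = 1 for concreteness): the ratio H(n)/2^n settles to a constant, with no n factor.

# H(n) = 2*H(n-1) + n, H(1) = 1 (assumed); H(n)/2^n converges (here to 2).
H = 1
for n in range(2, 41):
    H = 2 * H + n
    if n % 10 == 0:
        print(n, H / 2**n)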
I am trying to find the time complexity of selection sort, which has the recurrence T(n) = T(n-1) + O(n).
First I supposed it is T(n) = T(n-1) + n, since n is easier to work with.
I figured T(n-1) = T(n-2) + (n-1)
and T(n-2) = T(n-3) + (n-2).
This makes T(n) = (T(n-3) + (n-2)) + (n-1) + n, so T(n) = T(n-3) + 3n - 3.
With k instead of 3: T(n) = T(n-k) + kn - (1 + 2 + ... + (k-1)) = T(n-k) + kn - k*(k-1)/2, and because n-k >= 0 we can take n-k = 0, i.e. n = k. Back in the equation this gives T(n) = T(0) + n^2 - n*(n-1)/2, which is C + n*(n+1)/2, so it is O(n^2). Is what I did right?
Yes, your solution is correct. You are summing O(n) + O(n-1) + O(n-2) + ... and arriving at O(n^2). You can apply O(n) + O(n-1) = O(n), but only a bounded number of times; in a series of n terms it is different:

T(n) = sum_{i=0}^{n} O(n-i)

Ignoring the i inside the O(), each term is at most O(n) and there are n+1 terms, so the result is O(n^2).
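A one-line check of this sum (a Python sketch, assuming T(0) = 0 and replacing each O(n-i) with n-i):

# T(n) = T(n-1) + n unrolls to 1 + 2 + ... + n = n*(n+1)/2.
T = 0
for n in range(1, 11):
    T += n
    assert T == n * (n + 1) // 2
print(T)   # 55 for n = 10, i.e. quadratic growth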
The recurrence relation you gave, T(n) = T(n-1) + O(n), is correct for selection sort, which has overall time complexity O(n^2). Check this link to verify.
In selection sort:
In iteration i, we find the index min of the smallest remaining entry,
and then swap a[i] and a[min].
As such, selection sort uses
(n-1) + (n-2) + ... + 2 + 1 + 0 = n*(n-1)/2 = O(n^2) compares
and exactly n exchanges (swaps).
FROM ABOVE
And from the recurrence relation given above:
=> T(n) = T(n-1) + O(n)
=> T(n) = T(n-1) + cn, where c is some positive constant
=> T(n) = cn + c(n-1) + T(n-2)
=> T(n) = cn + c(n-1) + c(n-2) + T(n-3)
And this goes on until we finally get
=> T(n) = cn + c(n-1) + c(n-2) + ... + c    (n terms in total)
=> T(n) = c*n*(n+1)/2
=> T(n) = O(n^2)
EDIT
It's often easier to replace O(n) with cn, where c is some constant; it helps in visualizing the equation.
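To tie the recurrence back to the actual algorithm, here is a small instrumented selection sort in Python (a sketch; the counter name is mine) that performs exactly n*(n-1)/2 comparisons:

def selection_sort(a):
    compares = 0
    n = len(a)
    for i in range(n - 1):
        m = i                      # index of the smallest remaining entry
        for j in range(i + 1, n):
            compares += 1
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]    # swap it into place
    return compares

n = 100
print(selection_sort(list(range(n, 0, -1))), n * (n - 1) // 2)   # both 4950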
I need to solve: T(n) = T(n-1) + O(1).
When I find the general form T(n) = T(n-k) + k*O(1), what sum is it? I mean, when I reach the base case, n-k = 1, so k = n-1.
Is it "sum of k, for k = 1 to n"? But the result of that sum is n*(n+1)/2, and I know that the result should be O(n).
So I know that I don't need a sum like that with this relation, but what sum is correct for this recurrence relation?
Thanks
If we make the (reasonable) assumption that T(0) = 0 (or T(1) = O(1)), then we can apply your
T(n) = T(n - k) + k⋅O(1) with k = n and obtain
T(n) = T(n - n) + n⋅O(1) = 0 + n⋅O(1) = O(n).
Edit: if you insist on representing the recurrence as a sum, here it is:
T(n) = T(n - 1) + O(1) = T(n - 2) + O(1) + O(1) = ... = sum_{k=1}^{n} O(1) = n⋅O(1) = O(n)
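The same unrolling can be watched directly (a Python sketch, assuming T(0) = 0): one unit of O(1) work per level gives exactly n units in total, and no quadratic sum appears.

def steps(n):
    if n == 0:
        return 0              # base case T(0) = 0
    return steps(n - 1) + 1   # one O(1) unit per recursion level

print(steps(500))   # prints 500, i.e. O(n)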
I'm not familiar with the mathematics behind analyzing this algorithm and could use some help.
Algorithm:

fibonacci(n):
    if n < 2 then
        return n
    else
        return fibonacci(n-1) + fibonacci(n-2)
The cost of each statement, for n < 2 and for n >= 2:

Statement                     Time (n < 2)   Time (n >= 2)
----------------------------  -------------  -----------------------
n < 2                         O(1)           O(1)
return n                      O(1)           -
return fib(n-1) + fib(n-2)    -              T(n-1) + T(n-2) + O(1)
----------------------------  -------------  -----------------------
Total                         O(1)           T(n-1) + T(n-2) + O(1)

So:

T(n) = O(1)                      if n < 2
T(n) = T(n-1) + T(n-2) + O(1)    if n >= 2
I think you're supposed to notice that the recurrence relation for this function is awfully familiar. You can learn exactly how fast this awfully familiar recurrence grows by looking it up by name.
However, if you fail to make the intuitive leap, you can try to bound the runtime using a simplified problem. Essentially, you modify the algorithm in ways guaranteed to increase the runtime while making it simpler. Then you figure out the runtime of the new algorithm, which gives you an upper bound.
For example, this algorithm must take longer and is simpler to analyze:
F(n): if n<2 then return n else return F(n-1) + F(n-1)
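To see the effect of this simplification, here is a Python sketch (function names are mine) that counts the calls made by both versions; the simplified one always makes at least as many calls, and about 2^n of them:

def count_fib(n):
    if n < 2:
        return 1
    return 1 + count_fib(n - 1) + count_fib(n - 2)

def count_upper(n):              # the F(n-1) + F(n-1) variant
    if n < 2:
        return 1
    return 1 + 2 * count_upper(n - 1)

for n in (10, 20):
    print(n, count_fib(n), count_upper(n), 2**n)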
By induction: if the calculation of fib(k) takes less than C*2^k for all k < n, then for the calculation time of fib(n) we get

T(n) = T(n-1) + T(n-2) + K < C*2^(n-1) + C*2^(n-2) + K
     = 0.75*C*2^n + K < C*2^n

for sufficiently big C (for C > K/0.25, as 2^n >= 1). This proves that T(n) < C*2^n, i.e. T(n) = O(2^n).
(Here T(n) is the time for the calculation of fib(n), and K is the constant time needed to compute fib(n) once fib(n-1) and fib(n-2) are already calculated.)
You need to solve the recurrence equation:
T(0) = 1
T(1) = 1
T(n) = T(n-1) + T(n-2), for all n > 1
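For reference, the solution of this recurrence grows like the Fibonacci numbers themselves, i.e. as phi^n with phi = (1 + sqrt(5))/2 ≈ 1.618, so T(n) = O(2^n), with Θ(phi^n) as the tight bound. A quick numerical check (a Python sketch; T(0) = T(1) = 1 as given above):

phi = (1 + 5**0.5) / 2
a, b = 1, 1                      # T(0), T(1)
for n in range(2, 31):
    a, b = b, a + b
    if n % 10 == 0:
        print(n, b / phi**n)     # ratio settles near 0.72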