How can one determine the time complexity of this loop:
for(int i = N-1; i >= 0; i--)
{
    for(int j = 1; j <= i; j++)
    {
        if(numbers[j-1] > numbers[j])
        {
            temp = numbers[j-1];
            numbers[j-1] = numbers[j];
            numbers[j] = temp;
        }
    }
}
As you may have noticed, this is the bubble sort algorithm. Also, is the frequency count of this algorithm the same for comparisons and assignments?
Calculate the complexity
You need to add up the basic operations/machine instructions that are executed, as a function of the size of the input.
Calculation
for(int i = N-1; i >= 0; i--)            // costs: c1 (init), c2 (test i >= 0), c3 (decrement)
{
    for(int j = 1; j <= i; j++)          // costs: c4 (init), c5 (test j <= i), c6 (increment)
    {
        if(numbers[j-1] > numbers[j])    // cost: c7 (comparison)
        {
            temp = numbers[j-1];
            numbers[j-1] = numbers[j];
            numbers[j] = temp;
        }
    }
}
c1, c2, c3, c4, c5, c6, c7 are the costs of executing the machine instructions corresponding to these constructs (like i >= 0, j <= i, etc.).
Now for i=N-1 the inner loop body is executed N-1 times,
for i=N-2 the inner loop body is executed N-2 times,
....
for i=0 the inner loop body is executed 0 times.
So the inner loop body is executed (N-1)+(N-2)+...+1+0 times, which is
= N*(N-1)/2
Looking carefully, the total cost is
= c1 + c2*(N+1) + (c3+c4)*N + (N*(N-1)/2 + N)*c5 + (N*(N-1)/2)*(c6+c7)
= (c1+c2) + N*(c2 + c3 + c4 + (c5 - c6 - c7)/2) + N^2*(c5 + c6 + c7)/2
= c8 + N*c9 + N^2*c10    [c8, c9, c10 are constants]
Why do we multiply c2 by (N+1)? Because of the final check, when i is actually -1 and the outer condition fails. Similarly, the inner test c5 is evaluated one extra time per outer iteration (when j finally exceeds i), which is why its count is N*(N-1)/2 + N rather than N*(N-1)/2. (The three assignments inside the if contribute at most another N*(N-1)/2 operations, one set per swap, so they only affect the constants, not the growth rate.)
Now for large values of N, N^2 dominates N.
So the time complexity is O(N^2).
So, T(N)=O(N^2)
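If you want to sanity-check the N*(N-1)/2 comparison count, here is a minimal C++ sketch; the test sizes and the reverse-sorted input are arbitrary choices for illustration.

#include <iostream>
#include <vector>

int main() {
    for (int N : {4, 8, 16, 32}) {
        // Reverse-sorted input; note that the comparison count of this
        // version of bubble sort does not depend on the input order.
        std::vector<int> numbers(N);
        for (int k = 0; k < N; k++) numbers[k] = N - k;

        long long comparisons = 0;
        int temp;
        for (int i = N - 1; i >= 0; i--) {
            for (int j = 1; j <= i; j++) {
                comparisons++;                       // one comparison per inner iteration
                if (numbers[j - 1] > numbers[j]) {
                    temp = numbers[j - 1];
                    numbers[j - 1] = numbers[j];
                    numbers[j] = temp;
                }
            }
        }
        std::cout << "N = " << N
                  << ", comparisons = " << comparisons
                  << ", N*(N-1)/2 = " << (long long)N * (N - 1) / 2 << "\n";
    }
}

The printed comparison count should match N*(N-1)/2 exactly for every N.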
The complexity of your current implementation is O(n^2) in both the best and the worst case, and it is the same whether you count only comparisons, only assignments, or both.
Here are the detailed calculations, with K being a constant that depends on which operations you choose to count for the time complexity:
If you want a more efficient bubble sort, check out the pseudocode on the Wikipedia page or this answer; you will find variants with O(n) best-case complexity.
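For reference, the usual early-exit variant looks roughly like the sketch below (my own illustration, not the exact code from either link): it stops as soon as a full pass performs no swap, which is what gives the O(n) best case on already-sorted input.

#include <iostream>
#include <vector>

// Bubble sort with an early-exit flag: if a whole pass performs no swap,
// the array is already sorted and we can stop. Best case (sorted input)
// is a single pass, i.e. O(n); the worst case is still O(n^2).
void bubbleSort(std::vector<int>& numbers) {
    int N = (int)numbers.size();
    for (int i = N - 1; i >= 0; i--) {
        bool swapped = false;
        for (int j = 1; j <= i; j++) {
            if (numbers[j - 1] > numbers[j]) {
                int temp = numbers[j - 1];
                numbers[j - 1] = numbers[j];
                numbers[j] = temp;
                swapped = true;
            }
        }
        if (!swapped) break;   // no swap in this pass: already sorted
    }
}

int main() {
    std::vector<int> v = {5, 1, 4, 2, 8};
    bubbleSort(v);
    for (int x : v) std::cout << x << ' ';
    std::cout << '\n';
}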
Related
I have seen that in some cases the complexity of nested loops is O(n^2), but I was wondering in which cases we can have the following complexities of nested loops:
O(n)
O(log n): I have seen a case like this somewhere, but I do not recall the exact example.
Is there any kind of formula or trick for calculating the complexity of nested loops? Sometimes when I apply summation formulas I do not get the right answer.
Some examples would be great, thanks.
Here is an example for you where the time complexity is O(n), but you have a double loop:
int cnt = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        cnt += 1;
    }
}
You can prove the complexity in the following way:
On the first iteration, the j loop runs N times; on the second, N / 2 times; on the i-th iteration, N / 2^i times.
So in total: N * (1 + 1/2 + 1/4 + 1/8 + ...) < 2 * N, which is O(N).
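If you want to verify the bound empirically, a small self-contained C++ program that counts the iterations and prints them next to 2*N might look like this (the sample values of N are arbitrary):

#include <iostream>

int main() {
    for (int N : {10, 100, 1000, 10000}) {
        long long cnt = 0;
        for (int i = N; i > 0; i /= 2) {
            for (int j = 0; j < i; j++) {
                cnt += 1;
            }
        }
        // cnt = N + N/2 + N/4 + ... (integer division), always below 2*N
        std::cout << "N = " << N << ", cnt = " << cnt
                  << ", 2*N = " << 2LL * N << "\n";
    }
}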
It would be tempting to say that something like this runs in O(log(n)):
int cnt = 0;
for (int i = 1; i < N; i *= 2) {
    for (int j = 1; j < i; j *= 2) {
        cnt += 1;
    }
}
But I believe that this actually runs in O(log^2(N)), which is polylogarithmic rather than logarithmic.
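A quick way to check that claim is to count the iterations and compare them with log2(N) * (log2(N) - 1) / 2, which is what the nested doubling loops work out to when N is a power of two; a sketch with arbitrary sample sizes:

#include <iostream>
#include <cmath>

int main() {
    for (int N : {16, 256, 4096, 65536}) {   // powers of two
        long long cnt = 0;
        for (int i = 1; i < N; i *= 2) {
            for (int j = 1; j < i; j *= 2) {
                cnt += 1;
            }
        }
        double m = std::log2(N);
        // For N = 2^m the count is m*(m-1)/2, which is Theta((log N)^2)
        std::cout << "N = " << N << ", cnt = " << cnt
                  << ", m*(m-1)/2 = " << m * (m - 1) / 2 << "\n";
    }
}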
I am having trouble understanding the answer to the following question about analyzing the two algorithms below.
for (int i = n; i >= 1; i = i/2) {
    for (int j = 1; j <= n; j++) {
        //do something
    }
}
The algorithm above has a complexity of O(n) according to the answers. Shouldn't it be lower, since the outer loop always halves the amount we have to go through? I thought that it should be something along the lines of O(n/2 * )?
for (int j = 1; j <= n; j++) {
    for (int i = n; i >= j; i = i / 2) {
        //do something
    }
}
This one is O(n log n), if I am correct?
The first iteration will execute n steps, the second will execute n/2, the third will execute n/4, and so on.
If you compute the sum of n/(2^i) for i = 0..log n, you get roughly 2n, and that is why it is O(n).
If you take n out of the summation and sum only the 1/(2^i) part, you will get 2. Take a look at an example:
1 + 1/2 + 1/4 + 1/8 + 1/16 + ... = 1 + 0.5 + 0.25 + 0.125 + 0.0625 + ... = 2
Each successive term is half the previous one, so the sum never exceeds 2.
You are right with the second nested loop example - it is O(n log n).
EDIT:
After the comment from ringø I re-read the question, and in fact the algorithm is different from what I understood. ringø is right: the algorithm as described in the question is O(n log n). However, judging from the context, I think the OP meant an algorithm where the inner loop is tied to i and not to n.
This answer relates to the following algorithm:
for (int i = n; i >= 1; i = i/2) {
    for (int j = 1; j <= i; j++) {
        //do something
    }
}
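To make the difference concrete, here is a small counting sketch (my own illustration, with arbitrary sample sizes) that runs both variants side by side: the one written in the question, with the inner bound n, and the one this answer assumes, with the inner bound i.

#include <iostream>

int main() {
    for (int n : {16, 256, 4096}) {
        long long a = 0, b = 0;

        // Variant as written in the question: the inner loop always runs n times,
        // the outer loop runs about log2(n) times  ->  O(n log n) in total.
        for (int i = n; i >= 1; i = i / 2)
            for (int j = 1; j <= n; j++)
                a++;

        // Variant where the inner bound is i: n + n/2 + n/4 + ...  ->  O(n) in total.
        for (int i = n; i >= 1; i = i / 2)
            for (int j = 1; j <= i; j++)
                b++;

        std::cout << "n = " << n
                  << ", inner bound n: " << a
                  << ", inner bound i: " << b << "\n";
    }
}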
What is the order of growth of the worst case running time of the following code fragment as a function of N?
int cnt = 0;
for (int i = 1; i <= N; i = i*4)
    for (int j = 0; j < i; j++)
        { cnt++; }
I know, for example, that the first loop executes ~log_4(N) times and the second loop executes ~N times. But how do I combine this knowledge to find the answer?
What is the general way to find that kind of complexity?
Maybe we need to know how many times the body of the inner loop is executed?
For example 1 + 4 + 16 + 64 + ... + N
This is a geometric progression. The sum of 1 + x + x^2 + ... + x^n is (x^(n+1) - 1)/(x - 1); here x = 4 and n = log_4(N), so the result is
(4 * 4^(log_4 N) - 1)/(4 - 1) = (4N - 1)/3
More precisely, let N belong to the interval [4^k, 4^(k+1)); then we get the sum
sum of 4^i for i = 0..k = (4^(k+1) - 1)/3 = O(N)
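You can confirm the (4N - 1)/3 formula by running the fragment for a few powers of four (a small sketch; the specific values of N are arbitrary):

#include <iostream>

int main() {
    for (int N : {4, 64, 1024, 65536}) {     // powers of four
        long long cnt = 0;
        for (int i = 1; i <= N; i = i * 4)
            for (int j = 0; j < i; j++)
                cnt++;
        // For N = 4^k the count is 1 + 4 + ... + 4^k = (4^(k+1) - 1)/3 = (4N - 1)/3
        std::cout << "N = " << N << ", cnt = " << cnt
                  << ", (4N - 1)/3 = " << (4LL * N - 1) / 3 << "\n";
    }
}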
Can someone please explain how the worst-case running time is O(N) and not O(N^2) in the following exercise? There is a double for loop, where for every i we compare j to i, do sum++, then increment j and repeat until we reach N.
What is the order of growth of the worst case running time of the following code fragment
as a function of N?
int sum = 0;
for (int i = 1; i <= N; i = i*2)
    for (int j = 0; j < i; j++)
        sum++;
Question Explanation
The answer is: N
The body of the inner loop is executed 1 + 2 + 4 + 8 + ... + N ~ 2N times.
I think you already stated the answer in your question: the inner loop body is executed about 2N times, which is O(N). In asymptotic (big-O) notation constant multiples are dropped, because for very large values the graph of 2N looks just like N, so the factor isn't considered significant. In this case, the complexity of the problem equals the number of times sum++ is called, because the algorithm is so simple. Does that make sense?
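If it helps, you can also just run the fragment for a few values of N and watch the count stay below 2N (a throwaway sketch; the sample sizes are arbitrary):

#include <iostream>

int main() {
    for (int N : {8, 128, 2048, 32768}) {    // powers of two
        long long sum = 0;
        for (int i = 1; i <= N; i = i * 2)
            for (int j = 0; j < i; j++)
                sum++;
        // 1 + 2 + 4 + ... + N = 2N - 1 when N is a power of two
        std::cout << "N = " << N << ", sum = " << sum
                  << ", 2N = " << 2LL * N << "\n";
    }
}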
Complexity doesn't depend on the number of nested loops alone; two nested loops do not automatically mean O(N^2).
The time complexity of nested loops is equal to the number of times the innermost statement is executed. For example, the following sample loops do have O(N^2) time complexity:
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
for (int i = n; i > 0; i -= c) {
    for (int j = i+1; j <= n; j += c) {
        // some O(1) expressions
    }
}
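A counting sketch (my own illustration, using an arbitrary fixed constant c = 1) shows that both fragments grow quadratically in n: the first performs (n/c)^2 innermost iterations and the second roughly n^2/(2c^2).

#include <iostream>

int main() {
    const int c = 1;                      // any fixed positive constant
    for (int n : {100, 200, 400}) {
        long long first = 0, second = 0;

        for (int i = 1; i <= n; i += c)
            for (int j = 1; j <= n; j += c)
                first++;                  // runs (n/c)^2 times

        for (int i = n; i > 0; i -= c)
            for (int j = i + 1; j <= n; j += c)
                second++;                 // runs roughly n^2 / (2*c^2) times

        std::cout << "n = " << n
                  << ", first = " << first
                  << ", second = " << second << "\n";
    }
}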
I have got a program and am trying to compute its complexity. I want to be sure I am not mistaken.
for(int i=4; i<=n; i=i*4)
{
    cout<<"counter for first loop: "<<++count1<<endl;
    for(int j=i;j>=0;j=j-4)
    {
        cout<<"counter for second loop: "<<++count2<<endl;
        for(int k=0;k<=n;k++)
        {
            cout<<"counter for third loop: "<<++count3<<endl;
        }
    }
}
Here, the complexity of the third loop is O(n); together with the second loop the complexity becomes O(n * log_4(i)), and the complexity of the whole program is O(n * (log_4(i))^2). Am I right in my answer? Thanks.
The complexity of the innermost loop is O(n). For a given i, the middle loop runs i/4 + 1 times, which is O(i). The outermost loop runs O(log_4(n)) times, with i taking the values 4, 16, 64, ..., up to n. Summed over all outer iterations, the middle loop therefore runs roughly 1 + 4 + 16 + ... + n/4 times in total, a geometric series that adds up to O(n). Each of those middle iterations executes the innermost O(n) loop, so the total complexity of the code is O(n) * O(n) = O(n^2), not O(n * (log_4(n))^2).
You can proceed formally as follows:
Executing this fragment:
sum = 0;
for( i = 4 ; i <= n; i = i * 4 ) {
    for( j = i ; j >= 0 ; j = j - 4 ) {
        for( k = 0 ; k <= n ; k ++ ) {
            sum ++;
        }
    }
}
We obtain:
The results are exactly compatible with the formula above.
Besides, summed over all iterations of the outer loop, the middle loop performs O(n) iterations in total, and each of them runs the innermost O(n) loop, which means that, taken together, we get O(n²).
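If you want to see the quadratic growth directly, you can wrap the fragment in a small driver and print the counter for a few values of n (a sketch; the chosen values are arbitrary):

#include <iostream>

int main() {
    for (int n : {64, 256, 1024}) {
        long long sum = 0;
        for (int i = 4; i <= n; i = i * 4)
            for (int j = i; j >= 0; j = j - 4)
                for (int k = 0; k <= n; k++)
                    sum++;
        // sum grows proportionally to n^2 (roughly n^2/3 when n is a power of four)
        std::cout << "n = " << n << ", sum = " << sum
                  << ", n*n = " << (long long)n * n << "\n";
    }
}

Quadrupling n multiplies the printed sum by roughly 16, which is exactly the quadratic behaviour.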