Complexity analysis for double loops

If I have the following code in .NET:
for i = 0 to n
    for j = 0 to n
        m = i * j
    next j
next i
I have done the following complexity analysis:
Is this correct? Also, in which case could a double loop give O(n) complexity?

In the simplest case, the complexity of two nested loops is equal to the complexity of one loop multiplied by the complexity of the other. The demonstration is almost identical to what you posted.
Therefore, if you want the complexity of the two nested loops to be O(n), you could make one of the loops execute in constant time:
for i = 0 to 10
    for j = 0 to n
        m = i * j
    next j
next i
or you could have each of them execute in O(sqrt(n)):
for i = 0 to sqrt(n)
    for j = 0 to sqrt(n)
        m = i * j
    next j
next i
or other variations of the sort.
A particularly interesting case is one where the number of iterations of the inner loop depends on the counter of the outer loop, but I don't have an example for you.
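For what it's worth, here is one sketch of that idea in C (the doubling step and the bound n = 1000000 are my own choices, not from the original question): the inner loop runs exactly as many times as the current value of the outer counter, yet the total work is 1 + 2 + 4 + ... <= 2n - 1, which is O(n).

#include <stdio.h>

int main(void) {
    long long n = 1000000;    /* example size, chosen arbitrarily */
    long long work = 0;

    /* The outer counter doubles each time; the inner loop's length
       depends on the current value of the outer counter i. */
    for (long long i = 1; i <= n; i *= 2) {
        for (long long j = 0; j < i; j++) {
            work++;    /* O(1) body */
        }
    }

    /* work = 1 + 2 + 4 + ... <= 2n - 1, so the whole nest is O(n). */
    printf("n = %lld, total inner iterations = %lld\n", n, work);
    return 0;
}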

Related

What is the time complexity (big-O) of the following code? It is hard to analyze

int cnt = 0;
for (int i = 1; i < n; i++) {
    for (int j = i; j < n; j++) {
        for (int k = j * j; k < n; k++) {
            ++cnt;
        }
    }
}
I have no idea how to approach it. How do I analyze its time complexity?
It's easy to see that the code is Omega(n²) (that is, at least quadratic): the two outer loops alone execute around n²/2 times.
The inner k loop executes zero times unless j is less than sqrt(n). Even when it runs zero times, evaluating the loop condition still takes some work, so those cases cost O(1) each.
When j is less than sqrt(n), i must also be less than sqrt(n), since by the construction of the loops, j is always greater than or equal to i. In these cases, the k loop does n-j² iterations. We can construct a bound for the total amount of work in this inner loop in these cases: both i and j are less than sqrt(n), and there's at worst O(n) work done in the k loop, so there's at most O(n²) (ie: sqrt(n) * sqrt(n) * n) total work done in the inner loop.
There's also at most O(n²) total work done for the cases where the inner loop is trivial (ie: when j ≥ sqrt(n)).
This gives a proof that the runtime complexity of the code is θ(n²).
Methods involving looking at nested loops individually and constructing big-O bounds for them do not in general give tight bounds, and this is an example question where such a method fails.
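As a rough empirical cross-check of that conclusion (my own sketch, not part of the original answer), one can count cnt for a few values of n: cnt / n² levels off near 1/4, and the two outer loops alone contribute about n²/2 condition checks, both consistent with Theta(n²).

#include <stdio.h>

int main(void) {
    /* Count cnt for several n and print cnt / n^2; the ratio levels
       off near 0.25, consistent with the quadratic analysis above. */
    for (long long n = 1000; n <= 16000; n *= 2) {
        long long cnt = 0;
        for (long long i = 1; i < n; i++)
            for (long long j = i; j < n; j++)
                for (long long k = j * j; k < n; k++)
                    ++cnt;
        printf("n = %6lld  cnt = %10lld  cnt / n^2 = %.4f\n",
               n, cnt, (double)cnt / ((double)n * n));
    }
    return 0;
}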
The first approach would be to look at the loops separately, meaning that we have three O(.) that are connected by a product. Hence,
Complexity of the Algorithm = O(OuterLoop)*O(MiddleLoop)*O(InnerLoop)
Now look at each loop separately:
Outer loop: the simplest one, incrementing i from 1 to n, resulting in O(n).
Middle loop: this is less obvious. The termination condition is still n, but the starting value is i, so the larger i gets, the fewer iterations the loop needs. Asymptotically, however, that factor is only a constant, so the middle loop is still O(n), giving O(n^2) up to and including the second loop.
Inner loop: we see that the iterator's starting value, j * j, increases quadratically. That quadratic increase depends on the second loop, which we said is O(n). Since we again look at the complexity only asymptotically, we can assume that j rises linearly, and since k starts at a quadratically rising value and runs up to n, it takes about sqrt(n) iterations until n is reached. This means the innermost loop is counted as O(sqrt(n)).
Putting all these results together,
O(n * n * sqrt(n)) = O(n^2 * sqrt(n))

Time complexity for a for loop with a multi-variable condition statement

A. What is the complexity (big-O) of the following code fragment?
for (i = 0; (i < n) || (i < m); i++) {
    sequence of statements
}
The best I could come up with is the following:
if n is less than m: O(n)
else: O(m)
I have no clue how to write big-O in the case where there are two variables.
I know this is a very basic, corner-case time complexity question, so I don't mind removing it after I get some clarification.
The time complexity is O(max(m, n)), assuming that the body of the loop is O(1).
You say in the question that it would be O(n) if n < m; that would be the case if the condition on the for loop used an and (&&) clause rather than an or (||) clause. The way it is written, the loop will iterate as long as i is less than the larger of m and n; e.g. if m is a million and n is 0, it is going to iterate a million times. The time complexity scales with the maximum of the two variables.
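To make that concrete, here is a small C sketch (the values n = 0 and m = 1000000 come from the example above; the counting harness is my own) comparing the || condition with an && variant:

#include <stdio.h>

int main(void) {
    int n = 0, m = 1000000;
    long long or_count = 0, and_count = 0;

    /* OR condition: keeps going while i is below EITHER bound,
       so it runs max(m, n) times. */
    for (int i = 0; (i < n) || (i < m); i++)
        or_count++;

    /* AND condition: stops as soon as i reaches either bound,
       so it runs min(m, n) times. */
    for (int i = 0; (i < n) && (i < m); i++)
        and_count++;

    printf("n = %d, m = %d: || runs %lld times, && runs %lld times\n",
           n, m, or_count, and_count);
    return 0;
}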

What is the time complexity and space complexity of the following code

I want to know the time complexity of the following piece of code:
for (i = 0; i < n; i++) {
    for (j = i + 1; j < n; j++) {
        printf("hi");
    }
}
Time complexity in loop
I think the following link is useful for this problem:
http://www.geeksforgeeks.org/analysis-of-algorithms-set-4-analysis-of-loops/
Time complexity is nothing but the number of instructions executed in your program. In your program you have two loops. The outer loop iterates from i = 0 to i = N-1, which is N iterations in total, that is O(N). For each such i, the inner loop iterates again, from j = i+1 to j = N-1.
Hence, the time complexity is O(N^2).
At each iteration of the outer loop, the work done by the inner loop is as follows:
n-1
n-2
...
1
0
So the total work done is (n-1) + (n-2) + ... + 1 + 0, which is the sum of the first n-1 natural numbers, i.e. n*(n-1)/2, so the time complexity is O(n^2). As no auxiliary space is used, the space complexity is O(1).
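If it helps, here is a small C check (my own sketch; it counts iterations instead of printing) comparing the number of inner-loop executions with the closed form n*(n-1)/2:

#include <stdio.h>

int main(void) {
    /* Count how many times the inner body would run and compare it
       with the closed form n*(n-1)/2 derived above. */
    for (long long n = 10; n <= 10000; n *= 10) {
        long long count = 0;
        for (long long i = 0; i < n; i++)
            for (long long j = i + 1; j < n; j++)
                count++;
        printf("n = %6lld  count = %10lld  n*(n-1)/2 = %10lld\n",
               n, count, n * (n - 1) / 2);
    }
    return 0;
}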

Complexity of a double for loop

I am trying to figure out the complexity of a for loop using Big O notation. I have done this before in my other classes, but this one is more rigorous than the others because it is on the actual algorithm. The code is as follows:
for (cnt = 0, i = 1; i <= n; i++)    // for any size n
{
    for (j = 1; j <= i; j++)
    {
        cnt++;
    }
}
AND
for (cnt = 0, i = 1; i <= n; i *= 2)    // for any size n
{
    for (j = 1; j <= i; j++)
    {
        cnt++;
    }
}
I have concluded that the first loop has O(n) complexity because it goes through the list n times. As for the second loop, I am a little lost. I believe it goes through the inner loop i times for each value that is tested, and I have (incorrectly) assumed that this means the loop is O(n*i) each time it is evaluated. Is there anything I'm missing in my assumption? I know that cnt++ is constant time.
Thank you for the help with the analysis. The two loops are separate examples; they are not nested together.
The outer loop of the first example executes n times. For each iteration of the outer loop, the inner loop gets executed i times, so the overall complexity can be calculated as follows: one for the first iteration plus two for the second iteration plus three for the third iteration and so on, plus n for the n-th iteration.
1 + 2 + 3 + 4 + 5 + ... + n = n*(n+1)/2 --> O(n^2)
The second example is trickier: since i doubles every iteration, the outer loop executes only Log2(n) times. Assuming that n is a power of 2, the total for the inner loop is
1+2+4+8+16+...+n
which is 2^(Log2(n)+1) - 1 = 2n - 1, i.e. O(n).
For n that is not a power of two, the exact total is 2^(floor(Log2(n))+1) - 1, which is still O(n):
1 -> 1
2..3 -> 3
4..7 -> 7
8..15 -> 15
16..31 -> 31
32..63 -> 63
and so on.
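As a sanity check on those two closed forms (my own sketch, restricted to n that are powers of two so the second formula applies exactly):

#include <stdio.h>

int main(void) {
    /* Check the closed forms from the analysis above:
       first nest  -> n*(n+1)/2 increments (O(n^2)),
       second nest -> 2n - 1 increments when n is a power of two (O(n)). */
    for (long long n = 1; n <= 1024; n *= 2) {
        long long cnt1 = 0, cnt2 = 0;

        for (long long i = 1; i <= n; i++)      /* first example */
            for (long long j = 1; j <= i; j++)
                cnt1++;

        for (long long i = 1; i <= n; i *= 2)   /* second example */
            for (long long j = 1; j <= i; j++)
                cnt2++;

        printf("n = %5lld  cnt1 = %8lld (n(n+1)/2 = %8lld)  cnt2 = %5lld (2n-1 = %5lld)\n",
               n, cnt1, n * (n + 1) / 2, cnt2, 2 * n - 1);
    }
    return 0;
}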
Hopefully this isn't homework, but I do see that you at least made an attempt here, so here's my take on this:
In the first example, cnt is incremented n*(n+1)/2 times, which makes the whole pair of nested loops O(n^2); the inner loop runs about n/2 times on average, which is O(n).
The first example is O(N^2); the question "What is the Big-O of a nested loop, where number of iterations in the inner loop is determined by the current iteration of the outer loop?" covers it, and the key is to note that the inner loop's number of rounds depends on the outer loop's counter.
The second example is likely O(n log n), since the outer loop counter grows multiplicatively rather than linearly. Look at binary search for an example of a logarithmic complexity case. In the second example, the outer loop is O(log n) and the inner loop is O(n), which combine to give an O(n log n) complexity.

Running time complexity of a code fragment

Here is the fragment:
sum1 = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        sum1++;

sum2 = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= k; j++)
        sum2++;
Below is the answer:
2 assignment statements – O(1) each
1st nested loop – O(n^2)
2nd nested loop – O(n)
Running time complexity of code fragment = O(1) + O(n^2) + O(1) + O(n) = O(n^2)
But here is how I worked it out:
2 assignments:- O(1).
First nested loop: O(n*n)=O(n^2)
Second nested loop:
Outer loop runs n times..
Now the inner loop will be executed (1+2+3+.....+(n-1)+n) times
which gives n(n+1)/2 =O(n^2)
Total running time = O(n^2)+O(n^2)+O(1)=O(n^2)
And yes, I've done some research and came across the following:
In a loop, if the index jumps by an increasing amount in each iteration, the loop has complexity log n.
In that case I suppose the second loop will have complexity (n-1)/2 * log n, which would be equal to O(n log n).
I'm really confused about whether the second loop should be O(n), O(n^2), or O(n log n).
Help, please!
Since k doubles each time, your calculation is not correct. For
for (k = 1; k <= n; k *= 2)
the inner loop runs 1 + 2 + 4 + ... + n/2 + n times in total. That is a geometric series that sums to about 2n, so the second nested loop is O(n), and the overall running time is O(n^2), as the given answer states.
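If it helps to convince yourself, here is a small C check (my own sketch, not from the original thread): the ratio sum2 / n stays below 2 for every n, i.e. sum2 grows linearly rather than like n log n.

#include <stdio.h>

int main(void) {
    /* For the second fragment, print sum2 / n: the ratio stays below 2,
       so sum2 grows linearly, i.e. O(n), not O(n log n). */
    for (long long n = 10; n <= 10000000; n *= 10) {
        long long sum2 = 0;
        for (long long k = 1; k <= n; k *= 2)
            for (long long j = 1; j <= k; j++)
                sum2++;
        printf("n = %9lld  sum2 = %9lld  sum2 / n = %.3f\n",
               n, sum2, (double)sum2 / (double)n);
    }
    return 0;
}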

Resources