Big O time complexity for nested j = i + 1 loop

Can anyone please let me know what the big O time complexity would be for the following piece of code:
for (int i = 0; i < array.length - 1; i++) {
    for (int j = i + 1; j < array.length; j++) {
        // do something
    }
}
It can't be O(n^2) since j = i + 1, can it? Thanks!

There are n-1 iterations of the outer loop. On each iteration, the inner loop iterates n-i-1 times. So in total the inner loop iterates n-1 + n-2 + ... + 1 times. So the number of times that do something executes is equal to the sum of the numbers from 1 to n-1. That sum is n*(n-1)/2, which is in Theta(n^2) and thus also in O(n^2).
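To make the triangular count concrete, here is a quick empirical check of that n*(n-1)/2 formula (the class and method names are just for illustration):

```java
// Empirically confirm that the j = i + 1 pattern executes n*(n-1)/2 times.
class TriangleCount {
    // Returns how many times "do something" runs for an array of length n.
    static long countIterations(int n) {
        long count = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = i + 1; j < n; j++) {
                count++; // stands in for "do something"
            }
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 1000; n++) {
            long expected = (long) n * (n - 1) / 2;
            if (countIterations(n) != expected) {
                throw new AssertionError("mismatch at n = " + n);
            }
        }
    }
}
```

Intuitively, the body runs once per unordered pair (i, j) of distinct indices, and there are exactly n-choose-2 = n*(n-1)/2 such pairs.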

Related

Big O complexity on dependent nested loops

Can I get some help in understanding how to solve this tutorial question? I still do not understand my professor's explanation. I am unsure of how to count the big O for the third/innermost loop. She explains that the answer for this algorithm is O(n^2), and that the 2nd and 3rd loops have to be seen as one loop with a big O of O(n). Can someone please explain the big O notation for the 2nd/3rd loops in basic layman's terms?
Assuming n = 2^m
for (int i = n; i > 0; i--) {
    for (int j = 1; j < n; j *= 2) {
        for (int k = 0; k < j; k++) {
        }
    }
}
As far as I understand, the first loop has a big O of O(n) and the second loop is O(log n), but I am not sure how to count the third loop, since its bound j doubles on every iteration of the second loop.
Let's add a print statement to the innermost loop:
for (int j = 1; j < n; j *= 2) {
    for (int k = 0; k < j; k++) {
        System.out.print("1 ");
    }
}
The output, for successive values of j:
j = 1:   1
j = 2:   1 1
j = 4:   1 1 1 1
...
j = n/2: 1 1 1 ... (n/2 times)
Since k runs from 0 to j - 1, each pass prints exactly j ones, and for n = 2^m the last value j takes is n/2.
The question boils down to how many 1s this will print. That number is
1 + 2 + 4 + ... + n/2
= n - 1 (a geometric series, with n = 2^m)
= O(n).
So the second and third loops together do O(n) work; multiply by the n iterations of the outer loop and you get the professor's O(n^2).
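If you want to sanity-check that count without the print statements, a short driver (class name is just for illustration) can count the iterations of the inner two loops directly:

```java
// Count how many times the innermost body of the j/k loop pair runs.
class InnerPairCount {
    static long countInnerPair(int n) {
        long count = 0;
        for (int j = 1; j < n; j *= 2) {
            for (int k = 0; k < j; k++) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // For n = 2^m, the geometric sum 1 + 2 + 4 + ... + n/2 equals n - 1.
        for (int m = 1; m <= 20; m++) {
            int n = 1 << m;
            if (countInnerPair(n) != n - 1) {
                throw new AssertionError("mismatch at n = " + n);
            }
        }
    }
}
```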
O-notation is an upper bound, so you can certainly say it is O(n^2). In fact that bound is tight: the innermost loop runs j times, and summing j over j = 1, 2, 4, ..., n/2 is a geometric series equal to n - 1. So the two inner loops together are Theta(n), which is why your teacher says to view the second and third loops as O(n) combined; with the outer loop, the whole thing is Theta(n^2).
for (int i = n; i > 0; i--) {          // runs n times
    for (int j = 1; j < n; j *= 2) {   // runs at most log2(n), i.e. m, times
        for (int k = 0; k < j; k++) {  // runs j times for the current value of j
        }
    }
}
For a fixed i, the two inner loops together perform 1 + 2 + 4 + ... + n/2 = n - 1 steps (a geometric series), not log^2(n) steps. Hence the overall complexity is n * (n - 1) = Theta(n^2): O(n^2) is not just an upper bound here, it is tight.
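As a cross-check of the O(n^2) answer, here is a hypothetical driver that counts every iteration of the full triple loop (the class name is an assumption for this sketch):

```java
// Count every execution of the innermost body of the professor's triple loop.
class TripleLoopCount {
    static long countTriple(int n) {
        long count = 0;
        for (int i = n; i > 0; i--) {
            for (int j = 1; j < n; j *= 2) {
                for (int k = 0; k < j; k++) {
                    count++;
                }
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // Outer loop runs n times; the inner pair contributes n - 1 steps each
        // time (for n = 2^m), so the total is exactly n * (n - 1) = Theta(n^2).
        int n = 1 << 10;
        if (countTriple(n) != (long) n * (n - 1)) throw new AssertionError();
    }
}
```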

some examples of algorithm complexity of nested loops?

I have seen that in some cases the complexity of nested loops is O(n^2), but I was wondering in which cases we can have the following complexities of nested loops:
O(n)
O(log n): I have seen a case like this somewhere, but I do not recall the exact example.
I mean, is there any kind of formula or trick to calculate the complexity of nested loops? Sometimes when I apply summation formulas I do not get the right answer.
Some examples would be great, thanks.
Here is an example for you where the time complexity is O(n), but you have a double loop:
int cnt = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        cnt += 1;
    }
}
You can prove the complexity in the following way:
On the first iteration of the outer loop, the j loop runs N times. On the second iteration, it runs N/2 times. In general, on iteration number i (counting from zero), the j loop runs N/2^i times.
So in total: N * ( 1 + 1/2 + 1/4 + 1/8 + … ) < 2 * N = O(N)
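That geometric-series bound is easy to verify empirically; this sketch (class name is illustrative) checks that the count never reaches 2N:

```java
// Verify that the halving double loop does fewer than 2N iterations in total.
class HalvingCount {
    static long count(int N) {
        long cnt = 0;
        for (int i = N; i > 0; i /= 2) {
            for (int j = 0; j < i; j++) {
                cnt += 1;
            }
        }
        return cnt;
    }

    public static void main(String[] args) {
        // For N a power of two the count is exactly N + N/2 + ... + 1 = 2N - 1,
        // and for any N it stays strictly below 2N.
        for (int N = 1; N <= (1 << 20); N = N * 3 + 1) {
            if (count(N) >= 2L * N) throw new AssertionError("bound violated at N = " + N);
        }
        if (count(1024) != 2047) throw new AssertionError();
    }
}
```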
It would be tempting to say that something like this runs in O(log(n)):
int cnt = 0;
for (int i = 1; i < N; i *= 2) {
    for (int j = 1; j < i; j *= 2) {
        cnt += 1;
    }
}
But this one actually runs in O(log^2(N)) time, which is polylogarithmic: the outer loop runs about log(N) times, and on its t-th iteration the inner loop runs t times, for roughly log^2(N)/2 steps in total.
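For N = 2^m the count comes out exactly; this illustrative check (class name assumed) confirms cnt = m(m-1)/2:

```java
// Count the iterations of the doubly-logarithmic double loop.
class PolylogCount {
    static long count(int N) {
        long cnt = 0;
        for (int i = 1; i < N; i *= 2) {
            for (int j = 1; j < i; j *= 2) {
                cnt += 1;
            }
        }
        return cnt;
    }

    public static void main(String[] args) {
        // For N = 2^m: when i = 2^t the inner loop runs t times, so
        // cnt = 0 + 1 + ... + (m - 1) = m(m - 1)/2, i.e. Theta(log^2 N).
        int m = 10;
        if (count(1 << m) != (long) m * (m - 1) / 2) throw new AssertionError();
    }
}
```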

Troubles with Big O Estimate

I'm asked to give big-O estimates for some pieces of code, but I'm having a bit of trouble.
int sum = 0;
for (int i = 0; i < n; i = i + 2) {
    for (int j = 0; j < 10; j++)
        sum = sum + i + j;
}
I'm thinking the worst case would be O(n/2), because the outer for loop runs from 0 to the array length n in steps of 2. However, I'm not sure whether the inner loop affects the Big O.
int sum = 0;
for (int i = n; i > n/2; i--) {
    for (int j = 0; j < n; j++)
        sum = sum + i + j;
}
For this one, I'm thinking it would be O(n^2/2), because the inner loop runs from 0 to n and the outer loop from n down to n/2, which gives me n * (n/2).
int sum = 0;
for (int i = n; i > n - 2; i--) {
    for (int j = 0; j < n; j += 5)
        sum = sum + i + j;
}
I'm pretty lost on this one. My guess is O(n^2-2/5)
Your running times for the first two examples are correct.
For the first example, the inner loop of course always executes 10 times. So we can say the total running time is O(10n/2).
For the last example, the outer loop only executes twice, and the inner loop n/5 times, so the total running time is O(2n/5).
Note that, because of the way big-O complexity is defined, constant factors and asymptotically smaller terms are negligible, so your complexities can / should be simplified to:
O(n)
O(n^2)
O(n)
If you were to take into account constant factors (using something other than big-O notation of course - perhaps ~-notation), you may have to be explicit about what constitutes a unit of work - perhaps sum = sum + i + j constitutes 2 units of work instead of just 1, since there are 2 addition operations.
You're NOT running nested loops:
for (int i = 0; i < n; i = i + 2);
^----
That semicolon is TERMINATING the loop definition, so the i loop is just counting from 0 -> n, in steps of 2, without doing anything. The j loop is completely independent of the i loop - both are simply dependent on n for their execution time.
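Here is a minimal sketch of the pitfall that answer describes (the class and method names are purely illustrative): one version is properly nested, the other has the stray semicolon, and the iteration counts differ accordingly.

```java
// Demonstrate how a stray semicolon un-nests a pair of loops.
class SemicolonPitfall {
    // Properly nested: the inner body runs 10 times per outer iteration.
    static int nestedCount(int n) {
        int count = 0;
        for (int i = 0; i < n; i = i + 2) {
            for (int j = 0; j < 10; j++)
                count++;
        }
        return count;
    }

    // With a stray semicolon, the i loop has an empty body, so the
    // j loop runs just once on its own, independent of n.
    static int strayCount(int n) {
        int count = 0;
        for (int i = 0; i < n; i = i + 2) ; // <- the semicolon ends the loop here
        for (int j = 0; j < 10; j++)
            count++;
        return count;
    }

    public static void main(String[] args) {
        if (nestedCount(10) != 50) throw new AssertionError();
        if (strayCount(10) != 10) throw new AssertionError();
    }
}
```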
For the above algorithms, the worst case and best case are the same.
With Big O notation, lower-order terms and the coefficient of the highest-order term can be ignored, since Big O describes an asymptotic upper bound.
int sum = 0;
for (int i = 0; i < n; i = i + 2) {
    for (int j = 0; j < 10; j++)
        sum = sum + i + j;
}
The total number of outer loop iterations is n/2. For each iteration of the outer loop, the inner loop runs 10 times, so the total number of inner loop iterations is 10 * n/2 = 5n. So clearly it is O(n).
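The exact counts for all three snippets can be checked by instrumenting them; here is a hypothetical driver (names are illustrative) that counts the inner-body executions of each:

```java
// Count inner-body executions for the three snippets in the question.
class LoopCounts {
    // Snippet 1: outer steps by 2, inner fixed at 10  ->  10 * ceil(n/2) = O(n)
    static long snippet1(int n) {
        long c = 0;
        for (int i = 0; i < n; i = i + 2)
            for (int j = 0; j < 10; j++)
                c++;
        return c;
    }

    // Snippet 2: outer runs n - n/2 times, inner n times  ->  O(n^2)
    static long snippet2(int n) {
        long c = 0;
        for (int i = n; i > n / 2; i--)
            for (int j = 0; j < n; j++)
                c++;
        return c;
    }

    // Snippet 3: outer runs exactly 2 times, inner ceil(n/5) times  ->  O(n)
    static long snippet3(int n) {
        long c = 0;
        for (int i = n; i > n - 2; i--)
            for (int j = 0; j < n; j += 5)
                c++;
        return c;
    }

    public static void main(String[] args) {
        int n = 100;
        if (snippet1(n) != 10L * ((n + 1) / 2)) throw new AssertionError();
        if (snippet2(n) != (long) (n - n / 2) * n) throw new AssertionError();
        if (snippet3(n) != 2L * ((n + 4) / 5)) throw new AssertionError();
    }
}
```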
Now think about the remaining two programs and determine their time complexities on your own.

worst case running time calculation

It's from homework, but I'm asking for a general method.
Calculate the following code's worst case running time.
int sum = 0;
for (int i = 0; i*i < N; i++)
    for (int j = 0; j < i*i; j++)
        sum++;
The answer is N^(3/2); could anyone help me through this?
Is there a general way to calculate this?
This is what I thought:
when i = 0, sum++ will be called 0 time
when i = 1, sum++ will be called 1 time
when i = 2, sum++ will be called 4 times
...
in general, for a given i, sum++ will be called i^2 times
so the worst time will be
0 + 1 + 4 + 9 + 16 + ... + i^2
but what next?? I'm lost here...
You want to count how many times the innermost cycle will run.
The outer one will run from i = 0, to i = sqrt(N) (since i*i < N).
For each iteration of the outer one the inner one will run i^2 times.
Thus the total number of times the inner one will run is:
1^2 + 2^2 + 3^2 + ... + sqrt(N)^2
There is a formula:
1^2 + 2^2 + ... + k^2 = k(k+1)(2k+1) / 6 = O(k^3).
In your case k = sqrt(N).
Thus the total complexity is O(sqrt(N)^3) = O(N^(3/2)).
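The sum-of-squares formula can be checked against the code directly; this sketch (class name is illustrative) counts the sum++ calls and compares them with k(k+1)(2k+1)/6:

```java
// Verify that the snippet's iteration count matches the sum-of-squares formula.
class SumOfSquares {
    // Counts how many times sum++ runs in the snippet from the question.
    static long countSumPlusPlus(int N) {
        long sum = 0;
        for (int i = 0; i * i < N; i++)
            for (int j = 0; j < i * i; j++)
                sum++;
        return sum;
    }

    public static void main(String[] args) {
        // With k the largest integer satisfying k*k < N, the count is
        // 1^2 + 2^2 + ... + k^2 = k(k+1)(2k+1)/6, which is Theta(N^(3/2)).
        int N = 10000;
        long k = 99; // largest k with k*k < 10000
        long formula = k * (k + 1) * (2 * k + 1) / 6;
        if (countSumPlusPlus(N) != formula) throw new AssertionError();
    }
}
```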
Your algorithm can be converted to the following shape:
int sum = 0;
for (int i = 0; i < Math.sqrt(N); i++)
    for (int j = 0; j < i*i; j++)
        sum++;
From this form you can bound it directly: the outer loop runs about N^(1/2) times, and the inner loop runs i*i < N times on each pass, so the total is at most
N^(1/2) * N = N^(3/2)
You are approaching this problem in the wrong way. To count the worst time, you need to find the maximum number of operations that will be performed. Because you have only a single operation in a double loop, it is enough to find out how many times the inner loop will execute.
You can do this by examining the limits of your loops. For the outer loop it is:
i^2 < N => i < sqrt(N)
The limit for your inner loop is
j < i^2
You can substitute the first bound into the second equation to get j < N.
Because these are nested loops you multiply their limits to get the final result:
sqrt(N) * N = N^(3/2)

Running time of for loop - part #2

This would be part # 2 of my question about analysis of for loop running time
http://faculty.simpson.edu/lydia.sinapova/www/cmsc250/LN250_Weiss/L03-BigOhSolutions.htm#PR4 contains solutions, and I have question about two particular "for" loops
Could someone explain to me how to figure out the running time for both of them? Thanks!
1.
sum = 0;
for (i = 0; i < n; i++)
    for (j = 0; j < i*i; j++)
        for (k = 0; k < j; k++)
            sum++;
2.
sum = 0;
for (i = 0; i < n; i++)
    for (j = 0; j < i*i; j++)
        if (j % i == 0)
            for (k = 0; k < j; k++)
                sum++;
The first snippet is O(n^5).
Top loop: i goes from 0 to n, so O(n) iterations
Middle loop: j goes from 0 to i^2, so O(n^2) iterations
Inner loop: k goes from 0 to j, so O(n^2) iterations
Total = O(n) * O(n^2) * O(n^2) = O(n^5)
Here's the closed-form solution of the first snippet: (computed via Mathematica)
sum = -(1/10)*n + (1/4)*n^2 - (1/4)*n^4 + (1/10)*n^5
This is a 5th order polynomial, therefore it is: O(n^5)
The second snippet appears to be O(n^4).
Top loop: i goes from 0 to n, so O(n) iterations
Middle loop: j goes from 0 to i^2, so O(n^2) iterations
If statement: passes for a 1/i = O(1/n) fraction of the iterations
Inner loop: k goes from 0 to j, so O(n^2) iterations
Total = O(n) * O(n^2) * O(1/n) * O(n^2) = O(n^4)
Here's the closed-form solution of the second snippet: (computed via Mathematica)
sum = -(1/12)*n + (3/8)*n^2 - (5/12)*n^3 + (1/8)*n^4
This is a 4th order polynomial, therefore it is: O(n^4)
Further explanation of the effect of the if-statement:
The middle loop iterates j from 0 to i*i. The if-statement checks whether j is divisible by i, i.e. whether j is a multiple of i.
How many times is j a multiple of i if 0 <= j < i*i? Exactly i times. Therefore only 1/i of the iterations of the middle loop will fall through to the inner-most loop.
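Both closed-form polynomials can be verified against the code; this illustrative check (class and method names assumed) compares the counted sum with each formula over a common denominator:

```java
// Compare the two snippets' sum++ counts with their closed-form polynomials.
class ClosedForms {
    static long snippet1(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i * i; j++)
                for (int k = 0; k < j; k++)
                    sum++;
        return sum;
    }

    static long snippet2(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i * i; j++)
                if (j % i == 0) // safe: when i == 0 the j loop never runs
                    for (int k = 0; k < j; k++)
                        sum++;
        return sum;
    }

    // -(1/10)n + (1/4)n^2 - (1/4)n^4 + (1/10)n^5, over a common denominator of 60
    static long formula1(long n) {
        return (-6 * n + 15 * n * n - 15 * n * n * n * n + 6 * n * n * n * n * n) / 60;
    }

    // -(1/12)n + (3/8)n^2 - (5/12)n^3 + (1/8)n^4, over a common denominator of 24
    static long formula2(long n) {
        return (-2 * n + 9 * n * n - 10 * n * n * n + 3 * n * n * n * n) / 24;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 30; n++) {
            if (snippet1(n) != formula1(n)) throw new AssertionError("snippet1, n = " + n);
            if (snippet2(n) != formula2(n)) throw new AssertionError("snippet2, n = " + n);
        }
    }
}
```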
The relationship of n and the other variables in the second part of the for statement (..., x <= n, ...) is really what defines how fast it runs. Try to visualize a for loop as a race, where the second part of the statement says how many laps you make. So for example, if the variable n = 1000, then you have to run the same lap 1000 times, which is truly time-wasting. Hope that gives you a better view on things.
