What is the Big O analysis of this algorithm?

I'm working on a data structures course and I'm not sure how to proceed with this Big O analysis:
sum = 0;
for (i = 1; i < n; i++)
    for (j = 1; j < i*i; j++)
        if (j % i == 0)
            for (k = 0; k < j; k++)
                sum++;
My initial idea is that this is O(n^3) after reduction, because the innermost loop will only run when j/i has no remainder and the multiplication rule is inapplicable. Is my reasoning correct here?

Let's ignore the outer loop for a second and analyze the work in terms of i.
The middle loop runs about i^2 times (j goes from 1 up to i^2 - 1) and invokes the inner loop whenever j % i == 0, that is, for j = i, 2i, 3i, ..., (i-1)*i. Each of those times the inner loop runs j steps, so the inner loop's total running time, summed over the middle loop, is:
i + 2i + 3i + ... + (i-1)*i = i(1 + 2 + ... + (i-1)) = i * [i*(i-1)/2]
The last equality comes from the sum of an arithmetic progression.
The above is in O(i^3).
Now repeat this for the outer loop, which runs i from 1 to n, and you get a running time of O(n^4), since you actually have:
C*1^3 + C*2^3 + ... + C*(n-1)^3 = C*(1^3 + 2^3 + ... + (n-1)^3) = C/4 * (n^4 - 2n^3 + n^2)
The last equality comes from the formula for the sum of cubes.
And the above is in O(n^4), which is your complexity.
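If you want to sanity-check the derivation empirically, here is a small brute-force counter, written in Java as a sketch (the class name and the test sizes are my own choices, not part of the original question). It runs the original loops, compares the number of sum++ executions against the per-i count i*i*(i-1)/2 summed over i, and shows that the count divided by n^4 settles toward a constant:
public class BigOCheck {
    public static void main(String[] args) {
        for (int n : new int[]{20, 40, 80}) {       // illustrative test sizes
            long brute = 0;
            for (int i = 1; i < n; i++)
                for (int j = 1; j < i * i; j++)
                    if (j % i == 0)
                        for (int k = 0; k < j; k++)
                            brute++;                // same work as sum++ in the original

            long formula = 0;                       // per-i count from the answer: i*i*(i-1)/2
            for (int i = 1; i < n; i++)
                formula += (long) i * i * (i - 1) / 2;

            // brute equals formula exactly, and brute / n^4 settles toward a constant (~1/8).
            System.out.printf("n=%d brute=%d formula=%d brute/n^4=%.4f%n",
                              n, brute, formula, brute / Math.pow(n, 4));
        }
    }
}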

Related

What's the big O for this triple nested loop?

Outer loop is O(n), 2nd loop is O(n^2) and 3rd loop is also O(n^2), but the 3rd loop is conditional.
Does that mean the 3rd loop only happens 1/n (1 every n) times and therefore total big O is O(n^4)?
for (int i = 1; i < n; i++) {
    for (int j = 1; j < (n*n); j++) {
        if (j % i == 0) {
            for (int k = 1; k < (n*n); k++) {
                // Simple computation
            }
        }
    }
}
For any given value of i between 1 and n, the complexity of this part:
for (int j = 1; j < (n*n); j++) {
    if (j % i == 0) {
        for (int k = 1; k < (n*n); k++) {
            // Simple computation
        }
    }
}
is O(n^4/i), because the if-condition is true one ith of the time. (Note: if i could be larger than n, then we'd need to write O(n^4/i + n^2) to include the cost of the loop iterations where the if-condition was false; but since i is known to be small enough that n^4/i ≥ n^2, we don't need to worry about that.)
So the total complexity of your code, adding together the different loop iterations across all values of i, is O(n^4/1 + n^4/2 + n^4/3 + ... + n^4/n) = O(n^4 * (1/1 + 1/2 + 1/3 + ... + 1/n)) = O(n^4 log n).
(That last bit relies on the fact that, since ln(n) is the integral of 1/x from 1 to n, and 1/x is decreasing over that interval, we have ln(n) < ln(n+1) < (1/1 + 1/2 + 1/3 + ... + 1/n) < 1 + ln(n).)
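As a rough empirical check of the harmonic-series argument, here is a sketch in Java (the test sizes are my own, and I add n*n - 1 per hit instead of actually spinning the k-loop, which is valid because the k-loop's trip count does not depend on j). The printed ratio work / (n^4 ln n) should hover near a constant:
public class HarmonicCheck {
    public static void main(String[] args) {
        for (int n : new int[]{20, 40, 80}) {       // illustrative test sizes
            long work = 0;
            for (int i = 1; i < n; i++) {
                for (int j = 1; j < n * n; j++) {
                    if (j % i == 0) {
                        // The k-loop always runs n*n - 1 times, independent of j,
                        // so count that directly instead of iterating it.
                        work += (long) n * n - 1;
                    }
                }
            }
            // The answer predicts work ~ C * n^4 * ln(n) for some constant C.
            System.out.printf("n=%d work=%d work/(n^4 ln n)=%.4f%n",
                              n, work, work / (Math.pow(n, 4) * Math.log(n)));
        }
    }
}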

Big O complexity on dependent nested loops

Can I get some help in understanding how to solve this tutorial question? I still do not understand my professor's explanation. I am unsure of how to count the big O for the third/innermost loop. She explains that the answer for this algorithm is O(n^2) and that the 2nd and third loop have to be seen as one loop with a big O of O(n). Can someone please explain the big O notation for the 2nd/third loop in basic layman's terms?
Assuming n = 2^m
for (int i = n; i > 0; i--) {
    for (int j = 1; j < n; j *= 2) {
        for (int k = 0; k < j; k++) {
        }
    }
}
As far as I understand, the first loop has a big O notation of O(n)
Second loop = log(n)
Third loop = log(n) (since the number of times it will be looped has been reduced by log n) * 2^(2^m - 1) (to represent the increase in j?)
Let's add a print statement to the innermost loop.
for (int j = 1; j < n; j *= 2) {
    for (int k = 0; k < j; k++) {
        print(1)
    }
}
Output (the inner loop prints exactly j ones, since k runs from 0 to j-1):
j = 1,   1
j = 2,   1 1
j = 4,   1 1 1 1
...
j = n/2, 1 1 1 ... (n/2 ones)
(j never reaches n, because the loop condition is j < n.)
The question boils down to how many 1s this will print.
That number is
2^0 + 2^1 + 2^2 + ... + 2^(m-1) = 1 + 2 + 4 + ... + n/2 = n - 1
= O(n),
assuming you know why a geometric series like 1 + 2 + 4 + ... + n/2 is O(n) (it is always less than twice its largest term).
O-notation is an upper bound, so you can certainly say the algorithm is O(n^2). In this case O(n^2) is also the tight bound: the two inner loops together do Θ(n) work, and the outer loop repeats that n times.
The second and third loops count as O(n) together because of the doubling: the inner loop runs 1, 2, 4, ..., n/2 times as j doubles, and that geometric series sums to about n. That is why your teacher says to view the second and third loop as O(n) together.
Note that although the second loop runs O(log(n)) times, the third loop's cost is j itself (not the loop index), so the pair costs 1 + 2 + 4 + ... + n/2 = n - 1, which is O(n) rather than O(log(n)^2).
for (int i = n; i > 0; i--) {           // This runs n times
    for (int j = 1; j < n; j *= 2) {    // This runs at most log(n) times, i.e. m times
        for (int k = 0; k < j; k++) {   // This runs j times, and j grows as large as n/2
        }
    }
}
For a single value of i, the two inner loops together perform 1 + 2 + 4 + ... + n/2 = n - 1 steps, so the overall cost is the outer loop's n iterations times that, not the product of three independent bounds.
Hence O(n^2) is an upper bound, and it is in fact tight: the running time is Θ(n^2).
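To see the Θ(n) behaviour of the two inner loops concretely, here is a small counting sketch in Java (the class name and the power-of-two test sizes are just illustrative). It counts the work of the j/k pair for a single pass and the work of all three loops together, and checks them against n - 1 and n*(n - 1):
public class GeometricCheck {
    public static void main(String[] args) {
        for (int n : new int[]{16, 256, 4096}) {   // illustrative test sizes, n = 2^m
            long innerPair = 0;                    // loops 2 and 3, for one value of i
            for (int j = 1; j < n; j *= 2)
                for (int k = 0; k < j; k++)
                    innerPair++;

            long total = 0;                        // all three loops
            for (int i = n; i > 0; i--)
                for (int j = 1; j < n; j *= 2)
                    for (int k = 0; k < j; k++)
                        total++;

            // innerPair == n - 1 (geometric series), so total == n * (n - 1), i.e. Theta(n^2).
            System.out.printf("n=%d innerPair=%d total=%d total/n^2=%.4f%n",
                              n, innerPair, total, total / ((double) n * n));
        }
    }
}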

Calculating the complexity of an algorithm with 3 loops

I tried to solve the following exercise :
What is the order of growth of the worst case running time of the following code fragment
as a function of N?
int sum = 0;
for (int i = 1; i <= N; i++)
    for (int j = 1; j <= i*i; j++)
        for (int k = 1; k <= j*j; k++)
            sum++;
and I found that the complexity is O(N^4); however, the correct answer is:
The answer is: N^7
For a given value of i, the body of the innermost loop is executed 1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ (1/3) i^6 times. Summing up over all values of i yields ~ (1/21) N^7.
I would like some help to understand this answer and the correct way to calculate complexity in this case.
EDIT: In particular, I don't understand this statement:
1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ (1/3) i^6
Because to me, 1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ i^4
EDIT:
I'll add a bit of explanation to clear up your confusion about the quote in your question. Let's consider a fixed value of i and focus on the innermost two loops:
for (int j = 1; j <= i*i; j++)
    for (int k = 1; k <= j*j; k++)
        sum++;
How many times is the j-loop iterated? The answer is i^2 times. On each of those iterations, the k-loop is iterated j^2 times, which is different for each outer iteration because the value of j increases from 1 all the way to i^2.
When j = 1, the k-loop iterates 1^2 times. When j = 2, the k-loop iterates 2^2 times. When j = 3, 3^2 times. Tallying up the total number of iterations of the k-loop over all values of j, you have 1^2 + 2^2 + 3^2 + ... + (i^2)^2, since j runs between 1 and i^2. Hopefully that clarifies how you arrive at the following statement:
For a given value of i, the body of the innermost loop is executed 1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ (1/3) i^6 times.
The total number of iterations can be expressed in sum form. The innermost loop has exactly j^2 iterations for each (varying) value of j, the middle loop has i^2 iterations for each value of i, and the outermost loop has N iterations. More neatly, the exact number of iterations is:
sum over i = 1..N of [ sum over j = 1..i^2 of j^2 ] = (1/420)*N*(N+1)*(2N+1)*(10N^4 + 20N^3 + 21N^2 + 11N + 8)
Multiplying through, you'll find this is a 7th order polynomial in N, so it is apparent why this is O(N^7).
In case you doubt the answer above is correct, simply run your own code and compare the value of sum you get with the formula provided above:
var sum = 0;
var N = 10;
for (var i = 1; i <= N; i++)
    for (var j = 1; j <= i*i; j++)
        for (var k = 1; k <= j*j; k++)
            sum++;
function T(N) { return (1/420)*N*(1+N)*(1+2*N)*(8+11*N+21*N*N+20*N*N*N+10*N*N*N*N); }
console.log(sum === T(N));
Here's a demo: http://jsfiddle.net/wby9deax/. No matter what value of N you put in, the answer will be correct. (Note: be careful with large values of N; it will probably freeze up your browser, since the number of iterations grows very rapidly.)
int sum = 0;
for (int i = 1; i <= N; i++)               // O(N^1)
    for (int j = 1; j <= i*i; j++)         // O(N^2)
        for (int k = 1; k <= j*j; k++)     // O(N^4)
            sum++;
Since they're nested (and each bound, in the worst case, is a plain power of N), you get O(N^1 × N^2 × N^4) = O(N^(1+2+4)) = O(N^7).
EDIT: In particular, I don't understand this statement:
1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ (1/3) i^6
Keep in mind that there are many terms hiding in the "..." part (i^2 of them, not just a handful), which is why the sum comes out much larger than i^4.
Because it will be:
N^1 iterations - the first for
N^2 iterations - the second for
N^4 iterations - the third for
and N^1 * N^2 * N^4 = N^7
I think it is a good idea to substitute the variables (i, j and k) with their maximum values in terms of N.
for (int i = 1; i <= N; i++)             // <- i = N
    for (int j = 1; j <= i*i; j++)       // <- i*i = N*N
        for (int k = 1; k <= j*j; k++)   // <- j*j = (i*i)*(i*i) = N*N*N*N
In the first loop the number of iterations will be N; that's the simple part.
In the second loop the number of iterations will be N*N, and in the third N*N*N*N.
So finally, the number of iterations will be N (first loop), times N*N (second), times N*N*N*N (third): N*(N*N)*(N*N*N*N) = N^7.

Troubles with Big O Estimate

I'm asked to give big-O estimates for some pieces of code, but I'm having a bit of trouble.
int sum = 0;
for (int i = 0; i < n; i = i + 2) {
    for (int j = 0; j < 10; j++)
        sum = sum + i + j;
}
I'm thinking that the worst case would be O(n/2), because the outer for loop runs from 0 up to the array length n in steps of 2. However, I'm not sure if the inner loop affects the Big O.
int sum = 0;
for (int i = n; i > n/2; i--) {
    for (int j = 0; j < n; j++)
        sum = sum + i + j;
}
For this one, I'm thinking it would be O(n^2/2), because the inner loop runs from 0 to n and the outer loop from n down to n/2, which gives me n*(n/2).
int sum = 0;
for (int i = n; i > n - 2; i--) {
    for (int j = 0; j < n; j += 5)
        sum = sum + i + j;
}
I'm pretty lost on this one. My guess is O(n^2-2/5)
Your running times for the first two examples are correct.
For the first example, the inner loop of course always executes 10 times. So we can say the total running time is O(10n/2).
For the last example, the outer loop only executes twice, and the inner loop n/5 times, so the total running time is O(2n/5).
Note that, because of the way big-O complexity is defined, constant factors and asymptotically smaller terms are negligible, so your complexities can / should be simplified to:
O(n)
O(n^2)
O(n)
If you were to take into account constant factors (using something other than big-O notation of course - perhaps ~-notation), you may have to be explicit about what constitutes a unit of work - perhaps sum = sum + i + j constitutes 2 units of work instead of just 1, since there are 2 addition operations.
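If it helps to see why the constant factors vanish, here is a small counting sketch in Java (the doubling-ratio check and the test sizes are my own additions, not part of the original answer). It counts the iterations of the first snippet at n and at 2n; for a linear algorithm the ratio comes out to about 2, whatever the constant factor is:
public class ConstantFactors {
    static long firstSnippet(int n) {
        long count = 0;
        for (int i = 0; i < n; i = i + 2)
            for (int j = 0; j < 10; j++)
                count++;                 // 10 units of inner work per outer step
        return count;                    // = 10 * ceil(n/2), i.e. about 5n
    }

    public static void main(String[] args) {
        for (int n : new int[]{1000, 2000, 4000}) {   // illustrative test sizes
            long c = firstSnippet(n);
            long c2 = firstSnippet(2 * n);
            // For a linear algorithm the ratio is about 2, regardless of the constant factor.
            System.out.printf("n=%d count=%d count(2n)/count(n)=%.2f%n",
                              n, c, (double) c2 / c);
        }
    }
}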
You're NOT running nested loops:
for (int i = 0; i < n; i = i + 2);
^----
That semicolon is TERMINATING the loop definition, so the i loop is just counting from 0 -> n, in steps of 2, without doing anything. The j loop is completely independent of the i loop - both are simply dependent on n for their execution time.
For the above algorithms, the worst case and best case are the same.
In Big O notation, lower-order terms and the coefficient of the highest-order term can be ignored, since Big O describes an asymptotic upper bound.
int sum = 0;
for (int i = 0; i < n; i = i + 2) {
    for (int j = 0; j < 10; j++)
        sum = sum + i + j;
}
Total number of outer loop iterations = n/2. For each iteration of the outer loop, the number of inner loop iterations = 10. So the total number of inner loop iterations = 10 * n/2 = 5n, and clearly this is O(n).
Now think about the remaining two programs and determine their time complexities on your own.

worst case running time calculation

It's from homework, but I'm asking for a general method.
Calculate the following code's worst case running time.
int sum = 0;
for (int i = 0; i*i < N; i++)
    for (int j = 0; j < i*i; j++)
        sum++;
The answer is N^(3/2); could anyone help me through this?
Is there a general way to calculate this?
This is what I thought:
when i = 0, sum++ will be called 0 times
when i = 1, sum++ will be called 1 time
when i = 2, sum++ will be called 4 times
...
when i = i, sum++ will be called i^2 times
so the worst time will be
0 + 1 + 4 + 9 + 16 + ... + i^2
but what next?? I'm lost here...
You want to count how many times the innermost cycle will run.
The outer one will run from i = 0 up to about i = sqrt(N) (since i*i < N).
For each iteration of the outer one the inner one will run i^2 times.
Thus the total number of times the inner one will run is:
1^2 + 2^2 + 3^2 + ... + sqrt(N)^2
There is a formula:
1^2 + 2^2 + ... + k^2 = k(k+1)(2k+1) / 6 = O(k^3).
In your case k = sqrt(N).
Thus the total complexity is O(sqrt(N)^3) = O(N^(3/2)).
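If you want to verify this numerically, here is a small Java sketch (the test values of N and the use of k = the largest i with i*i < N are my own framing, not part of the answer). It runs the fragment, compares sum with the closed form k(k+1)(2k+1)/6, and shows sum / N^(3/2) approaching a constant (about 1/3):
public class SqrtCheck {
    public static void main(String[] args) {
        for (int N : new int[]{1000, 10000, 100000}) {   // illustrative test sizes
            long sum = 0;
            for (int i = 0; i * i < N; i++)
                for (int j = 0; j < i * i; j++)
                    sum++;

            // k is the largest i for which the outer loop still runs (k*k < N).
            long k = (long) Math.ceil(Math.sqrt(N)) - 1;
            long formula = k * (k + 1) * (2 * k + 1) / 6;   // 1^2 + 2^2 + ... + k^2

            // sum equals the formula, and sum / N^(3/2) tends to a constant (about 1/3).
            System.out.printf("N=%d sum=%d formula=%d sum/N^1.5=%.4f%n",
                              N, sum, formula, sum / Math.pow(N, 1.5));
        }
    }
}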
Your algorithm can be converted to the following shape:
int sum = 0;
for (int i = 0; i < Math.sqrt(N); i++)
    for (int j = 0; j < i*i; j++)
        sum++;
Therefore, we may straightforwardly rewrite the total count as a sum: for each i from 0 up to roughly sqrt(N), the inner loop contributes i^2 iterations, so the total is 0^2 + 1^2 + 2^2 + ... + (about sqrt(N))^2.
Then just bound this sum:
it has about N^(1/2) terms, each at most N, so it is at most N^(1/2) * N = N^(3/2), i.e. O(N^(3/2)).
You are approaching this problem in the wrong way. To count the worst time, you need to find the maximum number of operations that will be performed. Because you have only a single operation in a double loop, it is enough to find out how many times the inner loop will execute.
You can do this by examining the limits of your loops. For the outer loop it is:
i^2 < N => i < sqrt(N)
The limit for your inner loop is
j < i^2
You can substitute into the second equation to get j < N.
Because these are nested loops you multiply their limits to get the final result:
sqrt(N)*N = N^(3/2)
