Asymptotic analysis of three interdependent nested for loops

The code fragment I am to analyse is below:
int sum = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < i * i; j++)
        for (int k = 0; k < j; k++)
            sum++;
I know that the first loop is O(n) but that's about as far as I've gotten. I think that the second loop may be O(n^2) but the more I think about it the less sense it makes. Any guidance would be much appreciated.

The first loop executes n times, with i growing on each iteration. For each such i, the second loop executes i*i times. That means the second loop body executes roughly 1*1 + 2*2 + 3*3 + ... + n*n times in total.
This is a sum of squares, and the formula for it is well known. Hence the second loop body executes about n(n + 1)(2n + 1)/6 times.
Thus, we know that in total there will be about n(n + 1)(2n + 1)/6 values of j, and that for each such value of j the third loop will execute exactly j times. Calculating exactly how many times the third loop executes in total is fiddly. Luckily, all you really need is an upper bound of the right order.
You know that every value of j is less than i*i <= n^2, so for each of the n(n + 1)(2n + 1)/6 values of j, the third loop executes fewer than n^2 times. Therefore the operation sum++ executes fewer than [n(n + 1)(2n + 1)/6] * n^2 times. After some quick mental math, that amounts to a polynomial of maximal degree 5, therefore your program is O(n^5).
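Written as a nested summation, that bounding argument looks like this (my own write-up of the same reasoning, not part of the original answer):

$$\sum_{i=0}^{n-1}\sum_{j=0}^{i^2-1}\sum_{k=0}^{j-1} 1 \;=\; \sum_{i=0}^{n-1}\sum_{j=0}^{i^2-1} j \;\le\; n^2 \sum_{i=0}^{n-1} i^2 \;=\; n^2 \cdot \frac{(n-1)n(2n-1)}{6} \;=\; O(n^5).$$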

int sum = 0;
for (int i = 0; i < n; i++)           // Let's call this N
    for (int j = 0; j < i * i; j++)   // Let's call this M
        for (int k = 0; k < j; k++)   // Let's call this K
            sum++;
N is the number of steps of the entire program, M is the number of steps the two inner loops take for a fixed i, and K is the number of steps the innermost loop takes for a fixed j.
It is easy to see that K = j: the innermost loop takes j steps.
Then M = Sum(j=0, i^2-1, K) = Sum(j=0, i^2-1, j)
(The first parameter is the iterator, the second is its last value, and the last parameter is what we are adding. The upper limit is i^2-1 because the loop condition is j < i*i.)
This is a sum of the integers from 0 up to i^2-1, so we can apply the formula m*(m+1)/2 with m = i^2-1:
M = Sum(j=0, i^2-1, j) = ((i^2-1)*(i^2))/2 = (i^4 - i^2)/2
N = Sum(i=0, n-1, M) = 1/2 * ( Sum(i=0, n-1, i^4) - Sum(i=0, n-1, i^2) )
Both power sums have well-known closed forms, and after a little playing you get:
N = (n-2)(n-1)n(n+1)(2n-1)/20 = (n^5)/10 - (n^4)/4 + (n^2)/4 - n/10
This is the exact number of times sum++ runs, but if you are only interested in the O notation you can note that n^5 is the fastest-growing part, so the solution is O(n^5).
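As a quick sanity check of that closed form (my own sketch, not part of the answer; the helper names are arbitrary), a brute-force count can be compared against it directly:

#include <iostream>

// Brute-force count of how many times sum++ runs for a given n.
long long bruteForce(int n) {
    long long sum = 0;
    for (int i = 0; i < n; i++)
        for (long long j = 0; j < (long long)i * i; j++)
            for (long long k = 0; k < j; k++)
                sum++;
    return sum;
}

// The closed form derived above: (n-2)(n-1)n(n+1)(2n-1)/20.
long long closedForm(long long n) {
    return (n - 2) * (n - 1) * n * (n + 1) * (2 * n - 1) / 20;
}

int main() {
    // If the algebra above is right, the two columns agree for every n.
    for (int n = 2; n <= 60; n += 2)
        std::cout << n << " " << bruteForce(n) << " " << closedForm(n) << "\n";
    return 0;
}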

If you proceed methodically using Sigma notation, you'll end up with the same O(n^5) result.

Try to count how many times each loop body is executed.
The middle loop runs
0*0 times when i == 0,
1*1 times when i == 1,
2*2 times when i == 2,
...
(n-1)*(n-1), i.e. about n^2, times when i == n-1.
So the middle loop body runs O(n^2) times for a single value of i; summing that over all i, and then applying the same counting to the innermost loop, leads to the O(n^5) total.

Related

What is the time complexity of this code that has 3 nested loops but j stops when it's equal to i * 2?

for (int i = 1; i <= n; i++) {
    for (int j = i; j <= n; j++) {
        for (int k = j; k <= n; k++) {
            sum += a[i] * b[j] * c[k]; // O(1)
        }
        if (j == 2 * i) {
            j = n; // forces the middle loop to end after this iteration
        }
    }
}
I've spent many hours tracing the code and trying different n values. I realize that, for a fixed i, j doesn't run more than n/2 + 1 times. For n = 8, 9, and 12 I made a table listing, for each pair of i and j values, how many times the innermost loop runs.
So it turns into something like:
[n+(n-1)] + [(n-1)+(n-2)+(n-3)] + [(n-2)+(n-3)+(n-4)+(n-5)] + ...+ [3+2+1] + [2+1] + 1
I'm not sure how to turn this into an arithmetic progression or summation. Please help.
There's about n-j work in the inner loop.
For any particular value of j, there are approximately j/2 + 1 possible values of i. For example, j can take the value 10 when i is 5, 6, 7, 8, 9 or 10.
So the total amount of work is sum((j/2+1) * (n-j), j=1..n), which is n^3/12 + n^2/2 - 7n/12 = O(n^3).
I've been sloppy with rounding here, but it doesn't affect the complexity.
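To make the approximation concrete, here is a small counter (my own sketch, not from the answer above); it tallies the innermost O(1) statement and compares the total with n^3:

#include <iostream>

// Count how many times the innermost statement runs for a given n.
long long countWork(int n) {
    long long count = 0;
    for (int i = 1; i <= n; i++) {
        for (int j = i; j <= n; j++) {
            for (int k = j; k <= n; k++)
                count++;          // stands in for sum += a[i] * b[j] * c[k]
            if (j == 2 * i)
                j = n;            // middle loop ends after this iteration
        }
    }
    return count;
}

int main() {
    // The ratio count / n^3 drifts toward roughly 1/12, matching the
    // sum((j/2+1) * (n-j), j=1..n) estimate above.
    for (int n = 100; n <= 1000; n += 300) {
        double n3 = 1.0 * n * n * n;
        std::cout << n << " " << countWork(n) / n3 << "\n";
    }
    return 0;
}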

Could someone explain the time complexity for these code snippets?

I am practicing time complexity and some of them are a bit too complicated for me.
I would really appreciate of someone could explain these for me.
A) The time complexity is O(n). How is that?
for (int i = N; i > 0; i = i/2) {
    for (int j = i+i; j > 0; j--) {
        doSomething(i, j);
    }
}
B) The time complexity is O(n log n). How is that?
for (int i = N+N; i > 0; i--) {
    for (int j = N; j > 0; j = j/2) {
        doSomething(i, j);
    }
}
I suppose we must assume that the execution of doSomething takes constant time, independent of the values it gets as arguments.
Algorithm A:
On the first iteration of the outer loop, the inner loop iterates 2N times. On each subsequent iteration of the outer loop, the number of iterations of the inner loop is halved. So we get this series:
      2N + N + N/2 + N/4 + N/8 + ... + 2
Given that this series is finite and follows the halving pattern of 1/2 + 1/4 + 1/8 + 1/16 + ..., we can conclude that it sums to less than 4N, and so it is O(N).
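Written out as a finite geometric series (a sketch of the same bound, assuming for simplicity that N is a power of two):

$$2N + N + \frac{N}{2} + \cdots + 2 \;=\; \sum_{t=0}^{\log_2 N} \frac{2N}{2^t} \;=\; 2N \cdot \frac{1 - (1/2)^{\log_2 N + 1}}{1 - 1/2} \;=\; 4N - 2 \;<\; 4N.$$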
Algorithm B:
Here the number of iterations of the inner loop does not depend on the value of i, so it is always the same: each time it performs about log2(N) iterations (since j is halved each iteration). As the outer loop iterates 2N times, doSomething is called about 2N*log2(N) times, which is O(N log N).
problem A
Here the first loop will execute log2(n)+1 times, and for each value of i the second loop will execute i+i = 2i times. So look at the values i takes across the outer iterations.
For an initial value of n, i will be
n, n/2, n/4, n/8, n/16, ...
and twice the summation of this series is the answer.
As we know,
a + ar + ar^2 + ar^3 + ar^4 + ... + ar^m = a(1 - r^(m+1))/(1 - r)
Here a = n, r = 1/2 and m = log2(n), so
n + n/2 + n/4 + n/8 + ... + n/(2^m) = 2n - n/2^m = 2n - 1
Doubling it gives about 4n - 2 steps, so the complexity is O(n).
problem B
Here the first loop will execute 2n times. And for every iteration of the first loop, the second loop will be executed about log2(n)+1 times.
for (int j = n; j > 0; j = j/2)
For example, for n = 10 the value of j will be 10, 5, 2, 1 and then 0, so the loop body executes 4 times, which is floor(log2(10))+1 times.
So for every iteration of the first loop, the second loop executes about log2(n)+1 times, and the complexity is
O(n(log2(n)+1)) = O(n log n)
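If you want to convince yourself empirically, here is a quick instrumentation sketch (my own; the counters and the comparisons against N and N log N are not part of the answers above):

#include <cmath>
#include <iostream>

// Count calls to doSomething in snippet A for a given N.
long long callsA(long long N) {
    long long calls = 0;
    for (long long i = N; i > 0; i = i / 2)
        for (long long j = i + i; j > 0; j--)
            calls++;               // one doSomething(i, j) call
    return calls;
}

// Count calls to doSomething in snippet B for a given N.
long long callsB(long long N) {
    long long calls = 0;
    for (long long i = N + N; i > 0; i--)
        for (long long j = N; j > 0; j = j / 2)
            calls++;               // one doSomething(i, j) call
    return calls;
}

int main() {
    // callsA(N)/N stays below 4, and callsB(N)/(N*log2(N)) stays near 2,
    // matching the O(N) and O(N log N) analyses above.
    for (long long N : {1 << 10, 1 << 14, 1 << 18}) {
        std::cout << N << " "
                  << 1.0 * callsA(N) / N << " "
                  << 1.0 * callsB(N) / (N * std::log2((double)N)) << "\n";
    }
    return 0;
}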

What is the time complexity of this algorithm where the limit is changing inside the loop?

How do you calculate the time complexity or big-O of this algorithm, where it is not clear how many iterations the loop performs?
cin >> n;
int i = 0;
for (int j = 1; i <= n; j++) {
    i += j;
}
Appreciate that the series of values added to i looks like this:
1 + 2 + 3 + 4 + ... + m (where m is the number of iterations, not the n in your question)
The sum of this series is given by the Gaussian formula:
m * (m + 1) / 2
This means that after m iterations i has grown to about m^2 / 2. The loop stops as soon as i exceeds n, which happens after about sqrt(2n) iterations. Therefore the loop runs in:
O(sqrt(n))
where n here is the n from your loop code, i.e. the upper bound read from the input.
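A small counter makes the square-root behaviour visible (my own sketch; the sqrt(2n) comparison follows from the formula above):

#include <cmath>
#include <iostream>

// Count how many times the loop body runs for a given n.
long long iterations(long long n) {
    long long i = 0, steps = 0;
    for (long long j = 1; i <= n; j++) {
        i += j;
        steps++;
    }
    return steps;
}

int main() {
    // The ratio steps / sqrt(2n) approaches 1, matching the O(sqrt(n)) analysis.
    for (long long n : {1000LL, 1000000LL, 1000000000LL})
        std::cout << n << " " << iterations(n) / std::sqrt(2.0 * n) << "\n";
    return 0;
}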

Big O Notation for two code fragments

I have two fragments of code and an explanation of what Big O category they fall into. However, try as I might, I can't tally the explanation with what I come up with, either by looking at the code or by doing sample runs.
The first:
long count = 0;
long n = 1000;
long i, j, k;
for (i = 0; i < n; i++)
    for (j = 0; j < i * i; j++)
        for (k = 0; k < j; k++)
            count++;
Sample runs of this consistently give me N^4, but the answer I've been given is "j can be as large as i^2, which could be as large as N^2. k can be as large as j, which is N^2. The running time is thus proportional to N * N^2 * N^2, which is O(N^5)"
Second snippet:
long i, j, k;
long n = 1000;
long count = 0;
for (i = 1; i < n; i++)
    for (j = 1; j < i * i; j++)
        if (j % i == 0)
            for (k = 0; k < j; k++)
                count++;
For this the notes say "The if statement is executed at most N^3 times, by previous arguments, but it is true only O(N^2) times (because it is true exactly i times for each i). Thus the innermost loop is only executed O(N^2) times. Each time through, it takes O(j) = O(N^2) time, for a total of O(N^4)"
For this the notes seem to be accurate enough for the N^4 (although I keep getting a result of N^4 / 10). I don't follow the modulo condition only being true i times for each i, however; it seems to enter that loop a lot less often.
So the question is can anyone clarify what I'm not understanding?
For the first one:
sum from i = 0 to n-1 of
    sum from j = 0 to i*i-1 of
        sum from k = 0 to j-1 of
            1
We know the sum of 1 m times is equal to m, so we can reduce this to
sum from i = 0 to n-1 of
    sum from j = 0 to i*i-1 of
        j
We know the sum 1 + 2 + ... + m = m * (m + 1) / 2, so we can reduce further:
sum from i = 1 to n-1 of          (the i = 0 term is zero, so we can start at i = 1)
    (i * i - 1) * i * i / 2 = (1/2) * (i * i * i * i - i * i)
We can make this easier by taking the (1/2) outside the summation and then splitting up the i^4 and i^2 terms; the resulting power sums are less well known than the one for i alone, but they have closed forms too. The total does turn out to be Theta(n^5), hence O(n^5). To get an intuitive feeling for why, note that the difference f(n+1) - f(n) = (1/2)(n^4 - n^2), which is on the order of n^4; if f were a continuous function and this difference were its derivative, the order of f would be one higher.
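For reference, the two power sums have standard closed forms (the substitution below is my own write-up, not part of the original answer):

$$\sum_{i=1}^{m} i^2 = \frac{m(m+1)(2m+1)}{6}, \qquad \sum_{i=1}^{m} i^4 = \frac{m(m+1)(2m+1)(3m^2+3m-1)}{30},$$

so with m = n-1 the total becomes

$$\frac{1}{2}\sum_{i=1}^{n-1}\left(i^4 - i^2\right) = \frac{(n-2)(n-1)n(n+1)(2n-1)}{20} = \Theta(n^5).$$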
For the second case:
sum from i = 0 to n-1 of
    sum from j = 0 to i-1 of
        sum from k = 0 to i*j-1 of
            1
Note that j now assumes only i different values for the purposes of the innermost loop: 0, i, 2i, ..., (i-1)i. The inner loop runs for i times as many iterations as the counter value for j. We do this multiplication shifting to avoid introducing a "step" notation so we can use our usual mathematical results.
sum from i = 0 to n-1 of
    sum from j = 0 to i-1 of
        i*j
sum from i = 0 to n-1 of
    i * (1/2) * i * (i - 1) = (1/2) * (i * i * i - i * i)
Again, we can do the math in full or use the same intuition as before to (correctly) surmise that this turns out to be Theta(n^4).
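A quick empirical check of the second fragment (my own sketch; the n^4/8 comparison comes from summing (i^3 - i^2)/2 over i, which is how I read the derivation above):

#include <iostream>

// Count how many times count++ runs in the second fragment for a given n.
long long countSecond(long long n) {
    long long count = 0;
    for (long long i = 1; i < n; i++)
        for (long long j = 1; j < i * i; j++)
            if (j % i == 0)
                for (long long k = 0; k < j; k++)
                    count++;
    return count;
}

int main() {
    // The ratio count / n^4 settles near 1/8 = 0.125, i.e. Theta(n^4).
    for (long long n = 50; n <= 200; n += 50) {
        double n4 = 1.0 * n * n * n * n;
        std::cout << n << " " << countSecond(n) / n4 << "\n";
    }
    return 0;
}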

Algorithmic order of growth of the code

I am attending a course online and stuck at the following question. What is the order of growth of the worst case running time of the following code fragment as a function of N?
int sum = 0;
for (int i = 1; i <= N*N; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= j; k++)
            sum++;
I thought that it is of the order of N^4, but it seems this answer is wrong. Can you please explain?
It is of order O(N^6). You should note that it is not true that every loop simply adds an order of N to the complexity. Consider the following example:
int sum = 0;
for (int i = 1; i <= M; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= j; k++)
            sum++;
You should be able to figure out easily that it is of order O(M^3), so if you substitute M = N^2 you get the answer. The key point is that every inner loop is of order O(N^2) in this case, not O(N).
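Spelled out as a summation with the substitution M = N^2 (my own write-up of that argument):

$$\sum_{i=1}^{M}\sum_{j=1}^{i}\sum_{k=1}^{j} 1 \;=\; \sum_{i=1}^{M}\frac{i(i+1)}{2} \;=\; \Theta(M^3) \;=\; \Theta(N^6).$$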
Let's denote n = N^2. Then the innermost statement executes once for each triple (i, j, k) with k <= j <= i. This happens approximately n^3/6 times. Thus, the runtime is O(n^3) = O(N^6).
Explanation: Ignoring for a moment the cases where k == j, j == i or k == i,
each triple of distinct values can be ordered in 6 different ways:
(a1,a2,a3)
(a1,a3,a2)
(a2,a1,a3)
(a2,a3,a1)
(a3,a2,a1)
(a3,a1,a2)
Overall, there are n^3 ordered triples, and only one of the 6 orderings obeys k <= j <= i, which gives roughly n^3/6.
One run of the inner loop increments sum exactly j times.
One run of the middle loop invokes the inner loop exactly i times, with values of j between 1 and i (inclusive). So it increments sum exactly 1 + 2 + 3 + ... + i times, which is i(i+1)/2 by the well-known "triangular numbers" formula.
The outer loop invokes the middle loop exactly N^2 times (let us denote this M), with values of i between 1 and M (inclusive). So it increments sum exactly 1 + 3 + 6 + ... + M(M+1)/2 times, and this is M(M+1)(M+2)/6, by the not-so-well-known "tetrahedral numbers" formula (http://en.wikipedia.org/wiki/Tetrahedral_number).
All in all, the final value of sum is N^2(N^2+1)(N^2+2)/6.
Thinking in asymptotic terms, the inner loop is O(j), the middle one is O(i^2) (by summation) and the outer one is O(M^3) (by summation), i.e. O(N^6).
Also see Faulhaber's formula, which shows that the sum of i^k for i from 1 to N is O(N^(k+1)) (http://en.wikipedia.org/wiki/Faulhaber%27s_formula).
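A short check of that exact formula (my own sketch; the helper name is arbitrary):

#include <iostream>

// Count sum for a given N by running the loops directly.
long long direct(long long N) {
    long long sum = 0;
    for (long long i = 1; i <= N * N; i++)
        for (long long j = 1; j <= i; j++)
            for (long long k = 1; k <= j; k++)
                sum++;
    return sum;
}

int main() {
    // Compare the direct count with N^2 (N^2+1)(N^2+2)/6 for small N.
    for (long long N = 1; N <= 10; N++) {
        long long M = N * N;
        std::cout << N << " " << direct(N) << " "
                  << M * (M + 1) * (M + 2) / 6 << "\n";
    }
    return 0;
}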
Any given run of the innermost (k) loop has a time proportional to j, but we've got to do one of those for each of j = 1 through j = i, and that sum 1 + 2 + … + i grows like i^2. So for any given i we've got an O(i^2) running time, but of course we've got to deal with i = 1 through i = N^2. The sum of i^2 for i = 1 through N^2 happens to grow like N^6.
