I am attending an online course and am stuck on the following question. What is the order of growth of the worst-case running time of the following code fragment, as a function of N?
int sum = 0;
for (int i = 1; i <= N*N; i++)
for (int j = 1; j <= i; j++)
for (int k = 1; k <= j; k++)
sum++;
I thought that it is of the order of N^4, but it seems this answer is wrong. Can you please explain?
It is of order O(N^6). Note that it is not true that every loop simply adds a factor of N to the complexity. Consider the following example:
int sum = 0;
for (int i = 1; i <= M; i++)
for (int j = 1; j <= i; j++)
for (int k = 1; k <= j; k++)
sum++;
You should be able to figure out easily that this is of order O(M^3), so if you substitute M = N^2 you get the answer. The key point is that each of the inner loops is of order O(N^2) in this case, not O(N).
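If it helps to see this concretely, here is a small test harness of my own (the class and variable names are just illustrative, not from the course) that counts the iterations of the M-version above and compares the count with the closed form M(M+1)(M+2)/6:

public class TripleLoopCount {
    public static void main(String[] args) {
        for (int M : new int[] {10, 20, 40, 80}) {
            long count = 0;
            for (int i = 1; i <= M; i++)
                for (int j = 1; j <= i; j++)
                    for (int k = 1; k <= j; k++)
                        count++;                                  // same body as the fragment above
            long formula = (long) M * (M + 1) * (M + 2) / 6;      // tetrahedral number
            System.out.println("M=" + M + "  count=" + count + "  formula=" + formula);
        }
    }
}

Substituting M = N^2 then gives roughly (N^2)^3 / 6 = N^6 / 6 increments, i.e. order N^6.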
Let's denote n = N^2. Then the innermost statement executes once for each triple with 1 <= k <= j <= i <= n, which is approximately n^3/6 times. Thus, the runtime is O(n^3) = O(N^6).
Explanation: ignoring for a moment the cases where k == j, j == i, or k == i,
we keep exactly 1 ordering out of the 6 possible orderings of each set of 3 distinct values:
(a1,a2,a3)
(a1,a3,a2)
(a2,a1,a3)
(a2,a3,a1)
(a3,a2,a1)
(a3,a1,a2)
Overall, there are n^3 triples, and only one ordering out of every 6 satisfies k <= j <= i, which gives roughly n^3/6.
One run of the inner loop increments sum exactly j times.
One run of the middle loop invokes the inner loop exactly i times, with values of j between 1 and i (inclusive). So it increments sum exactly 1 + 2 + 3 + ... + i times, which is i(i+1)/2 by the well-known "triangular numbers" formula.
The outer loop invokes the middle loop exactly N^2 times (let us denote this as M), with values of i between 1 and M (inclusive). So it increments sum exactly 1 + 3 + 6 + ... + M(M+1)/2 times. Similarly, this is M(M+1)(M+2)/6, by the not-so-well-known "tetrahedral numbers" formula (http://en.wikipedia.org/wiki/Tetrahedral_number).
All in all, the final value of sum is N^2(N^2+1)(N^2+2)/6.
Thinking in asymptotic terms, the inner loop is O(j), the middle one O(i^2) (by summation) and the outer one O(M^3) (by summation), i.e. O(N^6).
Also see Faulhaber's formula, which shows that the sum 1^k + 2^k + ... + n^k is O(n^(k+1)) (http://en.wikipedia.org/wiki/Faulhaber%27s_formula).
Any given run of the innermost (k) loop has a time proportional to j, but we've got to do one of those for each of j = 1 through j = i, and that sum 1 + 2 + … + i grows like i^2. So for any given i we've got an O(i^2) running time, but of course we've got to deal with i = 1 through i = N^2. The sum of i^2 for i = 1 through N^2 happens to grow like N^6.
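As an empirical sanity check on the N^6 claim (a sketch of my own, not part of the course material), you can run the original fragment for a few values of N and watch how much sum grows each time N doubles; the ratio should approach 2^6 = 64:

public class GrowthCheck {
    public static void main(String[] args) {
        long previous = -1;
        for (int N : new int[] {4, 8, 16, 32}) {
            long sum = 0;
            for (long i = 1; i <= (long) N * N; i++)
                for (long j = 1; j <= i; j++)
                    for (long k = 1; k <= j; k++)
                        sum++;
            if (previous > 0)
                System.out.println("N=" + N + "  sum=" + sum + "  ratio=" + (double) sum / previous);
            else
                System.out.println("N=" + N + "  sum=" + sum);
            previous = sum;
        }
    }
}

The ratios creep up toward 64 as N grows, which is exactly the Θ(N^6) behaviour derived above.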
Related
I am having trouble with this. The inner loop depends on the outer loop, and from trying out values of n, the loop runs 1 + 2 + 4 + ... + sqrt(n) times. Any help would be greatly appreciated!
int sum = 0;
for (int k = 1; k*k <= n; k = k*2)
for (int j = 0; j < k; j++)
sum++;
If K is the largest power of 2 with K*K <= n, then your sum is 1 + 2 + 4 + 8 + ... + K = 2K - 1.
K is clearly less than or equal to sqrt(n), but it's also greater than sqrt(n)/4 (because if it weren't, then 2K*2K would still be less than or equal to n, contradicting the fact that K is the largest power of 2 with K*K <= n).
So sqrt(n)/4 < K <= sqrt(n), and your runtime (2K - 1) is between sqrt(n)/2 - 1 and 2*sqrt(n) - 1, and thus the complexity is Θ(sqrt(n)).
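If you want to see that bound numerically, here is a small sketch of my own (names are illustrative) that counts the work done by your loop and divides it by sqrt(n):

public class SqrtGrowthCheck {
    public static void main(String[] args) {
        for (int n : new int[] {100, 10_000, 1_000_000, 100_000_000}) {
            long sum = 0;
            for (long k = 1; k * k <= n; k = k * 2)    // k = 1, 2, 4, ..., K
                for (long j = 0; j < k; j++)
                    sum++;
            System.out.println("n=" + n + "  sum=" + sum + "  sum/sqrt(n)=" + sum / Math.sqrt(n));
        }
    }
}

The ratio sum/sqrt(n) bounces around between roughly 1 and 2 instead of growing with n, which is the Θ(sqrt(n)) behaviour derived above.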
for (i = 0; i < n; i++)          // time complexity n+1
{
    k = 1;                       // time complexity n
    while (k <= n)               // time complexity n*(n+1)
    {
        for (j = 0; j < k; j++)  // time complexity ??
            printf("the sum of %d and %d is: %d\n", j, k, j + k);  // time complexity ??
        k++;
    }
}
What is the time complexity of the above code? I'm stuck on the second for loop and I don't know how to find its time complexity, because j is bounded by k and not by n.
I always have problems with time complexity. Do you have any good articles on it,
especially about step counting and loops?
From the question:
because j is less than k and not less than n.
This is just plain wrong, and I guess that's the assumption that got you stuck. We know what values k can take: in your code, it ranges from 1 to n (inclusive). Thus, if j is less than k, it is also less than n.
From the comments:
I know the only input is n, but the second for loop depends on k and not on n.
If a variable depends on anything, it's ultimately on the input: j depends on k, which itself depends on n, which means j depends on n.
However, this is not enough to deduce the complexity. In the end, what you need to know is how many times printf is called.
The outer for loop is executed n times no matter what. We can factor this out.
The number of executions of the inner for loop depends on k, which is modified within the while loop. We know k takes every value from 1 to n exactly once. That means the inner for loop will first be executed once, then twice, then three times and so on, up until n times.
Thus, discarding the outer for loop, printf is called 1 + 2 + 3 + ... + n times. That sum is very well known and easy to calculate: 1 + 2 + 3 + ... + n = n*(n+1)/2 = (n^2 + n)/2.
Finally, the total number of calls to printf is n * (n^2 + n)/2 = n^3/2 + n^2/2 = O(n^3). That's your time complexity.
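If you want to double-check that count, here is a quick Java sketch of my own (it counts calls instead of actually printing) that compares the number of would-be printf calls with n * (n^2 + n) / 2:

public class PrintfCallCount {
    public static void main(String[] args) {
        for (int n : new int[] {5, 10, 50, 100}) {
            long calls = 0;
            for (int i = 0; i < n; i++)
                for (int k = 1; k <= n; k++)
                    for (int j = 0; j < k; j++)
                        calls++;                                   // stands in for the printf
            long formula = (long) n * ((long) n * n + n) / 2;      // n * (n^2 + n) / 2
            System.out.println("n=" + n + "  calls=" + calls + "  formula=" + formula);
        }
    }
}

The two columns agree for every n, confirming the n^3/2 + n^2/2 = O(n^3) count.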
A final note about this kind of code. Once you have seen the same patterns a few times, you quickly start to recognize the kind of complexity involved. Then, when you see nested loops with dependent bounds like these, you immediately know that each loop contributes a linear factor.
For instance, in the following, f is called n*(n+1)*(n+2)/6 = O(n^3) times.
for (i = 1; i <= n; ++i) {
for (j = 1; j <= i; ++j) {
for (k = 1; k <= j; ++k) {
f();
}
}
}
First, simplify the code to show the main loops. So, we have a structure of:
for(int i = 0; i < n; i++) {
for(int k = 1; k <= n; k++) {
for(int j = 0; j < k; j++) {
}
}
}
Together, the two outer loops run n * n times, but that alone doesn't tell you much, because the cost of the inner loop changes depending on which iteration of the middle loop you are on; it's not as simple as counting the outer iterations and multiplying by some fixed value.
Instead, I would find it easier to start with the inner-loop, and then add the outer-loops from the inner-most to outer-most.
The complexity of the inner-most loop is k.
With the middle loop, it's the sum of k (the complexity above) where k = 1 to n. So 1 + 2 + ... + n = (n^2 + n) / 2.
With the outer loop, it's done n times so another multiplication by n. So n * (n^2 + n) / 2.
After simplifying, we get a total of O(n^3).
The time complexity of the above code is roughly n x n x n = n^3 steps for the three loops, plus a couple of constant-time statements. Since n^3 has the fastest growth rate, the constants can be ignored, so the time complexity is O(n^3).
Note: treating each loop as (n) and multiplying the (n) values together works here because each of the three loops runs O(n) times; as the other answers show, that shortcut is not valid for every nest of loops.
Hope this helps!
Given the following code:
for (int i = 0; i < n-1; ++i)
{
for (int j = i+1; j < n; ++j)
{
// Do work.
}
}
What is the Big-Oh value for it (over n)? I'm thinking it's O(N^2) but I'm not sure.
I did find a similar question here: complexity for nested loops
but it's not quite the same I think.
Yes, that's O(N^2). Pair up the iterations of the inner loop in the beginning and at the end of the outer loop, like this:
The inner loop will execute...
N-1 times on the first iteration of the outer loop, and 1 time on the last iteration
N-2 times on the second iteration of the outer loop, and 2 times on the second to last iteration
N-3 times on the third iteration of the outer loop, and 3 times on the third to last iteration
... and so on; you will have roughly N/2 such pairs (when N is even, the middle iteration has no partner and contributes N/2 on its own).
Each complete pair accounts for exactly N iterations of the inner loop, so in total the inner loop body executes (N-1) + (N-2) + ... + 1 = N*(N-1)/2 times.
This is the same pairing argument used to derive the formula for the sum of an arithmetic progression with common difference 1.
It should be possible to check it this way.
int z = 0, n = 10; // try 20 etc
for (int i = 0; i < n-1; ++i)
{
for (int j = i+1; j < n; ++j)
{
z++;
}
}
Now, check the value of z.
With n = 10; z becomes 45
With n = 20; z becomes 190
With n = 40; z becomes 780
Doubling n makes z roughly 4 times larger. Hence, the growth is approximately O(n^2).
Methodically, using Sigma notation, you can obtain the exact number of iterations as well as the order of growth: the inner body runs Sum(i=0, n-2, Sum(j=i+1, n-1, 1)) = Sum(i=0, n-2, n-1-i) = n*(n-1)/2 times, which is Θ(n^2) and matches the empirical values above (10*9/2 = 45, 20*19/2 = 190, 40*39/2 = 780).
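Going one step further than the three spot-checks above, this small sketch (my own, not from the original question) confirms the closed form n*(n-1)/2 for every n up to 1000:

public class PairCountCheck {
    public static void main(String[] args) {
        for (int n = 1; n <= 1000; n++) {
            int z = 0;
            for (int i = 0; i < n - 1; ++i)
                for (int j = i + 1; j < n; ++j)
                    z++;
            if (z != n * (n - 1) / 2)
                throw new AssertionError("mismatch at n=" + n);
        }
        System.out.println("z == n*(n-1)/2 for every n from 1 to 1000");
    }
}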
Stuck on my homework - I need to find the complexity of the following code:
time=0;
for (i=n; i>=1; i = sqrt(i))
for (j=1; j<=i; j++)
time++;
What I did - the values of i in the outer loop go like this:
i = n, n^(1/2), n^(1/4), ..., 1
so we get:
n^(1/2^k) = 1
but if I take the log of both sides, one side becomes 0... what should I do?
I suppose there is a typo somewhere because otherwise it's Θ(∞) if the input n is not smaller than 1. (For i == 1, the update i = sqrt(i) doesn't change i, so that's an infinite loop.)
So let us suppose it's actually
time = 0;
for (i = n; i > 1; i = sqrt(i))
for (j = 1; j <= i; j++)
time++;
Then, to get the complexity of nested loops, you need to sum the complexity of the inner loop for each iteration of the outer loop. Here, the inner loop runs i times, obviously, so we need to sum the values i runs through in the outer loop. These values are n, n^0.5, n^0.25, ..., n^(1/2^k), where k is characterised by
n^(1/2^(k+1)) < 2 <= n^(1/2^k)
or, equivalently,
2^(2^k) <= n < 2^(2^(k+1))
2^k <= lg n < 2^(k+1)
k <= lg (lg n) < k+1
k = floor(lg(lg n))
Now it remains to estimate the sum from above and below to get the Θ of the algorithm. This estimate is very easy if you start writing down the sums for a few (large) values of n. (Hint: the first term, n, dominates; every later term is at most sqrt(n), and there are only about lg lg n of them, so the whole sum is Θ(n).)
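To see that bound emerge empirically, here is a hypothetical Java version of the corrected loop (I'm assuming an integer square root via (long) Math.sqrt for the update); the ratio time/n settles just above 1, because the very first pass with i = n already does almost all of the work:

public class SqrtOuterLoop {
    public static void main(String[] args) {
        for (long n : new long[] {1_000, 1_000_000, 100_000_000}) {
            long time = 0;
            for (long i = n; i > 1; i = (long) Math.sqrt(i))   // i = n, sqrt(n), n^(1/4), ...
                for (long j = 1; j <= i; j++)
                    time++;
            System.out.println("n=" + n + "  time=" + time + "  time/n=" + (double) time / n);
        }
    }
}

Under that assumption the whole thing is Θ(n), matching the hint above.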
The code fragment I am to analyse is below:
int sum = 0;
for (int i = 0; i < n; i++)
for (int j = 0; j < i * i; j++)
for (int k = 0; k < j; k++)
sum++;
I know that the first loop is O(n) but that's about as far as I've gotten. I think that the second loop may be O(n^2) but the more I think about it the less sense it makes. Any guidance would be much appreciated.
The first loop executes n times, with i taking the values 0, 1, ..., n-1. For each such i, the second loop executes i*i times. That means the second loop executes 0 + 1*1 + 2*2 + ... + (n-1)*(n-1) times in total.
This is a summation of squares, and the formula for it is well known. Hence the second loop executes at most (n(1 + n)(1 + 2n))/6 times in total.
Thus, we know that in total there will be at most (n(1 + n)(1 + 2n))/6 values of j, and that for each value of j the third loop executes exactly j times. Calculating exactly how many times the third loop executes in total would be tedious. Luckily, all you really need is an upper bound on the order of the function.
You know that each of those at most (n(1 + n)(1 + 2n))/6 values of j is smaller than n^2, so for each of them the third loop executes fewer than n^2 times. Therefore the operation sum++ executes fewer than [(n(1 + n)(1 + 2n))/6] * n^2 times. After some quick mental math, that amounts to a polynomial of maximal degree 5, therefore your program is O(n^5).
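Empirically (this harness is my own, not part of the assignment), doubling n should multiply the count by roughly 2^5 = 32, and that is what happens:

public class FifthPowerCheck {
    public static void main(String[] args) {
        long previous = -1;
        for (int n : new int[] {20, 40, 80}) {
            long sum = 0;
            for (int i = 0; i < n; i++)
                for (long j = 0; j < (long) i * i; j++)
                    for (long k = 0; k < j; k++)
                        sum++;
            if (previous > 0)
                System.out.println("n=" + n + "  sum=" + sum + "  ratio=" + (double) sum / previous);
            else
                System.out.println("n=" + n + "  sum=" + sum);
            previous = sum;
        }
    }
}

The ratios come out in the low thirties and drift toward 32 as n grows, consistent with Θ(n^5).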
int sum = 0;
for (int i = 0; i < n; i++) // Let's call this N
for (int j = 0; j < i * i; j++) // Let's call this M
for (int k = 0; k < j; k++) // Let's call this K
sum++;
N is the number of steps of the entire program, M is the number of steps the two inner loops do and lastly K is the number of steps the last loop does.
It is easy to see that K = j: the innermost loop takes j steps.
Then M = Sum(j=0, i^2-1, K) = Sum(j=0, i^2-1, j)
(The first parameter is the iterator, the second is the upper bound, and the last is what we are adding. The upper bound is i^2 - 1 because the loop condition is j < i*i.)
This is a sum of the integers from 0 up to i^2 - 1, so we can apply the formula m*(m+1)/2 with m = i^2 - 1:
M = Sum(j=0, i^2-1, j) = ((i^2-1)*(i^2))/2 = (i^4 - i^2)/2
N = Sum(i=0, n-1, M) = 1/2 * ( Sum(i=0, n-1, (i^4)) - Sum(i=0, n-1, (i^2)) )
These are both well-known formulas, and after a little algebra you get:
N = (n^5)/10 - (n^4)/4 + (n^2)/4 - n/10   (equivalently, n(n-1)(n-2)(n+1)(2n-1)/20)
This is the exact number of times sum++ executes, but if you only care about the O notation, note that n^5 is the fastest-growing term, so the answer is O(n^5).
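As a quick check on that closed form (a brute-force harness of my own), the formula can be compared against an actual count of sum++ for small n; the factored form n(n-1)(n-2)(n+1)(2n-1)/20 is used to stay in integer arithmetic:

public class ExactCountCheck {
    public static void main(String[] args) {
        for (int n = 0; n <= 40; n++) {
            long sum = 0;
            for (int i = 0; i < n; i++)
                for (long j = 0; j < (long) i * i; j++)
                    for (long k = 0; k < j; k++)
                        sum++;
            // factored form of n^5/10 - n^4/4 + n^2/4 - n/10 (always an integer)
            long exact = (long) n * (n - 1) * (n - 2) * (n + 1) * (2L * n - 1) / 20;
            if (sum != exact)
                throw new AssertionError("mismatch at n=" + n);
        }
        System.out.println("closed form matches brute force for n = 0..40");
    }
}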
If you proceed methodically using Sigma notation, you'll end up with the same result: sum++ runs Sum(i=0, n-1, Sum(j=0, i^2-1, j)) = Sum(i=0, n-1, i^2*(i^2-1)/2) times, which is Θ(n^5).
Try to count how many times the inner loops are executed:
The middle loop runs
0*0 = 0 times when i == 0
1*1 = 1 time when i == 1
2*2 = 4 times when i == 2
...
(n-1)*(n-1), i.e. about n^2 times, when i == n-1.
So the middle loop is O(n^2) per iteration of the outer loop, just as you suspected. The outer loop runs n times, and the innermost loop runs up to about n^2 times for each j, which multiplies out to the O(n^5) total derived above.