I am trying to determine the Big-O runtimes for these loops. I believe the answers I have are correct, but I would like to check with the community.
int sum = 0;
for (int i = 1; i <= n*2; i++ )
sum++;
My answer is O(n)
This is because the loop iterates n x 2 times. We drop the 2 and are left with n. Therefore O(n).
int sum = 0;
for ( int i = 1; i <= n; i++)
for ( int j = n; j > 0; j /= 2)
sum++;
My answer is O(n lgn)
The outer loop iterates n times. The inner loop starts j at n and halves it each time (for n = 16 it would visit 16, 8, 4, 2, 1), so it runs about log base 2 of n times. The base is just a constant factor, so we keep log n. The inner loop (log n) times the outer loop (n) gives us O(n lg n).
int sum = 0;
for ( int i = 1; i <= n; i++)
for ( int j = i; j <= n; j += 2)
sum++;
My answer is O(n^2)
This one is easy. The outer loop iterates n times, and for each i the inner loop runs at most about n/2 times (it starts at i and steps by 2); n x n/2 is still proportional to n^2. Therefore O(n^2).
int sum = 0;
for ( int i = 1; i <= n * n; i++)
for ( int j = 1; j < i; j++ )
sum++;
My answer is O(n^3)
Are my answers correct? If not, how can I correct them?
Only the last one is wrong. It should be O(n⁴).
You can see it this way: substitute x for n * n. The number of operations is the usual x*(x+1)/2 = O(x²). Now substitute n * n back in for x, giving O((n*n)²) = O(n⁴).
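If you want a quick empirical sanity check, a sketch along these lines (illustrative only; assume it sits inside a main method) counts the sum++ executions and divides by n^4 — the ratio should settle near 1/2, which is exactly what O(n⁴) growth looks like:
// Count how many times sum++ runs in the fourth fragment and compare to n^4.
for (int n = 10; n <= 80; n *= 2) {
    long sum = 0;
    for (long i = 1; i <= (long) n * n; i++)
        for (long j = 1; j < i; j++)
            sum++;
    System.out.println("n=" + n + "  sum=" + sum + "  sum/n^4=" + sum / Math.pow(n, 4));
}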
Extra challenge for you.
You correctly said that:
int sum = 0;
for ( int i = 1; i <= n; i++)
for ( int j = n; j > 0; j /= 2)
sum++;
Is O(n log n). But what about this one:
int sum = 0;
for ( int i = 1; i <= n; i++)
for ( int j = i; j > 0; j /= 2)
sum++;
I only changed j = n to j = i.
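One way to check your answer before committing to it: a rough counting sketch like the one below (illustrative only; assume it sits inside a main method) prints how many times sum++ runs for the original j = n version and for the j = i version, so you can compare their growth yourself:
// Compare iteration counts of the j = n and j = i variants for a few values of n.
for (int n = 1000; n <= 1000000; n *= 10) {
    long countN = 0, countI = 0;
    for (int i = 1; i <= n; i++) {
        for (int j = n; j > 0; j /= 2) countN++;  // inner loop starts at n
        for (int j = i; j > 0; j /= 2) countI++;  // inner loop starts at i
    }
    System.out.println("n=" + n + "  j=n variant: " + countN + "  j=i variant: " + countI);
}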
Please refer to the code fragment below:
sum = 0;
for( i = 0; i < n; i++ )
for( j = 0; j < i * i; j++ )
for( k = 0; k < j; k++ )
sum++;
If one was to analyze the run-time of this fragment in Big-Oh Notation, what would be the most appropriate value? Would it be O(N^4)?
I have to say O(n^5). I may be wrong. I am assuming n is the length of the array in this question. If you substitute, i stops when it reaches n, j stops when it reaches n*n, and k stops when it reaches n*n. Since these are nested for-loops, the running time is O(n * n^2 * n^2) = O(n^5).
I think it's O(n^5)
We know that
1 + 2 + 3 + ... + n = n(n+1) / 2 = O(n^2)
1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1) / 6 = O(n^3)
I think a similar statement is true for sums of higher powers.
Let's think about these two lines
for( j = 0; j < i * i; j++ )
for( k = 0; k < j; k++ )
This loop's time complexity for fixed i is
1 + 2 + ... + (i^2 - 1) = (i^2 - 1) * i^2 / 2 = O(i^4)
But in our code i runs from 0 to n-1, so we are effectively adding fourth powers, and as I said before
1^k + 2^k + ... + n^k = O(n^(k+1))
That's why time complexity is O(n^5)
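If it helps, here is a small empirical check (a sketch only, meant to live inside a main method): it counts the sum++ executions and divides by n^5, and the ratio should settle toward a constant (around 1/10), which is what O(n^5) predicts:
// Count the innermost executions and compare them to n^5.
for (int n = 10; n <= 80; n *= 2) {
    long sum = 0;
    for (long i = 0; i < n; i++)
        for (long j = 0; j < i * i; j++)
            for (long k = 0; k < j; k++)
                sum++;
    System.out.println("n=" + n + "  sum=" + sum + "  sum/n^5=" + sum / Math.pow(n, 5));
}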
Given this code, consider each loop separately.
sum = 0;
for( i = 0; i < n; i++ ) //L1
for( j = 0; j < i * i; j++ ) //L2
for( k = 0; k < j; k++ ) //L3
sum++; //
Consider the outermost loop (L1) alone. Clearly this loop is O(n).
for( i = 0; i < n; i++ ) //L1
L2(i);
Consider the next loop (L2). This one is harder, but the loop is O(n^2).
See this SO answer for why the loop is O(n^2): Big O, how do you calculate/approximate it?
for( j = 0; j < i * i; j++ ) //L2
L3(j);
Consider the next loop (L3). Since you know that the L2 loop is O(N^2), what is this loop? (leaving reader to complete their homework).
for( k = 0; k < j; k++ ) //L3
sum++; //
I need to analyze the O-notation complexity of the following code. I think I have some idea, but need some help. Are my O-notation estimates for each loop correct?
int sum = 0;
for (int i = 1; i <= n*2; i++ ) //N operations, ignore the multiplication by 2
sum++;
O(N)
int sum = 0;
for ( int i = 1; i <= n; i++) //N operations
for ( int j = n; j > 0; j /= 2) //N operations, ignore division by constant?
sum++;
O(N^2)
int sum = 0;
for ( int i = 1; i <= n; i++) //N ops
for ( int j = i; j <= n; j += 2) // N ops
sum++;
O(N^2)
int sum = 0;
for ( int i = 1; i <= n * n; i++) // N * N ops
for ( int j = 1; j < i; j++ ) // N opts
sum++;
O(N^3)
You iterate through 2*N elements, so it's O(2N) = O(N)
You iterate through N elements, and in each iteration you go through about log N elements, so it's O(N*logN). This time it is different from 1., because you divide the counter j, not the limiting variable N. For example, if N is 20, you get this:
first iteration: j = 20
second iteration: j = 10
third iteration: j = 5
fourth iteration j = 2
fifth iteration j = 1
so it's log2(20) = 4.something, which means 5 iterations...
So it matters where the constant shows up: dividing the loop counter each step gives a logarithmic factor, while multiplying the loop limit by a constant (as in 1.) only changes a constant factor.
This is O(N^2), because you have a for loop with n iterations, and in each iteration you have up to about n/2 inner iterations, so it's O(N * N / 2) = O(N^2)
In the outer loop you have N*N iterations. In each iteration the inner loop goes up to i, so you have:
i=1 => no iteration in inner for loop
i=2 => 1 iteration
i=3 => 2 iterations
i=4 => 3 iterations
...
i=n*n => n*n - 1 iterations
The sum of these is:
0 + 1 + 2 + 3 + ... + (n*n - 1) = 1 + 2 + 3 + ... + (m - 1)   (where m is n*n) = m*(m+1)/2 - m
So it's O(m^2) = O(N^4)
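As a concrete check of this formula: for n = 10 we have m = n*n = 100, so the count is m*(m+1)/2 - m = 100*101/2 - 100 = 4950, which is close to n^4 / 2 = 5000 — consistent with O(N^4).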
I have an assignment I am not sure about; I have to calculate the time complexity of the following code:
int a[][] = new int[m][n]; //O(1)
int w = 0; //O(1)
for (int i = 0; i < m; i++) //O(n)
for (int j = 0; j <n; j++) //O(n)
if (a[i] [j] % 2 == 0) //O(logn)
w++; //O(1)
So from my O estimations I add them up:
O(1) + O(1) + O(n) * ( O(n) * ( O(logn) + O(1) / 2 ) )
O(1) + O(1) + O(n) * ( O(nlogn) + O(n) / 2 )
O(1) + O(1) + (O(n^2 logn) + O(n^2) / 2)
= O(n^2 logn)
I'm not sure if my train of thought is correct, could somebody help?
for (int i = 0; i < m; i++) //O(m)
for (int j = 0; j <n; j++) //O(n)
if (a[i] [j] % 2 == 0) //O(1)
w++; //O(1)
So the total complexity in terms of big-o is:
O(m)*(O(n) + O(1) + O(1)) = O(m)*O(n) = O(m*n).
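If you want to convince yourself, here is a minimal counting sketch (arbitrary m and n, meant to sit inside a main method); it shows the O(1) body executing exactly m*n times:
// Count how many times the constant-time body executes; it should equal m*n.
int m = 300, n = 500;            // arbitrary sizes for the check
int[][] a = new int[m][n];
int w = 0;
long steps = 0;
for (int i = 0; i < m; i++)
    for (int j = 0; j < n; j++) {
        steps++;                 // one constant-time check per (i, j) pair
        if (a[i][j] % 2 == 0)
            w++;
    }
System.out.println("steps=" + steps + "  m*n=" + ((long) m * n));  // both print 150000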
for (int i = 0; i < m; i++) //O(m)
{
for (int j = 0; j <n; j++) //O(n)
{
// your code
}
}
So the i loop will run m times, and the j loop will run n times for each of them.
So in total the innermost code runs m*n times, which gives the time complexity O(m*n).
The final complexity is O(m*n), which is O(n^2) when m and n are of the same order.
Your logic is close except...
int a[][] = new int[m][n]; //O(1)
int w = 0; //O(1)
for (int i = 0; i < m; i++) //O(n)
for (int j = 0; j <n; j++) //O(n)
if (a[i] [j] % 2 == 0) //O(1)
w++; //O(1)
Your if statement embedded in your second for loop is simply referencing an element in an array and doing a basic comparison. This is of time complexity O(1). Also, typically you would not consider initializing variables in a time complexity problem.
I've got to analyze this loop, among others, and determine its running time using Big-O notation.
for ( int i = 0; i < n; i += 4 )
for ( int j = 0; j < n; j++ )
for ( int k = 1; k < j*j; k *= 2 )
Here's what I have so far:
for ( int i = 0; i < n; i += 4 ) = n
for ( int j = 0; j < n; j++ ) = n
for ( int k = 1; k < j*j; k *= 2 ) = log^2 n
Now the problem I'm coming to is the final running time of the loop. My best guess is O(n^2), however I am uncertain if this correct. Can anyone help?
Edit: sorry about the Oh -> O thing. My textbook uses "Big-Oh"
First note that the outer loop is independent of the remaining two - it simply contributes a factor of n/4. We will account for that later.
Now let's consider the complexity of
for ( int j = 0; j < n; j++ )
for ( int k = 1; k < j*j; k *= 2 )
We have the following sum:
0 + log2(1) + log2(2 * 2) + ... + log2(n*n)
It is good to note that log2(n^2) = 2 * log2(n). Thus we re-factor the sum to:
2 * (0 + log2(1) + log2(2) + ... + log2(n))
It is not very easy to analyze this sum, but take a look at this post. Using Stirling's approximation one can show that it belongs to O(n*log(n)). Thus the overall complexity is O((n/4) * 2 * n*log(n)) = O(n^2*log(n))
In terms of j, the inner loop is O(log_2(j^2)) time, but since
log_2(j^2)=2log(j), it is actually O(log(j)).
For each iteration of the middle loop, it takes O(log(j)) time (to do the inner loop), so we need to sum:
sum { log(j) | j=1,..., n-1 } = log(1) + log(2) + ... + log(n-1) = log((n-1)!)
And since log((n-1)!) is in O((n-1)log(n-1)) = O(nlogn), we can conclude the middle loop takes O(nlogn) operations.
Note that both middle and inner loop are independent of i, so to
get the total complexity, we can just multiply n/4 (number of
repeats of outer loop) with complexity of middle loop, and get:
O(n/4 * nlogn) = O(n^2logn)
So, total complexity of this code is O(n^2 * log(n))
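If you would rather see this growth than derive it, a small counting sketch along these lines (illustrative only; assume a surrounding main method) divides the iteration count by n^2 * log2(n) — the ratio stays essentially flat (creeping toward about 1/2), consistent with the O(n^2 * log(n)) bound:
// Count the innermost iterations and compare them to n^2 * log2(n).
for (int n = 100; n <= 1600; n *= 2) {
    long count = 0;
    for (int i = 0; i < n; i += 4)
        for (int j = 0; j < n; j++)
            for (int k = 1; k < j * j; k *= 2)
                count++;
    double ratio = count / (n * (double) n * (Math.log(n) / Math.log(2)));
    System.out.println("n=" + n + "  count=" + count + "  ratio=" + ratio);
}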
The time complexity of a loop is considered O(n) if the loop variable is incremented / decremented by a constant amount (which is c in the examples below):
for (int i = 1; i <= n; i += c) {
// some O(1) expressions
}
for (int i = n; i > 0; i -= c) {
// some O(1) expressions
}
Time complexity of nested loops is equal to the number of times the innermost statement is executed. For example the following sample loops have O(n²) time complexity:
for (int i = 1; i <=n; i += c) {
for (int j = 1; j <=n; j += c) {
// some O(1) expressions
}
}
for (int i = n; i > 0; i -= c) {
for (int j = i+1; j <=n; j += c) {
// some O(1) expressions
}
}
The time complexity of a loop is considered O(logn) if the loop variable is divided / multiplied by a constant amount:
for (int i = 1; i <=n; i *= c) {
// some O(1) expressions
}
for (int i = n; i > 0; i /= c) {
// some O(1) expressions
}
Now we have:
for ( int i = 0; i < n; i += 4 ) <----- runs n/4 times, which is still O(n)
for ( int j = 0; j < n; j++ ) <----- for every i, runs n times again
for ( int k = 1; k < j*j; k *= 2 ) <--- now for every j, runs a logarithmic number of times.
So the complexity is O(n²logm) where m is n², which can be simplified to O(n²logn) because n²logm = n²log(n²) = n² * 2logn ~ n²logn.
I'm doing an online course and I'm stuck on this question. I know there are similar questions but they don't help me.
What is the order of growth of the worst case running time of the
following code fragment as a function of N?
int sum = 0;
for (int i = 0; i*i*i < N; i++)
for (int j = 0; j*j*j < N; j++)
for (int k = 0; k*k*k < N; k++)
sum++;
I thought that the order would be n^3 but I don't think this is correct because the loops only go through a third of n each time. So would that make it nlogn?
Also
int sum = 0;
for (int i = 1; i <= N; i++)
for (int j = 1; j <= N; j++)
for (int k = 1; k <= N; k = k*2)
for (int h = 1; h <= k; h++)
sum++;
I think this one would be n^4 because you have n * n * 0.5n * 0.5n
The loops in fact only go up to the cube root of N. (i^3 < n, etc.)
The 3 nested loops of this length give O((cube root of N)^3), which is O(N).
Of note, if you were correct and they each went to one third of N, then multiplying them would give O(N^3 / 27); 1/27 is a constant, so that would still be O(N^3), not O(N log N).
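Spelled out: each of the three loops runs for roughly N^(1/3) values of its counter (i*i*i < N means i < N^(1/3)), so the innermost statement executes about N^(1/3) * N^(1/3) * N^(1/3) = N times — hence O(N).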
If you examine the value of sum for various values of N, then it becomes pretty clear what the time complexity of the algorithm is:
#include <iostream>
int main()
{
for( int N=1 ; N<=100 ; ++N ) {
int sum = 0;
for (int i = 0; i*i*i < N; i++)
for (int j = 0; j*j*j < N; j++)
for (int k = 0; k*k*k < N; k++)
sum++;
std::cout << "For N=" << N << ", sum=" << sum << '\n';
}
return 0;
}
You can then draw your own conclusions with greater insight.
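For what it is worth, the pattern to look for: sum is always a perfect cube, c^3, where c is the number of indices with i*i*i < N, so it jumps just past each perfect cube (N = 2, 9, 28, 65, ...) and stays flat in between — always within a small constant factor of N, matching the O(N) answer above.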