Running time of for loop - performance

I seem to understand the basics of simpler loops like the one below: the outer loop runs in O(n), as does the inner loop, and because they're nested, you multiply to get a total running time of O(n^2).
sum = 0;
for ( i = 0; i < n; i++ )
for ( j = 0; j < n; j++ )
++sum;
Though when things start getting switched around, I get completely lost as to how to figure it out. Could someone explain how to figure out the running time for both of the following? Also, any links to easy-to-understand references that could help me improve would be appreciated. Thanks!
sum = 0;
for( i = 0; i < n; i += 2 )
for( j = 0; j < n; j++ )
++sum;
The only thing I can gather from this is that the inner loop runs in O(n). The i+=2 really throws me off in the outer loop.
sum = 0;
for( i = 1; i < n; i *= 2 )
for( j = 0; j < n; j++ )
++sum;
From my attempt...outer loop is O(log(n)), inner is O(n), so total is O(n log(n))?
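One way to build confidence in guesses like these is to count the ++sum executions directly. Here is a small sketch (Python used just for brevity; the function names are illustrative) that counts iterations for both snippets above:

```python
def count_step2(n):
    # for( i = 0; i < n; i += 2 ) for( j = 0; j < n; j++ ) ++sum;
    total = 0
    i = 0
    while i < n:
        total += n  # the inner loop contributes n iterations
        i += 2
    return total

def count_doubling(n):
    # for( i = 1; i < n; i *= 2 ) for( j = 0; j < n; j++ ) ++sum;
    total = 0
    i = 1
    while i < n:
        total += n  # the inner loop contributes n iterations
        i *= 2
    return total

# count_step2(n) is n * ceil(n/2): quadratic, so O(n^2) once the 1/2 is dropped.
# count_doubling(n) is n * ceil(log2(n)): the O(n log n) guess is correct.
```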

A good way of thinking about Big-O performance is to pretend each element of the code is a mathematical function that takes in n items and returns the number of computations performed on those items.
For example, a single for loop like for ( i = 0; i < n; i++ ) would be equivalent to a function i(), where i(n) = n, indicating that one computation is performed for each input n.
If you have two nested loops, then the functional equivalent for
for ( i = 0; i < n; i++ )
for ( j = 0; j < n; j++ )
would look like these two functions:
i(n) = n * j(n)
j(n) = n
Working these two functions out produces an end result of n*n = n^2, since j(n) can be substituted for n.
What this means is that as long as you can solve for the Big-O of any single loop, you can then apply those solutions to a group of nested loops.
For example, let's look at your second problem:
for( i = 0; i < n; i += 2 )
for( j = 0; j < n; j++ )
i += 2 means that for an input set of n items (n0, n1, n2, n3, n4, ...) you're only touching every other element of that set. Assuming you initialize with i = 0, you only touch (n0, n2, n4, ...). This halves the size of the data set you're processing, so the functional equivalents work out as:
i(n) = (n/2) * j(n)
j(n) = n
Solving these gets you (n/2) * n = (n^2)*(1/2). Since this is Big-O work, we remove the constants to produce a Big-O value of (n^2).
The two key points to remember here:
Big-O math starts with a set of n data elements. If you're trying to determine the Big-O of a for loop that iterates through that set of n elements, your first step is to look at how the incrementor changes the number of data elements that the for routine actually touches.
Big-O math is math. If you can solve for each for expression individually, you can use those solutions to build up into your final answer, just like you can solve for a set of equations with common definitions.
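The "loops as functions" model above can be written out literally. A minimal sketch (the function names i_func and j_func are just illustrative stand-ins for i(n) and j(n)):

```python
def j_func(n):
    # inner loop: one computation per element
    return n

def i_func(n):
    # outer loop with i += 2: touches only half the elements,
    # running the inner loop once per element touched
    return (n // 2) * j_func(n)

# i_func(n) = (n/2) * n = n^2 / 2; dropping the constant gives O(n^2).
```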

Related

Finding Big O Notation of For loop which is inside an If condition

sum = 0;
for( i = 1; i < n; ++i )
for( j = 1; j < i * i; ++j )
if( j % i == 0 )
for( k = 0; k < j; ++k )
++sum;
How do I find the Big O notation for this code? I'm new to this Big O notation thing, so I'd appreciate it if someone could explain it to me simply, with details. Thank you!
Big O is an asymptotic upper bound of a function. In your case the for loops take the most time if the if condition always evaluates to true, so you can just assume that it does and get a correct upper bound, which may not be tight. But there are a lot of cases where you cannot do better than this.
In some cases you can try to remove the if while roughly preserving the number of operations. E.g. in your case you could replace j = 1 by j = i and ++j by j += i. This does not change the algorithm; it only changes the way you look at it for the complexity analysis. You still have to remember that the original middle for loop performs i*i comparison steps. Now you have this:
sum = 0;
for( i = 1; i < n; ++i )
// the original middle loop still costs O(i * i) comparison operations
for( j = i; j < i * i; j += i )
for( k = 0; k < j; ++k )
++sum;
You can also assume that the if condition is always false. This way you get a lower bound. In some cases the upper and lower bounds match, meaning that the part you had trouble analyzing is actually irrelevant to the overall complexity.
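To see the gap between the always-true upper bound and the true count, here is a Python sketch (for illustration) that counts the ++sum executions both ways:

```python
def actual_ops(n):
    # count ++sum executions in the original code, honoring j % i == 0
    ops = 0
    for i in range(1, n):
        for j in range(1, i * i):
            if j % i == 0:
                ops += j  # the innermost k loop runs j times
    return ops

def upper_bound_ops(n):
    # pretend the if is always true: every j triggers the k loop
    ops = 0
    for i in range(1, n):
        for j in range(1, i * i):
            ops += j
    return ops

# actual_ops grows like n^4 while upper_bound_ops grows like n^5,
# so here the always-true assumption gives a correct but not tight bound.
```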

On understanding how to compute the Big-O of code snippets

I understand that simple statements like:
int x = 5; // is 1 or O(1)
And a while loop such as:
while (i < n) i++; // is n+1 comparisons, or O(n)
And the same for a single for loop (depending on its bounds).
With nested while or for loop such as:
for(int i = 0; i<n; i++){ // this is n + 1
for(int j = 0; j<n; j++){ // this is (n+1)*n, total = O(n^2)
}
}
Also, anytime we have a doubling effect it's log_2(n), a tripling effect is log_3(n), and so on. And if the control variable is being halved or quartered, that's also either log_2(n) or log_4(n).
But I am dealing with much more complicated examples. How would one figure these out? I have the answers; I just don't know how to work them out on paper come examination time.
Example1:
for (i = 1; i < (n*n+3*n+17) / 4 ; i += 1)
System.out.println("Sunshine");
Example2:
for (i = 0; i < n; i++)
if ( i % 2 == 0) // very confused by what mod would do to runtime
for (j = 0; j < n; j++)
System.out.print("Bacon");
else
for (j = 0; j < n * n; j++)
System.out.println("Ocean");
Example3:
for (i = 1; i <= 10000 * n; i *= 2)
x += 1;
Thank you
Example 1's loop runs (n*n+3*n+17)/4 times. The largest, and therefore dominant, term in that expression is n^2, and the constant factor 1/4 is dropped, so it is O(n^2).
The second example is a bit trickier. The outer loop in i will iterate n times, but what executes on the inside depends on whether that value of i is even or odd. When even, a loop over n happens; when odd, a loop over n^2 happens. The roughly n/2 odd values of i each cost n^2 work, which dominates, so example 2 should be O(n^3).
The third example iterates until hitting 10000*n, but does so by doubling the loop counter i at each step. This gives O(lg n) performance, where lg means log base 2. To see why, imagine we wanted to reach 32, starting at i = 1 and doubling each time: i takes the values 1, 2, 4, 8, 16, 32, i.e. 6 steps, which is lg(32) + 1 and grows as lg(n). The 10000 factor only adds a constant number of extra doublings.
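Example 3's step count can be checked directly; a quick Python sketch (for illustration):

```python
def doubling_steps(n):
    # for (i = 1; i <= 10000 * n; i *= 2) x += 1;
    steps = 0
    i = 1
    while i <= 10000 * n:
        steps += 1
        i *= 2
    return steps

# steps = floor(log2(10000 * n)) + 1; the 10000 contributes only a
# constant number of extra doublings, so growth is O(log n).
```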

Please explain in simple terms why this code fragment has O(n^4) complexity?

Evaluate Big-Oh of the following code fragment:
sum = 0
for( i = 1; i < n; ++i )
for( j = 1; j < i * i; ++j )
if( j % i == 0 )
for( k = 0; k < j; ++k )
++sum
This is a homework problem in a textbook for my algorithms class. The answer as stated in the textbook is O(n^4). I've tried doing the problem many ways, but I am always getting O(n^5).
I'm using the summation method and mathematically evaluating from the innermost nested loop outward. The summations are not shown here because I don't know how to express my math in this space, but please follow my work below.
Here is my logic for the innermost loop:
for( k = 0; k < j; ++k )
My thinking is that the inner loop makes j iterations; j can be as big as i*i, and i itself can be as big as n, so this loop has an upper bound of O(n^2).
Here is my logic for the middle loop:
for( j = 1; j < i * i; ++j )
The loop iterates as many as i^2 times, and i itself can go as high as n, so this loop has an upper bound of O(n^2).
Here is my logic for the outer loop:
for( i = 1; i < n; ++i )
i iterates as high as n times, so the loop has an upper-bound of O(n).
O(n * n^2 * n^2) = O(n^5)
Again, the answer is O(n^4). Please help me, using mathematical loops to aid your answer. Please use simple language. I am still new to algorithm analysis.
The trick is in this line:
if( j % i == 0 )
What this does is ensure the innermost loop only executes when j is an exact multiple of i; otherwise no work is done.
So one shortcut is to note that the innermost loop runs for only about one in every i (roughly one in every n) values of j: O(n * n^2 * (1/n) * n^2) = O(n^4).
Another way you could think about it is that this is equivalent to writing:
sum = 0
for( i = 1; i < n; ++i )
for( j = i; j < i * i; j += i )
for( k = 0; k < j; ++k )
++sum
which is O(n^4) by inspection: the inner two loops do about i * (1 + 2 + ... + i) ~ i^3/2 work for each i, and summing i^3 terms up to n gives n^4.
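You can verify that the stride rewrite counts exactly the same ++sum executions as the mod version. A quick Python sketch (for illustration):

```python
def sum_with_mod(n):
    # original form: if (j % i == 0) guards the innermost loop
    s = 0
    for i in range(1, n):
        for j in range(1, i * i):
            if j % i == 0:
                for k in range(j):
                    s += 1
    return s

def sum_with_stride(n):
    # rewritten form: j walks the multiples of i directly
    s = 0
    for i in range(1, n):
        for j in range(i, i * i, i):
            for k in range(j):
                s += 1
    return s

# Both count i * (1 + 2 + ... + (i-1)) = i^2 * (i-1) / 2 per value of i;
# summing these i^3-like terms up to n gives the O(n^4) total.
```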

Troubles with Big O Estimate

I'm asked to give big-O estimates for some pieces of code, but I'm having a little bit of trouble.
int sum = 0;
for (int i = 0; i < n; i = i + 2) {
for (int j = 0; j < 10; j++)
sum = sum + i + j;
}
I'm thinking that the worst case would be O(n/2), because the outer for loop runs from 0 up to n in steps of 2. However, I'm not sure if the inner loop affects the Big O.
int sum = 0;
for (int i = n; i > n/2; i--) {
for (int j = 0; j < n; j++)
sum = sum + i + j;
}
For this one, I'm thinking it would be O(n^2/2), because the inner loop runs from 0 to n and the outer loop from n down to n/2, which gives me n*(n/2).
int sum = 0;
for (int i = n; i > n - 2; i--) {
for (int j = 0; j < n; j += 5)
sum = sum + i + j;
}
I'm pretty lost on this one. My guess is O(n^2-2/5)
Your running times for the first two examples are correct.
For the first example, the inner loop of course always executes 10 times. So we can say the total running time is O(10n/2).
For the last example, the outer loop only executes twice, and the inner loop n/5 times, so the total running time is O(2n/5).
Note that, because of the way big-O complexity is defined, constant factors and asymptotically smaller terms are negligible, so your complexities can / should be simplified to:
O(n)
O(n^2)
O(n)
If you were to take into account constant factors (using something other than big-O notation of course - perhaps ~-notation), you may have to be explicit about what constitutes a unit of work - perhaps sum = sum + i + j constitutes 2 units of work instead of just 1, since there are 2 addition operations.
Note that if the first loop header ends with a semicolon, as in:
for (int i = 0; i < n; i = i + 2);
^----
then you're NOT running nested loops: that semicolon TERMINATES the loop definition, so the i loop just counts from 0 to n in steps of 2 without doing anything. The j loop is then completely independent of the i loop; both simply depend on n for their execution time.
For the above algorithms worst case/best case are the same.
In case of Big O notation, lower order terms and coefficient of highest order term can be ignored as Big O is used for describing asymptotic upper bound.
int sum = 0;
for (int i = 0; i < n; i = i + 2) {
for (int j = 0; j < 10; j++)
sum = sum + i + j;
}
The total number of outer loop iterations is n/2. For each iteration of the outer loop, the number of inner loop iterations is 10, so the total number of inner loop iterations is 10 * (n/2) = 5n. So clearly it is O(n).
Now think about the remaining two programs and determine their time complexities on your own.
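The 5n count for the first program can be checked directly; a small Python sketch (for illustration):

```python
def count_first(n):
    # for (int i = 0; i < n; i = i + 2) for (int j = 0; j < 10; j++) sum++;
    count = 0
    i = 0
    while i < n:
        count += 10  # the inner loop always runs exactly 10 times
        i += 2
    return count

# count_first(n) = 10 * ceil(n/2), roughly 5n, which is O(n).
```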

Big-Oh analysis of the running time?

For each of the following program fragments, give a Big-Oh analysis of the running time. I have two problems that I am not 100% sure are right; can somebody help me?
Fragment 1:
for( int i = 0; i < n; i++ )
for( int j = 0; j < n * n; j++ )
for( int k = 0; k < j; k++ )
sum++;
Answer: O(n^5)? I'm not really sure how to handle the n * n bound.
Fragment 2:
for( int i = 1; i <= n; i++ )
for( int j = 1; j <= i * i; j++ )
if (j % i == 0)
for( int k = 0; k < j; k++)
sum++;
Answer:O(n^4)
Decompose the problem space per loop. Start from the outermost loop. What are the loops really going up to?
For the first problem, we have the following pattern.
The outer loop will run n times.
The middle loop will run n^2 times, and its bound does not depend on the outer loop's counter.
The innermost loop will run up to j times, so it is bound by the current value of the middle loop's counter.
All of your steps are in linear chunks, meaning you will go from 0 to your ending condition in a linear fashion.
Here's what the summation actually looks like: Sigma(i=0...n-1) Sigma(j=0...n^2-1) j.
So, what does that translate into? You have to unroll the sums: the inner sum is n^2(n^2-1)/2 ~ n^4/2, and multiplying by the n outer iterations gives O(n^5), so your answer for Fragment 1 is correct.
For the second problem, we have the following pattern.
The outer loop runs up to and including n times.
The middle loop runs up to and including i^2 times.
The innermost loop runs up to j times, on the condition that j % i == 0. That means that the innermost loop isn't executed every time.
I'll leave this problem for you to work out. You do have to take the approach of unrolling the sums and reducing them to their algebraic counterparts.
for Fragment 1:
let's say m = n^2
Sigma(i=0...n) Sigma(j=0...m) j
=> n * (m(m+1)/2)
=> n * (n^2(n^2+1)/2) ~ n^5
Answer: O(n^5)
for Fragment 2:
the if test means the last loop only runs when j is a multiple of i, i.e. for j = i, 2i, ..., i*i (about i values of j), and it runs j times for each:
Sigma(i=1...n) Sigma(m=1...i) m*i
= Sigma(i=1...n) i^2(i+1)/2
~ Sigma(i=1...n) i^3 ~ n^4
Answer: O(n^4)
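Both answers can be sanity-checked by counting the sum++ executions directly. In this Python sketch (for illustration), the closed forms in the comments match the analysis above:

```python
def frag1(n):
    # for (i = 0; i < n) for (j = 0; j < n*n) for (k = 0; k < j) sum++;
    s = 0
    for i in range(n):
        for j in range(n * n):
            for k in range(j):
                s += 1
    return s

def frag2(n):
    # for (i = 1; i <= n) for (j = 1; j <= i*i) if (j % i == 0)
    #     for (k = 0; k < j) sum++;
    s = 0
    for i in range(1, n + 1):
        for j in range(1, i * i + 1):
            if j % i == 0:
                for k in range(j):
                    s += 1
    return s

# frag1(n) = n * n^2(n^2 - 1)/2, which is O(n^5).
# frag2(n) = sum of i^2(i + 1)/2 for i = 1..n, which is O(n^4).
```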
