Big O notation for nested loops with an inner loop that depends on the outer loop

I'm just learning big O notation and I'm confused about nested loops:
for (int x = 0; x < n; x++)
    for (int y = 0; y < n; y++)
        for (int z = 0; z < y; z++)
            anything();
From my understanding the inner loop above executes n(n+1)/2 times, the second loop executes n times and the first loop executes n times. Shouldn't this mean the big O is n x n x n(n+1)/2 = O(n^4)? Why doesn't the second loop get included in the big O formulation?

The question is: how often does anything() get called, as a function of n?
the inner loop above executes n(n+1)/2 times
No, the inner loop executes y times (each time you enter it), and y averages about n/2.
So the equation is n * n * n/2 = O(n^3).
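If you want to convince yourself, you can count the calls directly and compare with the exact closed form n * n(n-1)/2 (a quick sketch; count_calls is a made-up helper that replaces anything() with a counter):

```cpp
#include <cassert>

// Count how many times anything() would be called for a given n.
// The loop structure mirrors the question exactly.
long long count_calls(int n) {
    long long calls = 0;
    for (int x = 0; x < n; x++)
        for (int y = 0; y < n; y++)
            for (int z = 0; z < y; z++)
                calls++;  // stands in for anything()
    return calls;
}
```

For n = 10 this gives 450 = 10 * (10*9/2), i.e. roughly n^3/2 calls, which is O(n^3).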

Big O notation expresses an upper bound on the algorithm's running time. So, if a loop-based algorithm runs n² + n times, we still write its big O notation as O(n²).
Refer to this article for a more detailed explanation.

Related

What is the Big O of such an algorithm?

If you had code that looks like this, what would the big O be? I'm uncertain as to how if statements affect big O.
n = some arbitrary number
for (i = 0; i < n; i++)
    for (j = 0; j < n; j++)
        if (i <= j)
            for (k = i; k <= j; k++)
                y = x + 1   // do some simple operation
        else
            y = y + 1       // do some simple operation
I'm not considering compiler optimizations. I know this is somewhere between O(n^2) and O(n^3) but am not sure as the if statement does not always execute the inner most loop.
O(N * N * N) which we can just say is O(N^3)
First Loop happens N times.
Second Loop happens N times.
Those multiply together to get O(N^2)
Out of all the possible N^2 (i, j) pairs, the third loop runs for about half of them, and when it does run it performs O(N) iterations on average, so it contributes another factor of O(N).
And that's how you get O(N * N * N) or O(N^3)
In fact you can count (almost exactly) how many operations you do:
i: 0 to n-1 = N iterations
x
j: 0 to n-1 = N iterations
x
only when i <= j, a loop from i to j; otherwise another O(1) task
The O(1) task runs for the remaining pairs, about half of NxN of them, which is O(NxN) operations.
For the loop, invert the order: for every j (0 to n-1), and for every i from 0 to j, you do j - i + 1 operations. Summed over i = 0..j that is 1 + 2 + ... + (j+1) = (j+1)x(j+2)/2 operations.
Then finally, you get Sum of (j+1)x(j+2)/2 for j = 0 to n-1, which is n(n+1)(n+2)/6 operations, so O(N^3).
Perhaps I forgot some +/-1.
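To sanity-check that count, a brute-force counter over the same loops reproduces n(n+1)(n+2)/6 for the innermost statement (a sketch; inner_ops is an invented name):

```cpp
#include <cassert>

// Count executions of the innermost statement (the k-loop body) only.
long long inner_ops(int n) {
    long long ops = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            if (i <= j)
                for (int k = i; k <= j; k++)
                    ops++;  // the O(n^3) contribution
    return ops;
}
```

For n = 10 this returns 220 = 10*11*12/6.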
You can also analyse the algorithm using Sigma notation: writing the nested loops as a double sum of (j - i + 1) over all i <= j makes it obvious the total is dominated by cubic terms in n, and hence the algorithm is in O(n^3).
This is O(N^3).
Proof: http://www.wolframalpha.com/input/?i=sum+sum+%28j+-+i+%2B+1%29%2C+j+%3D+i+to+n+-+1%2C+i+%3D+0+to+n+-+1
The last cycle runs (j - i + 1) times.
How do you find this sum manually? It's not rocket science; try reading about https://en.wikipedia.org/wiki/Telescoping_series
But just to save time, it's easier to use Wolfram Alpha for that purpose.

Big theta for quad nested loop with hash table lookup

for (int i = 0; i < 5; i++) {
    for (int j = 0; j < 5; j++) {
        for (int k = 0; k < 5; k++) {
            for (int l = 0; l < 5; l++) {
                // look up in a perfect constant-time hash table
            }
        }
    }
}
what would the running time of this be in big theta?
my best guess, a shot in the dark: I always see that nested for loops are O(n^k) where k is the number of loops, so these loops would be O(n^4); then would I multiply by O(1) for the constant-time lookup? And what would this all be in big theta?
If you consider that accessing the hash table is really Θ(1), then this algorithm runs in Θ(1) too, because it makes only a constant number (5^4 = 625) of lookups into the hash table.
However, if you change 5 to n, it will be Θ(n^4), because you'll do exactly n^4 constant-time operations.
The big-theta running time would be Θ(n^4).
Big-O is an upper bound, whereas big-theta is a tight bound. That means saying the code is O(n^5) is also correct (but Θ(n^5) is not); whatever is inside the big-O just has to be asymptotically greater than or equal to n^4.
I'm assuming 5 can be substituted for another value (i.e. is really n); if not, the loops run in constant time (O(1) and Θ(1)), since 5^4 is a constant.
Using Sigma notation, you can confirm that the instructions inside the innermost loop execute exactly 5^4 = 625 times.
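Substituting n for the constant 5, a counter confirms the n^4 figure (a sketch; lookups is a hypothetical name, and the increment stands in for the Θ(1) hash-table access):

```cpp
#include <cassert>

// Count the constant-time lookups when each loop bound is n instead of 5.
long long lookups(int n) {
    long long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            for (int k = 0; k < n; k++)
                for (int l = 0; l < n; l++)
                    count++;  // stands in for the Θ(1) hash lookup
    return count;
}
```

lookups(5) is exactly 625, matching the Sigma-notation count.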

Derivation of algorithm complexity

Refreshing up on algorithm complexity, I was looking at this example:
int x = 0;
for ( int j = 1; j <= n; j++ )
    for ( int k = 1; k < 3*j; k++ )
        x = x + j;
I know this loop ends up being O(n^2). I believe the inner loop is executed 3(1+2+...+n) times in total, and the outer loop executes n times. So, O(3n*n) = O(3n^2) = O(n^2).
However, the source I'm looking at expands the execution of the inner loop to: 3(1+2+3+...+n) = 3n^2/2 + 3n/2. Can anyone explain the 3n^2/2 + 3n/2 execution times?
For each j you execute about 3*j iterations of the internal loop, so the command x = x + j will be executed 3 * (1 + 2 + 3 + ... + n) times in total. The sum of that arithmetic progression is n*(n+1)/2, so the command will be executed:
3 * n * (n+1)/2 times, which equals (3*n^2)/2 + (3*n)/2.
But big O is not about the exact number of iterations; it is an asymptotic measure, so in the expression 3*n*(n+1)/2 you drop the constant factors and lower-order terms, leaving n^2.
A small update about the big O calculation for this case: to get big O from 3n(n+1)/2, expand it to (3/2)n^2 + (3/2)n. As n grows without bound, the n^2 term dominates the linear term, and constant factors never matter asymptotically, so you are left with N^2.
The sum of integers from 1 to m is m*(m+1)/2. In the given problem, j goes from 1 to n, and k goes from 1 up to (but not including) 3*j. So the inner loop on k is executed about 3*(1+2+3+4+5+...+n) times, with each term in that series representing one value of j (strictly it runs 3j-1 times per j, but the missing -1s only subtract n in total, which doesn't change the order). That gives 3n(n+1)/2; if you expand it, you get 3n^2/2 + 3n/2. The whole thing is still O(n^2), though: you don't care that the execution time grows both quadratically and linearly, since the linear part gets swamped by the quadratic.
Big O notation gives an upper bound on the asymptotic running time of an algorithm. It does not take into account the lower-order terms or the constant factors. Therefore O(10n^2) and O(1000n^2 + 4n + 56) are still O(n^2).
What you are doing is trying to count the exact number of operations in your algorithm. However, Big O does not say anything about the exact number of operations; it simply gives you an upper bound on the worst-case running time that may occur with an unfavorable input.
The exact count for your algorithm can be derived with Sigma notation, and the closed form agrees with an empirical count of the iterations.
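An empirical count also shows why the textbook figure is a slight over-count: since the condition is k < 3*j, each pass of the inner loop does 3j - 1 iterations, for 3n(n+1)/2 - n in total; the extra -n doesn't affect O(n^2). A sketch (updates is an invented helper name):

```cpp
#include <cassert>

// Count how many times x = x + j executes for a given n.
long long updates(int n) {
    long long count = 0;
    for (int j = 1; j <= n; j++)
        for (int k = 1; k < 3 * j; k++)
            count++;  // one execution of x = x + j
    return count;
}
```

For n = 10 this gives 155 = 3*10*11/2 - 10, against the textbook's approximate 165.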

What's the Big-Oh for nested loops i=0..n-2, j=i+1..n-1?

Given the following code:
for (int i = 0; i < n-1; ++i)
{
    for (int j = i+1; j < n; ++j)
    {
        // Do work.
    }
}
What is the Big-Oh value for it (over n)? I'm thinking it's O(N^2) but I'm not sure.
I did find a similar question here: complexity for nested loops
but it's not quite the same I think.
Yes, that's O(N^2). Pair up the iterations of the inner loop in the beginning and at the end of the outer loop, like this:
The inner loop will execute...
N-1 times on the first iteration of the outer loop, and 1 time on the last iteration
N-2 times on the second iteration of the outer loop, and 2 times on the second to last iteration
N-3 times on the third iteration of the outer loop, and 3 times on the third to last iteration
... and so on. The counts N-1, N-2, ..., 1 pair up like that, each pair summing to N; there are (N-1)/2 pairs in total (when N is even, the unpaired middle count N/2 acts as half a pair).
Since each pair accounts for N iterations, the total is (N-1)/2 * N = N*(N-1)/2 iterations.
This is the same formula you get from the sum of an arithmetic progression with a common difference of 1.
It should be possible to check it this way.
int z = 0, n = 10; // try 20 etc.
for (int i = 0; i < n-1; ++i)
{
    for (int j = i+1; j < n; ++j)
    {
        z++;
    }
}
Now, check the value of z.
With n = 10; z becomes 45
With n = 20; z becomes 190
With n = 40; z becomes 780
A doubling in n caused z to become ~4 times its value. Hence, it is approximately O(n^2).
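The same experiment can be wrapped in a helper and checked against the closed form n(n-1)/2 (count_work is an invented name for this sketch):

```cpp
#include <cassert>

// Count iterations of the inner loop body for a given n.
long long count_work(int n) {
    long long z = 0;
    for (int i = 0; i < n - 1; ++i)
        for (int j = i + 1; j < n; ++j)
            ++z;  // one unit of "Do work."
    return z;
}
```

The values 45, 190, and 780 are exactly 10*9/2, 20*19/2, and 40*39/2.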
Methodically, using Sigma notation, you can obtain the exact number of iterations as well as the order of growth: the double sum collapses to N(N-1)/2, which matches the empirical counts above and is O(N^2).

Detailed Big-Oh question

So, I'm slightly confused by this question on my homework.
for ( int j = 0; j < 2*n; j++ ) {
    for ( int k = 0; k < n * n * n; k += 3 )
        sum++;
}
So I am at this conclusion after a bit of confusion
for( 1, 2n, n)
for( 1/3( 1, 3n, 1)
I have it as 1/3 because it's going up by 3. I'm just not sure if I'm right, we were just introduced to this so I'm sorta lost.
I'm not completely sure that I understand what you are asking... Assuming that the question is what the Big-O notation for this nested loop would be (and assuming that the addition operation is the base operation)
The outer loop is executed 2n times
The inner loop is executed n^3/3 times for each iteration of the outer loop
That means the inner statement is executed 2n * n^3/3 = (2/3)*n^4 times. For Big O notation, we ignore the constant factor, so this nested loop is O(n^4).
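As a quick check, counting the increments of sum for n a multiple of 3 (so that n^3/3 is exact) matches (2/3)n^4 (a sketch; iterations is a hypothetical helper):

```cpp
#include <cassert>

// Count executions of sum++ for a given n.
long long iterations(int n) {
    long long count = 0;
    for (int j = 0; j < 2 * n; j++)
        for (long long k = 0; k < 1LL * n * n * n; k += 3)
            count++;  // one execution of sum++
    return count;
}
```

For n = 3 this gives 54 = (2/3)*3^4; for n = 6, it gives 864 = (2/3)*6^4.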
