Explain the answer of this Big O complexity code

for( int bound = 1; bound <= n; bound *= 2 ) {
    for( int i = 0; i < bound; i++ ) {
        for( int j = 0; j < n; j += 2 ) {
            ... // constant number of operations
        }
        for( int j = 1; j < n; j *= 2 ) {
            ... // constant number of operations
        }
    }
}
The correct answer is O(n^2). As I understand it, the first two for loops are "nested", since bound goes from 1 to n and i goes up to bound. The third and fourth loops are not nested in each other.
The third for loop has complexity O(n/2), and the fourth for loop has complexity O(log n).
Someone told me that since n^2 > n/2 + log n, the algorithm is O(n^2). (I'm really confused about this step.)
I thought you were supposed to multiply each loop, so shouldn't it be log n (outer loop) * n (2nd outer loop) * n (third) * log n (fourth)? How do I know which loops I need to add and which to multiply, and which loop's value should I take (do I take n over log n for the first two loops because n is smaller to process)?

The complexity is O(n^2).
First consider the inner-most loops:
for( int j = 0; j < n; j += 2 ) {
    ... // constant number of operations
}
Index j takes values 0, 2, 4, 6, 8, ... up to n, so j can take at most n / 2 + 1 possible values, hence the complexity of this loop is O(n).
And for the other inner-most loop:
for( int j = 1; j < n; j *= 2 ) {
    ... // constant number of operations
}
Index j takes values 1, 2, 4, 8, 16, ... up to n, so j can take at most log(n) + 1 possible values, hence the complexity of this loop is O(log(n)).
Therefore for any fixed bound and i, the total work done by the two inner-most loops is O(n) + O(log(n)) = O(n).
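If you want to sanity-check this empirically, a small throwaway counter along the following lines (the class and variable names are mine, purely illustrative, not part of the original code) prints how many times each inner loop body runs; the counts track n/2 and log2(n):

public class InnerLoopCounts {
    public static void main(String[] args) {
        for (int n : new int[] {16, 1024, 1 << 20}) {
            long stepCount = 0;     // iterations of the j += 2 loop
            long doublingCount = 0; // iterations of the j *= 2 loop
            for (int j = 0; j < n; j += 2) {
                stepCount++;        // runs about n / 2 times, i.e. O(n)
            }
            for (int j = 1; j < n; j *= 2) {
                doublingCount++;    // runs about log2(n) times, i.e. O(log n)
            }
            System.out.printf("n=%d: j+=2 loop ran %d times (~n/2), j*=2 loop ran %d times (~log2(n)=%.1f)%n",
                    n, stepCount, doublingCount, Math.log(n) / Math.log(2));
        }
    }
}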
Now consider the two outer-most loops. Note that if the variable bound for the outer-most loop is k, then the loop indexed by i will loop exactly k times.
for( int bound = 1; bound <= n; bound *= 2 ) {
    for( int i = 0; i < bound; i++ ) {
        // work done by the inner-most loops
    }
}
Since bound takes values 1, 2, 4, 8, ..., 2^⌊log(n)⌋, the body of the loop indexed by i runs
1 + 2 + 4 + 8 + ... + 2^⌊log(n)⌋
times in total. This is just a geometric series with common ratio 2, so the summation gives us:
2^(⌊log(n)⌋ + 1) - 1
= O(2^(log(n)))
= O(n)
Since the work done by the inner-most loops is independent of these two loops, the total complexity is given by O(n) * O(n) = O(n^2).
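As a quick empirical check of the O(n^2) bound (a throwaway sketch; the counter and class name are mine, not from the question), the following counts how often the two innermost bodies execute. Doubling n should roughly quadruple the count:

public class FullNestCount {
    public static void main(String[] args) {
        for (int n : new int[] {1_000, 2_000, 4_000}) {
            long count = 0; // total executions of the two innermost bodies
            for (int bound = 1; bound <= n; bound *= 2) {
                for (int i = 0; i < bound; i++) {
                    for (int j = 0; j < n; j += 2) {
                        count++; // the O(n) inner loop
                    }
                    for (int j = 1; j < n; j *= 2) {
                        count++; // the O(log n) inner loop
                    }
                }
            }
            // count / n^2 stays roughly constant, which is what O(n^2) predicts
            System.out.printf("n=%d: count=%d, count/n^2=%.2f%n", n, count, count / ((double) n * n));
        }
    }
}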

The first and second successive innermost loops have O(n) and O(log n) complexity, respectively. Thus, the overall complexity of the innermost part is O(n). The outermost and middle loops have complexity O(log n) and O(n), so a straightforward (and valid) upper bound for the overall complexity is O(n^2 * log(n)). The tight bound is O(n^2), because over all iterations of the outermost loop the middle loop runs only 1 + 2 + 4 + ... + n = O(n) times in total, not n * log(n) times.

Related

Time complexity of nested while with changing condition

I'm trying to solve the complexity of this loop
for (int i = 0; i < n; i++) {
    c = i;
    while (c > 1) {
        O(1);
        c = c / 2;
    }
}
Since the while condition changes on every iteration of the for loop, I don't know how to calculate the resulting series.
I mean, if the loop were
for (int i = 0; i < n; i++) {
    c = n;
    while (c > 1) {
        O(1);
        c = c / 2;
    }
}
I know the while loop has complexity O(log n) and it would repeat n times, so the complexity would be O(n log n).
The problem I have with the previous loop is "c = i". Since c = i, the first time (c = 0) the while loop runs 0 times, when c = 1 it runs 0 times again, when c = 2 it runs 1 time, and the series continues 0, 0, 1, 1, 2, 2, 2, 2, 3, ... (the number of while iterations on each pass of the for loop).
So the O(log n) body does not simply repeat n times; it repeats a number of times I can't work out, which is why I don't know how to solve it.
This needs a bit of math. For positive a and b:
log(a) + log(b) = log(ab)
Here the total number of while iterations is roughly
log(1) + log(2) + ... + log(n) = log(1 * 2 * ... * n) = log(n!)
There is a standard approximation for log(n!) (a simplified form of Stirling's approximation), namely
log(n!) ≈ n log(n) - n + 1
which shows that O(log(n!)) = O(n log(n)).
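A small numerical check of this (the class and counter names are mine, just for illustration): it counts the total number of while-loop iterations and compares them with log2((n-1)!) and with n*log2(n); all three grow at the same O(n log n) rate:

public class LogFactorialCheck {
    public static void main(String[] args) {
        for (int n : new int[] {1_000, 10_000, 100_000}) {
            long iterations = 0;        // total while-loop iterations
            double log2Factorial = 0.0; // log2((n-1)!) = sum of log2(i) for i < n
            for (int i = 0; i < n; i++) {
                int c = i;
                while (c > 1) {
                    iterations++;
                    c = c / 2;
                }
                if (i > 1) {
                    log2Factorial += Math.log(i) / Math.log(2);
                }
            }
            System.out.printf("n=%d: while iterations=%d, log2((n-1)!)=%.0f, n*log2(n)=%.0f%n",
                    n, iterations, log2Factorial, n * Math.log(n) / Math.log(2));
        }
    }
}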

Time complexity of nested for loop with inner iteration variable dependent on the outer one

This is the loop structure:
for (int i = 1 ; i < n ; i++) {
    for (int j = 0 ; j < n ; j += i) {
        // do stuff
    }
}
My guess was O(n log n), as it clearly cannot be O(n^2) since the increment in j keeps increasing, and it clearly cannot be O(n sqrt(n)) since the increment is not that high. But I have no idea how to prove it formally.
The time complexity of the inner loop depends on the value of i: it is n/i. Hence the total time is n + n/2 + n/3 + ... + n/n = n(1 + 1/2 + 1/3 + ... + 1/n).
As we know, 1 + 1/2 + 1/3 + ... + 1/n is the n-th harmonic number, which is asymptotically log(n). Hence the algorithm runs in O(n log(n)).
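A quick way to convince yourself of this (a sketch, with class and variable names of my choosing): count the inner-loop iterations and compare them with n*ln(n); the ratio settles near a constant:

public class HarmonicLoopCount {
    public static void main(String[] args) {
        for (int n : new int[] {1_000, 10_000, 100_000}) {
            long count = 0; // total inner-loop iterations
            for (int i = 1; i < n; i++) {
                for (int j = 0; j < n; j += i) {
                    count++; // runs roughly n / i times for each i
                }
            }
            System.out.printf("n=%d: count=%d, n*ln(n)=%.0f, ratio=%.2f%n",
                    n, count, n * Math.log(n), count / (n * Math.log(n)));
        }
    }
}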

Time complexity for two nested for loops

This question is for revision from a past test paper; I just need advice on whether I'm on the right track.
Work out the time complexity T(n) of the following piece of code in terms of number of operations for a given integer n:
for ( int k = n; k > 0; k /= 3 ) {
    for ( int i = 0; i < n; i += 2 ) {
        // constant number C of elementary operations
    }
    for ( int j = 2; j < n; j = (j*j) ) {
        // constant number C of elementary operations
    }
}
So I thought the outer loop would be O(log n), the first inner loop would be O(n), and the second inner loop would be O(log n). I just wanted to know if I have a rough idea and how to move forward from here.
There was a somewhat similar question a few days ago for which I provided a step-by-step description of the complexity analysis: https://stackoverflow.com/a/49093954/926701
The outer loop is indeed O(log3(n))
The first inner loop is indeed O(n)
The second inner loop is O(log2(log2(n)))
Informally, for the second inner loop, with j(k) the sequence of values taken by the index j of the for loop, we can write:
j(1) = 2, j(2) = j(1)^2 = 4, j(3) = j(2)^2 = 16, ..., until j(k) >= n
=> j(k) = j(k-1)^2 = j(k-2)^4 = ... = j(1)^(2^(k-1)) = 2^(2^(k-1))
=> k = log2(log2(n)) + 1 = O(log2(log2(n)))
Since the number of operations in the inner loops is independent from that of the outer loop, we can multiply the complexity:
T(n) = O(log3(n) * (n + log2(log2(n))))
= O(n * log3(n))
because log2(log2(n)) << n as n -> +Inf.
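If you want to verify this empirically, here is a small counting sketch under the same assumptions (the class name and counters are mine); the ratio count / (n * log3(n)) stays roughly constant:

public class TestPaperLoopCount {
    public static void main(String[] args) {
        for (int n : new int[] {1_000, 10_000, 100_000}) {
            long count = 0; // number of times a constant-work body executes
            for (int k = n; k > 0; k /= 3) {
                for (int i = 0; i < n; i += 2) {
                    count++; // the O(n) inner loop
                }
                for (long j = 2; j < n; j = j * j) { // long, so squaring j cannot overflow
                    count++; // the O(log2(log2(n))) inner loop
                }
            }
            double log3n = Math.log(n) / Math.log(3);
            System.out.printf("n=%d: count=%d, n*log3(n)=%.0f, ratio=%.2f%n",
                    n, count, n * log3n, count / (n * log3n));
        }
    }
}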

What is the Big-O of a nested loop, where the outer loop is n^2

for (int i = 1; i < n * n; i++)
{
    for (int j = 1; j < i; j++)
    {
        s = s;
    }
}
Since the Big O of the outer loop is O(n^2), would it still be multiplied by the inner loop, making the total Big O n * n^2 -> O(n^3)?
In the outer loop, i can take values from 1 to n^2. Then for each of those values, the inner loop goes from 1 to i. The number of operations performed for i = 1 is 1, for i = 2 is 2, ..., for i = n^2 is n^2.
So the total number of operations is the sum of i for i from 1 to n^2. This is a well-known series whose closed form is (n^2)(n^2 + 1)/2, and that is O(n^4).
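To see the O(n^4) growth concretely, here is a brief counting sketch (mine, not from the question; the inner body is replaced by a counter). The count stays close to n^4 / 2:

public class QuarticCount {
    public static void main(String[] args) {
        for (int n : new int[] {10, 20, 40}) {
            long count = 0; // number of inner-loop iterations
            for (int i = 1; i < n * n; i++) {
                for (int j = 1; j < i; j++) {
                    count++;
                }
            }
            double half = Math.pow(n, 4) / 2.0;
            System.out.printf("n=%d: count=%d, n^4/2=%.0f, ratio=%.3f%n", n, count, half, count / half);
        }
    }
}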

Asymptotic analysis of three interdependent nested for loops

The code fragment I am to analyse is below:
int sum = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < i * i; j++)
        for (int k = 0; k < j; k++)
            sum++;
I know that the first loop is O(n) but that's about as far as I've gotten. I think that the second loop may be O(n^2) but the more I think about it the less sense it makes. Any guidance would be much appreciated.
The first loop executes n times. Each time, the value i grows. For each such i, the second loop executes i*i times. That means the second loop executes 1*1 + 2*2 + 3*3 + ... + n*n times.
This is a summation of squares, and the formula for this is well-known. Hence we have the second loop executing (n(1 + n)(1 + 2 n))/6 times.
Thus, we know that in total there will be (n(1 + n)(1 + 2 n))/6 values of j, and that for each such value of j the third loop will execute j times. Actually calculating exactly how many times the third loop executes in total would be tedious. Luckily, all you really need is an upper bound on the order of the function.
You know that every value of j is less than n^2, so for each of the (n(1 + n)(1 + 2 n))/6 values of j the third loop will execute fewer than n^2 times. Therefore the operation sum++ will execute fewer than [(n(1 + n)(1 + 2 n))/6] * n^2 times. After some quick mental math, that amounts to a polynomial of maximal degree 5, therefore your program is O(n^5).
int sum = 0;
for (int i = 0; i < n; i++)          // Let's call this N
    for (int j = 0; j < i * i; j++)  // Let's call this M
        for (int k = 0; k < j; k++)  // Let's call this K
            sum++;
N is the number of steps of the entire program, M is the number of steps the two inner loops do and lastly K is the number of steps the last loop does.
It is easy to see that K = j: the innermost loop takes j steps.
Then M = Sum(j=0,i^2,K) = Sum(j=0, i^2, j)
(First param is the iterator, second is the upper bound and last param is what we are adding)
This is now a sum of the integers from 0 up to i^2, which means we can apply the formula ((n+1)*n)/2:
M = Sum(j=0,i^2,j) = ((i^2+1)*(i^2))/2 = (i^4 + i^2)/2
N = Sum(i=0, n, M) = 1/2 * ( Sum(i=0, n, (i^4)) + Sum(i=0, n, (i^2)) )
These are both well known formulas and after a little playing you get:
N = (n^5)/10 + (n^4)/4 + (n^3)/3 + (n^2)/4 + n/15
This is essentially the exact number of steps the loops take (up to small off-by-one corrections in the loop bounds), but if you are interested in the O notation you can note that n^5 is the fastest-growing term, so the solution is O(n^5).
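To see the degree-5 growth concretely, here is a small sketch (mine, not from the question) that counts sum++ and divides by n^5/10, the leading term of the polynomial above. The ratio drifts toward 1 as n grows, slowly, because the lower-order terms still matter at small n:

public class QuinticCount {
    public static void main(String[] args) {
        for (int n : new int[] {20, 40, 60}) {
            long sum = 0; // number of times the innermost statement runs
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < i * i; j++) {
                    for (int k = 0; k < j; k++) {
                        sum++;
                    }
                }
            }
            double leadingTerm = Math.pow(n, 5) / 10.0;
            System.out.printf("n=%d: sum=%d, n^5/10=%.0f, ratio=%.3f%n", n, sum, leadingTerm, sum / leadingTerm);
        }
    }
}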
If you proceed methodically using sigma notation, evaluating Sum(i=0, n-1, Sum(j=0, i^2 - 1, Sum(k=0, j-1, 1))), you end up with a polynomial of degree 5 as well, i.e. O(n^5).
Try to count how many times each loop body is executed:
The middle loop runs
0*0 times when i == 0
1*1 times when i == 1
2*2 times when i == 2
...
(n-1)*(n-1) ≈ n^2 times when i == n - 1.
So the middle loop alone accounts for O(n^2) iterations.
