Order of Growth in a for loop - algorithm

How can I analyze this code fragment to conclude that it is O(N)?
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;

The value of i in the outer loop increases exponentially; you can think of it as gaining a binary digit each time. The number of digits it takes to represent N is log(N), so the outer loop will execute log(N) times. The inner loop will execute a total of
2^0 + 2^1 + 2^2 + ... + 2^log(N)
The formula for this geometric series is (updated from Niklas B's comment)
1 · (1 - 2^(log(N) + 1)) / (1 - 2)
= 2^(log(N) + 1) - 1
~= 2N
Overall, the algorithm performs about 2N + log(N) operations,
but in big-O notation the 2N component overwhelms log(N), so the overall complexity is O(N).
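A quick way to sanity-check the linear bound is to count the inner-loop iterations directly (a small Python simulation; total_increments is just an illustrative name):

```python
def total_increments(N):
    """Count how many times sum++ executes for the doubling outer loop."""
    count = 0
    i = 1
    while i < N:
        count += i  # the inner j loop runs exactly i times
        i *= 2
    return count

print(total_increments(1 << 20))  # always below 2N: linear growth
```

For any N, the total stays below 2N, matching the geometric-series argument above.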

Related

Time complexity of the inner loop

Can someone help me with calculating the time complexity of the inner loop? As far as I understand, the outer one will be O(n). But I have no idea how to calculate what happens inside the second one.
for (int i = 2; i < n; i++) {
    for (int j = 2; i * j < n; j++) {
    }
}
For every iteration of the outer loop, the inner loop runs about n/i times.
So the total complexity is given by:
n/2 + n/3 + n/4 + ...
= n * (1/2 + 1/3 + 1/4 ...)
The sum in parentheses is a prefix of the harmonic series, whose upper bound is ln(n).
Hence, complexity of this code is O(n log n).
The inner loop runs j from 2 up to (but not including) roughly n/i, so it executes about n/i - 2 times.
If we run the inner loop n - 2 times (since that's the number of times the outer loop runs), we get the following summation:
(n/2 - 2) + (n/3 - 2) + ... + (3 - 2)
This series sums to roughly n · ln(n) (the harmonic series scaled by n), so in terms of time complexity this becomes O(n log n).
The loop exits as soon as i * j ≥ n, i.e. when j reaches ceiling(n / i) ~ n / i. As it starts from j = 2, the number of iterations is about ceiling(n / i) - 2.
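The n log n estimate can be checked empirically (a Python sketch; inner_iterations is an illustrative name):

```python
import math

def inner_iterations(n):
    """Total inner-loop iterations across the whole i*j < n loop nest."""
    count = 0
    for i in range(2, n):
        j = 2
        while i * j < n:
            count += 1
            j += 1
    return count

n = 10000
print(inner_iterations(n), round(n * math.log(n)))  # same order of magnitude
```

The measured count stays within a constant factor of n · ln(n), consistent with the harmonic-series bound.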

Time complexity of three nested loops?

What is the time complexity for this piece of code?
for (int i = n; i > 0; i--) {
    for (int j = 1; j < n; j *= 2) {
        for (int k = 0; k < j; k++) {
            ... // constant number C of operations
        }
    }
}
I find that the two inner loops create a time complexity of O(n log n) (?). Together with the outer loop this results in a time complexity of O(n^2 log n) for the whole piece of code (?).
According to the answer, the result should be O(n^2) and not O(n^2 * logn).
Can someone help me understand why?
You're right that the outer for loop runs O(n) times. Let's look at the inner two.
Looking at j, it takes the values 1, 2, 4, 8, 16, ..., n. This means the inner k loop will run
1 + 2 + 4 + ... + n
times, or written another way
2^0 + 2^1 + 2^2 + ... + 2^log(n)
times. You can look up this summation and find that it is O(2^(log(n) + 1)) = O(n).
So if we multiply the inner pair by the outer loop, we get O(n^2).
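To see the O(n^2) bound concretely, the operation count can be simulated (a Python sketch; triple_loop_ops is an illustrative name):

```python
def triple_loop_ops(n):
    """Count the innermost-loop executions for the three nested loops."""
    ops = 0
    for i in range(n, 0, -1):   # outer loop: n iterations
        j = 1
        while j < n:            # j = 1, 2, 4, ... doubling
            ops += j            # the k loop runs exactly j times
            j *= 2
    return ops

print(triple_loop_ops(1024))  # 1024 * 1023, i.e. about n^2
```

For n a power of two, the inner pair contributes n - 1 operations per outer iteration, giving n(n - 1) in total.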

Time complexity of nested for loop with inner iteration variable dependent on the outer one

This is the loop structure :
for (int i = 1; i < n; i++) {
    for (int j = 0; j < n; j += i) {
        // do stuff
    }
}
My guess was O(n log n): it clearly cannot be O(n^2), since the increment in j keeps growing, and it does not look like O(n sqrt(n)) either, since the increments are not that large. But I have no idea how to prove it formally.
For each value of i, the inner loop runs about n/i times. Hence, the total time is n + n/2 + n/3 + ... + n/n = n(1 + 1/2 + 1/3 + ... + 1/n).
As we know, 1 + 1/2 + 1/3 + ... + 1/n is the harmonic series, which asymptotically is log(n). Hence, the algorithm runs in O(n log(n)).
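The harmonic-series argument can be verified by counting iterations directly (a Python sketch; harmonic_loop_ops is an illustrative name):

```python
import math

def harmonic_loop_ops(n):
    """Count iterations of the j += i inner loop over all values of i."""
    ops = 0
    for i in range(1, n):
        j = 0
        while j < n:
            ops += 1
            j += i
    return ops

n = 1000
print(harmonic_loop_ops(n), round(n * math.log(n)))  # roughly n * ln(n)
```

The measured total tracks n · ln(n) closely, as the harmonic sum predicts.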

Time complexity of following code with explanation?

What is the time complexity of this algorithm, and why?
int count = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        count += 1;
    }
}
The correct answer is O(n), but I am getting O(nlogn). Can anyone tell me why it's O(n)?
On the first iteration, the inner loop executes N times, then N/2 times, then N/4 times, and so on. The total is bounded above by the infinite sum:
N + N/2 + N/4 + N/8 + N/16 + N/32 + ...
If you factor out the N from each term, you get:
N * (1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 + ...)
The infinite series in parentheses converges to 2 (more info on Wikipedia). Therefore, the number of operations can be simplified to:
N * 2
In terms of Big-O, the asymptotic value is within:
O(N)
You can check this by observing that the relationship in the output between N and count is linear: Ideone Demo
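The linear relationship can also be checked locally (a Python sketch; count_ops mirrors the loops above):

```python
def count_ops(N):
    """Final value of count for the halving outer loop."""
    count = 0
    i = N
    while i > 0:
        count += i  # the inner j loop runs exactly i times
        i //= 2
    return count

print(count_ops(1 << 20))  # just under 2N
```

Doubling N roughly doubles count, which is exactly the linear behavior the answer describes.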

Why is code that does N/2 steps considered O(N)?

Consider a nested loop: the outer loop starts at i = 0 and iterates N times, and the inner loop starts at j = i + 1 and runs up to j = N, so on average the inner loop does about N/2 steps. At the end, however, the runtime is considered O(N^2). Why is the inner loop counted as O(N) and not O(N/2), given that other code has O(log n) runtimes?
It seems that you're mixing two different cases: division in the final formula (N**2/C, where the constant C can be dropped: O(N**2/C) == O(N**2)), and division in the loop (for (int j = N; j >= 1; j /= C)), which leads to a logarithm:
for (int i = 1; i <= N; ++i)
    for (int j = i + 1; j <= N; ++j)
        SomeOperation(i, j);
Let's count the number of SomeOperation(i, j) to be performed:
i j
-------------------
1 N - 1
2 N - 2
3 N - 3
..
N 0
So we have
(N - 1) + (N - 2) + ... + 2 + 1 + 0 ==
N * (N - 1) / 2 ==
N**2 / 2 - N / 2 ==
O(N**2 / 2 - N / 2) == O(N**2 / 2) == O(N**2)
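This quadratic count is easy to verify by brute force (a Python sketch; pair_count is an illustrative name):

```python
def pair_count(N):
    """Number of SomeOperation(i, j) calls in the ++j version."""
    count = 0
    for i in range(1, N + 1):
        for j in range(i + 1, N + 1):
            count += 1
    return count

print(pair_count(100))  # 100 * 99 / 2 = 4950
```

The result matches the closed form N(N - 1)/2 exactly.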
By contrast (notice j /= 2 instead of ++j), the following performs far fewer inner iterations:
for (int i = 1; i <= N; ++i)
    for (int j = N; j >= 1; j /= 2)
        SomeOperation(i, j);
i j
-------------------
1 log(N)
2 log(N)
3 log(N)
..
N log(N)
And here we have
log(N) + log(N) + ... + log(N) ==
N * log(N) ==
O(N * log(N))
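The N · log(N) count for the halving version can be simulated the same way (halving_count is an illustrative name):

```python
def halving_count(N):
    """Number of SomeOperation(i, j) calls in the j /= 2 version."""
    count = 0
    for i in range(1, N + 1):
        j = N
        while j >= 1:
            count += 1
            j //= 2
    return count

print(halving_count(1024))  # 1024 * 11, since floor(log2(1024)) + 1 = 11
```

Each outer iteration contributes exactly floor(log2(N)) + 1 inner iterations, giving the N log N total.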
Big-O notation expresses how long a segment of code takes to execute, in proportion to some metric; the symbols inside the parentheses usually represent quantities like input size or container size.
Intuitively, O(N) describes how often the code runs in proportion to that quantity, as opposed to the exact number of times it runs. It may run K = N/2 times in reality, but the point Big-O notation underscores is that K is determined by how large N is, and is directly proportional to N.
To further clarify, notice that for a large enough N, the division by 2 does not really matter, as it is simply a constant factor. The idea that constants are negligible for large enough N is critical to getting a good grasp of Big-O and other complexity notations.
