What is the asymptotic complexity of the following code - complexity-theory

I have some doubts about the complexity of the following code. The outer loop is going to execute O(n) times, but I'm not sure whether the inner loop executes O(1) or O(n) times.
for (int i = 0; i < n; i++)
{
    for (int j = i; j < i; j += i)
    {
        print("*");
    }
}

As someone pointed out, as the code stands the inner loop will execute 0 times, so I'm assuming you meant something like this (with i starting at 1, since j += i would never advance when i is 0):
for (int i = 1; i <= n; i++)
{
    for (int j = 0; j < n; j += i)
    {
        print("*");
    }
}
In that case, the number of executions of the inner loop will be roughly N, N/2, N/3, N/4, ... on the successive iterations of the outer loop, so the total time is N + N/2 + N/3 + ... = N * (1 + 1/2 + 1/3 + ... + 1/N). To bound the harmonic sum, round each term 1/k up to the nearest power of 1/2: (1 + 1/2 + 1/3 + 1/4 + 1/5 + ...) <= (1 + 1/2 + 1/2 + 1/4 + 1/4 + 1/4 + 1/4 + 1/8 + ...). In the second expression each run of equal terms sums to 1 (the two 1/2 terms make a 1, just like the four 1/4 terms), and there are about log2(N) such runs before we reach N terms, so the harmonic sum is O(logN). The total complexity is therefore O(N*logN).
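If you want to sanity-check the O(N*logN) bound empirically, here is a minimal Java sketch (assuming the corrected loop above, with print replaced by a counter) that counts the inner-loop iterations and compares them with n*ln(n):

public class HarmonicCount {
    public static void main(String[] args) {
        for (int n : new int[] {1000, 10000, 100000}) {
            long count = 0;
            for (int i = 1; i <= n; i++) {
                for (int j = 0; j < n; j += i) {
                    count++;              // stands in for print("*")
                }
            }
            // count grows like n * H_n, i.e. roughly n * ln(n)
            System.out.printf("n=%d  count=%d  n*ln(n)=%.0f%n", n, count, n * Math.log(n));
        }
    }
}

Doubling n slightly more than doubles the count, which is the signature of n log n growth.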

The inner loop is O(i), which increases every time, so the overall complexity would be:
Number of times the inner loop body is executed: 1 + 2 + 3 + ... + n = n(n+1)/2.
Therefore O(n^2)

Because j is initialized to i in the inner loop and the loop condition requires j to be less than i, that loop never executes. Therefore the complexity is O(n): all this code does is increment i in the outer loop and, on each iteration, evaluate the inner loop's condition once (i < i), which is immediately false.

Related

Time complexity of the inner loop

Can someone help me with calculating the time complexity of the inner loop? As far as I understand, the outer one will be O(n). But I have no idea how to calculate what happens inside the second one.
for (int i = 2; i < n; i++) {
    for (int j = 2; i * j < n; j++) {
    }
}
For every iteration of the outer loop, the inner loop runs roughly n/i times.
So the total work is given by:
n/2 + n/3 + n/4 + ...
= n * (1/2 + 1/3 + 1/4 + ...)
The parenthesized sum is bounded above by ln(n).
Hence, the complexity of this code is O(n log n).
The inner loop runs j from 2 up to (but not including) about n/i, so its iteration count is roughly n/i - 2.
Summing over the n - 2 iterations of the outer loop, we get:
(n/2 - 2) + (n/3 - 2) + ... + (3 - 2)
I have a hunch, but can't remember 100%, that this series sums to about n * log_e(n) or similar. So in terms of time complexity, this becomes O(n log n).
The loop exits as soon as i * j ≥ n, i.e. when j reaches ceiling(n / i) ~ n / i. As it starts from j = 2, the number of iterations is about ceiling(n / i) - 2.
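To check the O(n log n) result empirically, here is a small sketch (my own illustration, not from the original question) that counts the inner-loop iterations and compares them with n*ln(n):

public class InnerLoopCount {
    public static void main(String[] args) {
        for (int n : new int[] {1000, 10000, 100000}) {
            long count = 0;
            for (int i = 2; i < n; i++) {
                for (int j = 2; i * j < n; j++) {
                    count++;              // one unit of work per inner iteration
                }
            }
            // count is roughly n * (1/2 + 1/3 + ...) ~ n * ln(n), minus lower-order terms
            System.out.printf("n=%d  count=%d  n*ln(n)=%.0f%n", n, count, n * Math.log(n));
        }
    }
}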

Could someone explain the time complexity for these code snippets?

I am practicing time complexity and some of these are a bit too complicated for me.
I would really appreciate it if someone could explain them to me.
A) The time complexity is O(n). How is that?
for (int i = N; i > 0; i = i/2) {
    for (int j = i+i; j > 0; j--) {
        doSomething(i, j);
    }
}
B) The time complexity is O(n logn). How is that?
for (int i = N+N; i > 0; i--) {
    for (int j = N; j > 0; j = j/2) {
        doSomething(i, j);
    }
}
I suppose we must assume that the execution of doSomething takes constant time, independent of the values it gets as arguments.
Algorithm A:
On the first iteration of the outer loop, the inner loop iterates 2N times. On every subsequent iteration of the outer loop, the number of inner iterations is halved. So we get this series:
      2N + N + N/2 + N/4 + N/8 + ... + 2
This series is finite, but because it follows the pattern of 1/2 + 1/4 + 1/8 + 1/16 + ..., its total is less than 4N, and so the algorithm is O(N).
Algorithm B:
Here the number of iterations of the inner loop does not depend on the value of i, so it is always the same: each time it performs about log2(N) iterations (since j is halved on each iteration). As the outer loop iterates 2N times, doSomething is called about 2N*log2(N) times, which is O(N log N).
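As a rough empirical check of this analysis (just a sketch; doSomething is replaced by a counter, which assumes it runs in constant time), you can instrument both loops and compare the call counts with the bounds 4N and 2N*(log2(N)+1):

public class CountCalls {
    public static void main(String[] args) {
        int N = 1 << 20;                       // a power of two keeps the arithmetic tidy

        long callsA = 0;
        for (int i = N; i > 0; i = i / 2) {
            for (int j = i + i; j > 0; j--) {
                callsA++;                      // stands in for doSomething(i, j)
            }
        }

        long callsB = 0;
        for (int i = N + N; i > 0; i--) {
            for (int j = N; j > 0; j = j / 2) {
                callsB++;                      // stands in for doSomething(i, j)
            }
        }

        double log2N = Math.log(N) / Math.log(2);
        System.out.println("A: " + callsA + "  (bound 4N = " + (4L * N) + ")");
        System.out.println("B: " + callsB + "  (2N*(log2(N)+1) = " + (long) (2L * N * (log2N + 1)) + ")");
    }
}

For algorithm A the count comes out just under 4N; for algorithm B it is exactly 2N*(log2(N)+1) when N is a power of two.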
problem A
Here the outer loop will execute log2(n)+1 times, and for each value of i the inner loop will execute i+i = 2i times. The values of i are n, n/2, n/4, ..., so the inner-loop counts are:
2n + n + n/2 + n/4 + n/8 + .......
The summation of this series will be the answer.
As we know,
a + ar + ar^2 + ar^3 + ar^4 + ... + ar^m = a(1 - r^(m+1))/(1 - r)
Here a = 2n, r = 1/2 and m = log2(n), so
2n + n + n/2 + n/4 + ... + 2n/(2^m) = 4n - 2n/2^m = 4n - 2
so the complexity is O(4n - 2) = O(n)
problem B
Here the outer loop will execute 2n times (i runs from n+n down to 1). For every outer iteration, the inner loop will execute log2(n)+1 times:
for (int j = n; j > 0; j = j/2)
For example, with n = 10, the values of j will be 10, 5, 2, 1 (and then 0, which ends the loop), so it executes 4 times, i.e. floor(log2(10))+1 times.
So for every outer iteration the inner loop executes log2(n)+1 times, and the complexity is
O(2n*(log2(n)+1)) = O(n log n)

Find time complexity of a given nested loop

What is the time complexity for this loop below?
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;
I figured out that the inner loop runs N + N/2 + N/4 + N/8 + ... + 1 times, but now I don't know how to proceed. How do I get the tilde or big-O from this loop?
Thanks in advance.
Big-O for this loop is O(N), since the dependence of the total number of iterations on N is linear.
About 1/2 + 1/4 + 1/8 + ... https://en.wikipedia.org/wiki/1/2_%2B_1/4_%2B_1/8_%2B_1/16_%2B_%E2%8B%AF
About Big-O https://www.khanacademy.org/computing/computer-science/algorithms/asymptotic-notation/a/big-o-notation
N + N/2 + N/4 + N/8 + ... + 1 forms a GP (geometric progression) which sums to roughly 2N. When defining time complexity in terms of big O, we discard the constant factor, so the time complexity of your problem is O(N).
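If it helps to see the "sums to roughly 2N" step concretely, here is a minimal sketch (an illustration, not part of the original question) that adds up the per-iteration counts of the inner loop:

public class GeometricSum {
    public static void main(String[] args) {
        int N = 1000000;
        long total = 0;
        for (int n = N; n > 0; n /= 2) {
            total += n;                 // the inner loop contributes n iterations at this step
        }
        // total is N + N/2 + N/4 + ... + 1, which stays below 2N
        System.out.println("total = " + total + ", 2N = " + (2L * N));
    }
}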

Time complexity of the following code, with explanation?

What is the time complexity of this algorithm, and why?
int count = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        count += 1;
    }
}
The correct answer is O(n), but I am getting O(nlogn). Can anyone tell me why it's O(n)?
On the first iteration, the inner loop executes N times, then N/2 times, then N/4 times, and so on. The total can be bounded by the infinite sum:
N + N/2 + N/4 + N/8 + N/16 + N/32 + ...
If you factor out the N from each term, you get:
N * (1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 + ...)
The infinite series in parentheses converges to the value 2 (more info on Wikipedia). Therefore, the number of operations can be bounded by:
N * 2
In terms of Big-O notation, the asymptotic complexity is:
O(N)
You can check this by observing that the relationship in the output between N and count is linear: Ideone Demo
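A similar check can be sketched as follows (my own version, not the linked demo): if the relationship is linear, the ratio count/N should settle near a constant (about 2) as N grows.

public class LinearCheck {
    public static void main(String[] args) {
        for (int N : new int[] {1000, 10000, 100000, 1000000}) {
            long count = 0;
            for (int i = N; i > 0; i /= 2) {
                for (int j = 0; j < i; j++) {
                    count += 1;
                }
            }
            // count/N approaching a constant confirms the linear relationship
            System.out.printf("N=%d  count=%d  count/N=%.3f%n", N, count, (double) count / N);
        }
    }
}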

Order of Growth in a for loop

How can I analyze this code fragment to conclude that it is O(N)?
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;
The value of i in the outer loop increases exponentially; you can think of it as gaining a binary digit each time. The number of digits it takes to represent N is log(N), so the outer loop will execute log(N) times. Across those iterations, the inner loop will execute
2^0 + 2^1 + 2^2 + ... + 2^log(N)
The formula for this geometric series is (updated from Niklas B's comment)
1 * (1 - 2^(log(N) + 1)) / (1 - 2)
= 2^(log(N) + 1) - 1
~= 2N
Overall the algorithm is O(2N + log(N)),
but in big-O notation the 2N component dominates log(N), so the overall complexity is O(N).
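To see the 2N bound concretely, here is a short sketch (an illustration, assuming only sum++ counts as work) that measures the total number of inner-loop iterations and compares it with 2N:

public class DoublingLoop {
    public static void main(String[] args) {
        for (int N : new int[] {100, 10000, 1000000}) {
            long sum = 0;
            for (int i = 1; i < N; i *= 2) {
                for (int j = 0; j < i; j++) {
                    sum++;
                }
            }
            // sum = 1 + 2 + 4 + ... = 2^(floor(log2(N-1)) + 1) - 1, which is always below 2N
            System.out.println("N=" + N + "  sum=" + sum + "  2N=" + (2L * N));
        }
    }
}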
