Correct time Complexity [duplicate] - for-loop

I have come across this question, which asks for the time complexity of the following code:
int count = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        count += 1;
    }
}
It says that its time complexity is O(n); shouldn't it be O(n log n), since the first loop is log n and the second is n?

It says that its time complexity is O(n); shouldn't it be O(n log n), as the first loop is log n and the second is n?
The bound of the inner loop depends on the outer loop's variable, so your claim is not valid.
Also, += (the addition assignment operator) is an O(1) operation.
For the first iteration of the outer loop, the inner loop executes N times.
For the second iteration of the outer loop, the inner loop executes N/2 times.
And so on...
Therefore, the total number of execution steps is
= N + N/2 + N/4 + ... + 1   // a geometric progression with about log2 N terms
<= N / (1 - 1/2)            // bounding the finite sum by the infinite GP sum
= 2N.
Therefore, the time complexity of the code comes out to be O(N).
So, the answer given is correct.
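To make the 2N bound concrete, here is a small sketch (my own, not part of the original answer) that counts the inner-loop iterations for a few values of N and prints them next to 2N:

#include <stdio.h>

/* Counts how many times "count += 1" executes for a given N. */
static long steps(long N) {
    long count = 0;
    for (long i = N; i > 0; i /= 2)
        for (long j = 0; j < i; j++)
            count += 1;
    return count;
}

int main(void) {
    long tests[] = {1, 10, 1000, 1000000};
    for (int t = 0; t < 4; t++) {
        long N = tests[t];
        printf("N = %8ld   steps = %8ld   2N = %8ld\n", N, steps(N), 2 * N);
    }
    return 0;
}

For every N the printed step count stays below 2N, matching the geometric-progression bound above.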

Related

Why does this code have a time complexity of O(N*N)? [duplicate]

int a = 0;
for (i = 0; i < N; i++) {
    for (j = N; j > i; j--) {
        a = a + i + j;
    }
}
Why does this code have a time complexity of O(N*N)? Do all nested loops with bounds n, m, q, ... have a complexity of O(n*m*q*...), even though some iterations of the inner loops will not happen?
The reason for this is that constant factors are ignored in Big-O notation.
Your outer loop runs N times, while the inner loop runs on average N/2 times for each of the outer iterations.
This gives O(N * 1/2 * N) executions of the statements within the inner loop. Since we ignore constant factors, we end up with O(N * N) which is O(N^2).
The reason for omitting constants is simple: Big-O notation is about what happens when N is big. If you look at it this way - O((N^2)/2) - you see that increasing N has much more influence on the term than whether or not we omit the division by two.
I wouldn't say all nested loops have that complexity, but this one certainly does. The number of iterations through the inner loop is something like this (there might be an off-by-one error):
i = 0: N times
i = 1: N - 1 times
i = 2: N - 2 times
. . .
i = N-1: 1 time
Let's count the total number of times the innermost code is called. In this case, you can figure this out arithmetically. The innermost code is executed about N * (N + 1) / 2 times (this is the sum of the integers from 1 to N). This simplifies to 0.5 * N^2 + 0.5 * N, which has a complexity of O(N^2).
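A quick way to verify the N * (N + 1) / 2 count (a sketch of mine, not from the answer) is to add a counter to the loop nest and compare:

#include <stdio.h>

int main(void) {
    long N = 1000;
    long a = 0, iterations = 0;

    /* The loop nest from the question, with an iteration counter added. */
    for (long i = 0; i < N; i++) {
        for (long j = N; j > i; j--) {
            a = a + i + j;
            iterations++;
        }
    }

    printf("a = %ld, iterations = %ld, N*(N+1)/2 = %ld\n",
           a, iterations, N * (N + 1) / 2);
    return 0;
}

For N = 1000 this prints 500500 iterations, exactly N*(N+1)/2.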

How can I find the time complexity of the following code?

for (i = 0; i < n; i++)        // time complexity n+1
{
    k = 1;                     // time complexity n
    while (k <= n)             // time complexity n*(n+1)
    {
        for (j = 0; j < k; j++)                                  // time complexity ??
            printf("the sum of %d and %d is: %d\n", j, k, j+k);  // time complexity ??
        k++;
    }
}
What is the time complexity of the above code? I am stuck on the second for loop and I don't know how to find its time complexity, because j is less than k and not less than n.
I always have problems related to time complexity; do you have any good articles on it,
especially about step counts and loops?
From the question :
because j is less than k and not less than n.
This is just plain wrong, and I guess that's the assumption that got you stuck. We know what values k can take. In your code, it ranges from 1 to n (inclusive). Thus, if j is less than k, it is also less than n.
From the comments :
I know the only input is n, but the second for loop depends on k and not on n.
If a variable depends on anything, it's on the input. j depends on k that itself depends on n, which means j depends on n.
However, this is not enough to deduce the complexity. In the end, what you need to know is how many times printf is called.
The outer for loop is executed n times no matter what. We can factor this out.
The number of executions of the inner for loop depends on k, which is modified within the while loop. We know k takes every value from 1 to n exactly once. That means the inner for loop will first be executed once, then twice, then three times and so on, up until n times.
Thus, discarding the outer for loop, printf is called 1+2+3+...+n times. That sum is very well known and easy to calculate : 1+2+3+...+n = n*(n+1)/2 = (n^2 + n)/2.
Finally, the total number of calls to printf is n * (n^2 + n)/2 = n^3/2 + n^2/2 = O(n^3). That's your time complexity.
A final note about this kind of code. Once you have seen the same patterns a few times, you quickly start to recognize the kind of complexity involved. Then, when you see this kind of nested loops with dependent bounds, you immediately know that each loop contributes a linear factor.
For instance, in the following, f is called n*(n+1)*(n+2)/6 = O(n^3) times.
for (i = 1; i <= n; ++i) {
    for (j = 1; j <= i; ++j) {
        for (k = 1; k <= j; ++k) {
            f();
        }
    }
}
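As a sanity check (again a sketch, not from the original answer), you can count the calls and compare them with n*(n+1)*(n+2)/6:

#include <stdio.h>

int main(void) {
    long n = 100;
    long calls = 0;

    /* The triple loop above, with f() replaced by a counter. */
    for (long i = 1; i <= n; ++i)
        for (long j = 1; j <= i; ++j)
            for (long k = 1; k <= j; ++k)
                calls++;

    printf("calls = %ld, n*(n+1)*(n+2)/6 = %ld\n",
           calls, n * (n + 1) * (n + 2) / 6);
    return 0;
}

For n = 100 both values are 171700.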
First, simplify the code to show the main loops. So, we have a structure of:
for (int i = 0; i < n; i++) {
    for (int k = 1; k <= n; k++) {
        for (int j = 0; j < k; j++) {
        }
    }
}
The outer-loops run n * n times but there's not much you can do with this information because the complexity of the inner-loop changes based on which iteration of the outer-loop you're on, so it's not as simple as calculating the number of times the outer loops run and multiplying by some other value.
Instead, I would find it easier to start with the inner-loop, and then add the outer-loops from the inner-most to outer-most.
The complexity of the inner-most loop is k.
With the middle loop, it's the sum of k (the complexity above) where k = 1 to n. So 1 + 2 + ... + n = (n^2 + n) / 2.
With the outer loop, it's done n times so another multiplication by n. So n * (n^2 + n) / 2.
After simplifying, we get a total of O(n^3).
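To check this empirically (a sketch of mine, not part of the answer), replace the printf with a counter and compare the count with (n^3 + n^2)/2:

#include <stdio.h>

int main(void) {
    long n = 200;
    long body = 0;

    /* Same loop structure as the question, with the printf replaced by a counter. */
    for (long i = 0; i < n; i++) {
        long k = 1;
        while (k <= n) {
            for (long j = 0; j < k; j++)
                body++;
            k++;
        }
    }

    printf("body = %ld, (n^3 + n^2)/2 = %ld\n",
           body, (n * n * n + n * n) / 2);
    return 0;
}

For n = 200 both values are 4020000, confirming the n * (n^2 + n)/2 count derived above.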
The time complexity of the above code is roughly n x n x n = n^3 for the three loops, plus a couple of constant-time statements, i.e. n^3 + 2. Since n^3 is the fastest-growing term, the constants can be ignored, so the time complexity is O(n^3).
Note: treat each loop as contributing a factor of n; to obtain the total time, multiply the factors of all the loops.
Hope this helps!

Time complexity analysis inconsistency

I have this code :
int fun(int n)
{
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}
The time complexity of this code can be thought of as O(n) because O(n+n/2+n/4+...) = O(n)
By that logic, the time complexity of this snippet can also be argued to be O(n) :
for(i = 1; i < n; i *= 2)
//O(1) statements
Since O(1 + 2 + 4 + ... + n/4 + n/2) = O(n). But since the loop runs log(n) times, it could also be argued to be log(n).
Why is the former one not log(n) iterations of the outer loop times log(n) of the inner loop, i.e. log(n)*log(n)?
What am I doing wrong ?
The first snippet's outer loop executes O(log n) times, and on each iteration the inner loop executes O(i) times. If you sum any number of terms of the form n / 2^k, you'll get O(n).
The second piece of code has O(log n) iterations of O(1) operations, and a sum of a logarithmic number of constants is still logarithmic.
In the first example, you don't have an O(1) statement inside your loop, since you have for (int j = 0; j < i; j++) count += 1. If in your second example you put the same inner loop as in the first, you are back to the same complexity. The first snippet is not O(n*log(n)); this is easy to demonstrate because you can find an upper bound of O(2n), which is equivalent to O(n).
The time complexity of the 2nd one should not be calculated as a series O(1+2+4+..+n/4+n/2) = O(n), because it is not that series.
Notice the first one. It is calculated as a series because one counts the number of times the inner for loop executes and then adds them all up (a series) to get the final time complexity.
When i=n inner for loop executes n times
When i=(n/2) inner for loop executes n/2 times
When i=(n/4) inner for loop executes n/4 times
and so on..
But in the second one, there is no series to add. It just comes down to the equation 2^x = n, which gives x = log(n).
The equation 2^x = n can be obtained by noticing that i starts at 1 and is multiplied by 2 on every iteration until it reaches n.
So one needs to find out how many times 1 has to be multiplied by 2 to reach n.
Thus the equation 2^x = n, which is then solved for x.
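To see the difference concretely, here is a small sketch (mine, not from either answer) that counts the iterations of the doubling loop and compares them with ceil(log2(n)); compile with -lm:

#include <stdio.h>
#include <math.h>

int main(void) {
    long tests[] = {8, 100, 1000, 1000000};
    for (int t = 0; t < 4; t++) {
        long n = tests[t];
        long iterations = 0;

        /* The doubling loop from the second snippet. */
        for (long i = 1; i < n; i *= 2)
            iterations++;

        printf("n = %8ld   iterations = %2ld   ceil(log2(n)) = %2.0f\n",
               n, iterations, ceil(log2((double)n)));
    }
    return 0;
}

The iteration count grows like log(n), not like n, which is why the two snippets have different complexities.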

Calculate Big O Notation

I currently have the following pseudo code, and I am trying to figure out why the answer to the question is O(n).
sum = 0;
for (i = 0; i < n; i++) do
    for (j = n/3; j < 2*n; j += n/3) do
        sum++;
I thought the answer would be O(n^2), since the first for loop runs n times and the second for loop has += n/3, which I took to be another (n divided by something) iterations, so the whole thing would simplify to O(n^2). Could somebody explain why it is O(n)?
This is because the second loop runs a constant number of operations (it does not depend on n): it goes from n/3 to 2n with a step of n/3, which is the same as going from 1/3 to 2 with a step of 1/3.
It will run 5-6 times for any reasonable n (not 0); the exact number is not important and depends on how the division rounds.
The inner loop increments by a multiple of n, not by 1, so its runtime is bounded by a constant (6?). So the total number of steps is bounded by a constant multiple of n (namely 6n).
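A short sketch (not from the answers) that counts only the inner-loop iterations for one pass, for several values of n, shows the count does not grow with n:

#include <stdio.h>

int main(void) {
    long tests[] = {30, 300, 3000, 3000000};
    for (int t = 0; t < 4; t++) {
        long n = tests[t];
        long inner = 0;

        /* One pass of the inner loop from the question. */
        for (long j = n / 3; j < 2 * n; j += n / 3)
            inner++;

        printf("n = %8ld   inner-loop iterations = %ld\n", n, inner);
    }
    return 0;
}

For these values each line prints 5 iterations, so the inner loop contributes only a constant factor and the overall work is O(n).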

Loop Analysis - Analysis of Algorithms

This question is based on this resource: http://algs4.cs.princeton.edu/14analysis.
Can someone break down why Exercise 6 letter b is linear? The outer loop seems to be cutting n in half each time, so I would assume it was logarithmic...
From the link:
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;
This is a geometric series.
The inner loop runs n iterations per iteration of the outer loop, and the outer loop halves n each time.
So, summing it up gives you:
n + n/2 + n/4 + ... + 1
This is a geometric series with r = 1/2 and a = n, which is bounded by a/(1-r) = n/(1/2) = 2n, so:
T(n) <= 2n
And since 2n is in O(n) - the algorithm runs in linear time.
This is a perfect example to see that complexity is NOT achieved by multiplying the complexity of each nested loop (that would have got you O(nlogn)), but by actually analyzing how many iterations are needed.
Yes, it's simple.
The value of n is halved each time, and the inner loop runs n times for the current value of n.
So on the first pass i goes from 0 to n,
the next time from 0 to n/2,
and on the k-th pass from 0 to n/2^(k-1).
The outer loop therefore runs about log(n) times.
So the numbers of inner iterations form a GP with terms
n, n/2, n/4, n/8, ..., 1.
Summing that GP (reading it from the smallest term, it is 1 + 2 + 4 + ... + 2^log(n)):
(2^(log(n)+1) - 1) / (2 - 1)
= 2 * 2^log(n) - 1
= 2n - 1
which is O(n).
