Is this loop O(n log n)?

I have a nested for loop that I am trying to analyze the efficiency of. The loop looks like this:
int n = 1000;
for (int i = 0; i < n; i++) {
    for (int j = 0; j < i; j++) {
        System.out.print("*");
    }
}
I don't believe this algorithm is O(n^2), because the inner loop does not run n times; it only runs i times. However, it is certainly not O(n) either. So I hypothesize that it must fall between the two, which would make it O(n log n). Is this accurate, or is it really an O(n^2) algorithm and I'm misunderstanding the effect the inner loop has on the efficiency?

Your algorithm will run a triangular number of times:
n * (n + 1) / 2
In the above case the formula applies with n = 999, because the first pass of the outer loop (i = 0) prints nothing:
(999 * 1000) / 2 = 499500
It is lower than n**2, but it is still O(n**2), because n * (n + 1) / 2 is n**2 / 2 + n / 2. When n is large, you can ignore n / 2 compared to n**2 / 2, and you can also ignore the constant 1/2 factor.
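You can check this empirically with a quick sketch (a Python translation of the loop that counts instead of printing; the function name is mine):

```python
# Count the stars instead of printing them, and compare with (n - 1) * n / 2.
def count_stars(n):
    count = 0
    for i in range(n):
        for j in range(i):  # the inner loop runs i times
            count += 1
    return count

n = 1000
print(count_stars(n), (n - 1) * n // 2)  # both 499500
```

The empirical count matches the triangular-number formula exactly.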

I kind of get your doubts, but try to think of it this way: what value will i have in the worst case? The answer is n-1, right? Since complexity is evaluated by considering the worst case, it turns out to be O(n^2), as n * (n-1) ~ n^2.

The number of iterations is sum from i=0 to n-1 of (sum from j=0 to i-1 of 1). The inner sum is obviously equal to i, and sum from i=0 to n-1 of i = n * (n-1) / 2 = O(n^2) is well known.


Time complexity of the inner loop

Can someone help me with calculating the time complexity of the inner loop? As far as I understand, the outer one will be O(n). But I have no idea how to calculate what happens inside the second one.
for (int i = 2; i < n; i++) {
    for (int j = 2; i * j < n; j++) {
    }
}
For every iteration of the outer loop, the inner loop runs roughly n/i times.
So, the total work is given by:
n/2 + n/3 + n/4 + ...
= n * (1/2 + 1/3 + 1/4 + ...)
The sum in parentheses is a partial harmonic series, whose upper bound is ln(n).
Hence, the complexity of this code is O(n log n).
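As an empirical check, here's a quick Python sketch (the function name is mine) that counts the inner-loop iterations and compares the total against n ln n:

```python
import math

# Count total inner-loop iterations of:
#   for (i = 2; i < n; i++) for (j = 2; i * j < n; j++) {}
def count_iterations(n):
    count = 0
    for i in range(2, n):
        j = 2
        while i * j < n:
            count += 1
            j += 1
    return count

n = 10000
total = count_iterations(n)
# The total grows like n ln n, up to lower-order terms
print(total, round(n * math.log(n)))
```

For growing n, the ratio of the count to n ln n settles toward a constant, which is the O(n log n) behavior.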
The inner loop runs j from 2 up to (but not including) roughly n/i, which is about n/i - 2 iterations.
Summing over the n - 2 iterations of the outer loop, we get the following summation:
(n/2 - 2) + (n/3 - 2) + ... + (3 - 2)
This is a partial harmonic series scaled by n, which sums to roughly log_e(n) * n. So in terms of time complexity, this becomes O(n log n).
The loop exits as soon as i * j ≥ n, i.e. when j reaches ceiling(n / i) ~ n / i. Since it starts from j = 2, the number of iterations is about ceiling(n / i) - 2.

What is the time complexity of this algorithm where the limit is changing inside the loop?

How do you calculate the time complexity (big-O) of this algorithm, when it is not clear how many times the loop iterates?
cin >> n;
int i = 0;
for (int j = 1; i <= n; j++) {
    i += j;
}
Appreciate that the values added to i form the series:
1 + 2 + 3 + 4 + ... + m
where m is the number of iterations (not the n in your question). The sum of this series is given by the Gaussian formula:
m * (m + 1) / 2
This means that after m steps, i has grown roughly as m^2. The loop stops once this sum exceeds n, so m^2 ~ n, i.e. m ~ sqrt(n). Therefore, the loop runs:
O(sqrt(n))
times, where n here is the n from your loop code, i.e. the upper bound of the loop.
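You can verify this empirically; a short Python sketch (names are mine) counts the steps and compares them with sqrt(2n):

```python
import math

# Count iterations of:  i = 0; for (j = 1; i <= n; j++) i += j;
def count_steps(n):
    i, j, steps = 0, 1, 0
    while i <= n:
        i += j
        j += 1
        steps += 1
    return steps

for n in [100, 10000, 1000000]:
    # After m steps, i = m(m+1)/2, so the loop stops near m ~ sqrt(2n)
    print(n, count_steps(n), math.isqrt(2 * n))
```

The step count tracks sqrt(2n) closely, confirming the O(sqrt(n)) bound.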

Why is code that does N/2 steps considered O(N)?

Consider a nested loop. The outer loop starts at i=0 and iterates N times, and the inner loop starts at j=i+1 and iterates up to j=N. So the inner loop does roughly N/2 steps on average. At the end, however, the runtime is considered O(N^2). Why is the inner loop considered O(N) and not O(N/2), given that other code can have O(log n) runtimes?
It seems that you're mixing two different cases: division in the final formula - N**2/C - where the constant C can be ignored (O(N**2/C) == O(N**2)), versus division inside the loop - for (int j = N; j >= 1; j /= C) - which leads to a logarithm:
for (int i = 1; i <= N; ++i)
    for (int j = i + 1; j <= N; ++j)
        SomeOperation(i, j);
Let's count the number of SomeOperation(i, j) to be performed:
i j
-------------------
1 N - 1
2 N - 2
3 N - 3
..
N 0
So we have
(N - 1) + (N - 2) + ... + 2 + 1 + 0 ==
N * (N - 1) / 2 ==
N**2 / 2 - N / 2 ==
O(N**2 / 2 - N / 2) == O(N**2 / 2) == O(N**2)
By contrast (please notice j /= 2 instead of ++j), the following means far fewer inner iterations:
for (int i = 1; i <= N; ++i)
    for (int j = N; j >= 1; j /= 2)
        SomeOperation(i, j);
i j
-------------------
1 log(N)
2 log(N)
3 log(N)
..
N log(N)
And here we have
log(N) + log(N) + ... + log(N) ==
N * log(N) ==
O(N * log(N))
Big-O notation represents the time complexity of a segment of code in proportion to some metric. Usually, the symbol inside the parentheses represents a quantity like input size or container size.
In an intuitive sense, O(N) refers to the number of times the code runs in proportion to N, as opposed to the exact number of times it runs. It may run K = N/2 times in reality, but the point Big-O notation underscores is that K is estimated by how large N is, and K is directly proportional to N.
To further clarify, notice that for a large enough N, the division by 2 does not really matter: it is simply a constant factor. The notion that constants are negligible for a large enough N is critical to a good grasp of the various complexity notations, including Big-O.
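To make the contrast concrete, here's a small Python sketch (function names are mine) counting the operations performed by both variants above:

```python
# Variant 1: inner loop runs j = i+1 .. N  ->  N(N-1)/2 operations in total
def count_quadratic(N):
    count = 0
    for i in range(1, N + 1):
        for j in range(i + 1, N + 1):
            count += 1
    return count

# Variant 2: inner loop halves j each time  ->  ~N log N operations in total
def count_halving(N):
    count = 0
    for i in range(1, N + 1):
        j = N
        while j >= 1:
            count += 1
            j //= 2
    return count

N = 1024
print(count_quadratic(N), N * (N - 1) // 2)  # identical: 523776
print(count_halving(N))                      # 11264 = N * (log2(N) + 1)
```

Even though both involve "division by 2" in some sense, only the division inside the loop changes the growth rate.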

How can I find the time complexity of the following code?

for (i = 0; i < n; i++)        // time complexity n+1
{
    k = 1;                     // time complexity n
    while (k <= n)             // time complexity n*(n+1)
    {
        for (j = 0; j < k; j++)                                  // time complexity ??
            printf("the sum of %d and %d is: %d\n", j, k, j+k);  // time complexity ??
        k++;
    }
}
What is the time complexity of the above code? I'm stuck on the second for loop, and I don't know how to find its time complexity because j is bounded by k, not by n.
I always have problems with time complexity. Do you have any good articles on it, especially about step counting and loops?
From the question:
because j is less than k and not less than n.
This is just plain wrong, and I guess that's the assumption that got you stuck. We know what values k can take. In your code, it ranges from 1 to n (included). Thus, if j is less than k, it is also less than n.
From the comments:
I know the only input is n, but the second for loop depends on k, not on n.
If a variable depends on anything, it's on the input. j depends on k that itself depends on n, which means j depends on n.
However, this is not enough to deduce the complexity. In the end, what you need to know is how many times printf is called.
The outer for loop is executed n times no matter what. We can factor this out.
The number of executions of the inner for loop depends on k, which is modified within the while loop. We know k takes every value from 1 to n exactly once. That means the inner for loop will first be executed once, then twice, then three times and so on, up until n times.
Thus, discarding the outer for loop, printf is called 1+2+3+...+n times. That sum is very well known and easy to calculate: 1+2+3+...+n = n*(n+1)/2 = (n^2 + n)/2.
Finally, the total number of calls to printf is n * (n^2 + n)/2 = n^3/2 + n^2/2 = O(n^3). That's your time complexity.
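A quick Python translation (a counter stands in for printf; the function name is mine) confirms the count:

```python
# Count how many times printf would execute in the original triple nesting.
def count_printf(n):
    count = 0
    for i in range(n):           # outer for: n iterations
        k = 1
        while k <= n:            # while: k runs 1..n
            for j in range(k):   # inner for: k iterations
                count += 1
            k += 1
    return count

n = 50
print(count_printf(n), n * (n * n + n) // 2)  # both 63750
```

The empirical count matches n * (n^2 + n) / 2 exactly, consistent with O(n^3).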
A final note about this kind of code. Once you've seen the same patterns a few times, you quickly start to recognize the kind of complexity involved. Then, when you see this kind of nested loop with dependent variables, you immediately know that the complexity contributed by each loop is linear.
For instance, in the following, f is called n*(n+1)*(n+2)/6 = O(n^3) times.
for (i = 1; i <= n; ++i) {
    for (j = 1; j <= i; ++j) {
        for (k = 1; k <= j; ++k) {
            f();
        }
    }
}
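Counting the calls empirically (a Python sketch with a counter in place of f(); the function name is mine) matches that formula:

```python
# Count calls to f() in the triple loop with dependent bounds.
def count_calls(n):
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            for k in range(1, j + 1):
                count += 1
    return count

n = 30
print(count_calls(n), n * (n + 1) * (n + 2) // 6)  # both 4960
```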
First, simplify the code to show the main loops. So, we have a structure of:
for (int i = 0; i < n; i++) {
    for (int k = 1; k <= n; k++) {
        for (int j = 0; j < k; j++) {
        }
    }
}
The outer-loops run n * n times but there's not much you can do with this information because the complexity of the inner-loop changes based on which iteration of the outer-loop you're on, so it's not as simple as calculating the number of times the outer loops run and multiplying by some other value.
Instead, I would find it easier to start with the inner-loop, and then add the outer-loops from the inner-most to outer-most.
The complexity of the inner-most loop is k.
With the middle loop, it's the sum of k (the complexity above) where k = 1 to n. So 1 + 2 + ... + n = (n^2 + n) / 2.
With the outer loop, it's done n times so another multiplication by n. So n * (n^2 + n) / 2.
After simplifying, we get a total of O(n^3)
The time complexity of the above code is n × n × n = n^3 for the three loops, plus a couple of constant-time statements. Since n^3 has the fastest-growing rate, the constants can be ignored, so the time complexity is O(n^3).
Note: treat each loop as contributing a factor of n, and multiply those factors together to obtain the total time.
Hope this helps!

O(n log log n) time complexity

I have a short program here:
Given any n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k = k * k;
    }
    i++;
}
The asymptotic running time of this is O(n log log n). Why is this the case? I get that the entire program will run at least n times. But I'm not sure how to find log log n. The inner loop depends on k * k, so it's obviously going to be less than n. And it would just be O(n log n) if k were multiplied by 2 each time. But how would you figure out the answer to be log log n?
For a mathematical proof, the inner loop satisfies the recurrence:
T(n) = T(sqrt(n)) + 1
W.l.o.g. assume 2^(2^(t-1)) <= n <= 2^(2^t). Since 2^(2^t) = 2^(2^(t-1)) * 2^(2^(t-1)), squaring the loop variable climbs one level of this tower, so:
T(2^(2^t)) = T(2^(2^(t-1))) + 1 = T(2^(2^(t-2))) + 2 = ... = T(2^(2^0)) + t
With t = log log 2^(2^t), this gives:
T(2^(2^(t-1))) <= T(n) <= T(2^(2^t)) = O(1) + log log n
so O(1) + log log(n) - 1 <= T(n) <= O(1) + log log(n), i.e. T(n) = Theta(log log n).
The total time is then O(n log log n).
Why is the inner loop T(n) = T(sqrt(n)) + 1?
Look at when the inner loop breaks: when k > n. Just before the last squaring, k was at least sqrt(n) (otherwise one more squaring would not have exceeded n), so getting from sqrt(n) to past n costs only one or two extra steps: T(sqrt(n)) + 2 >= T(n) >= T(sqrt(n)) + 1.
The time complexity of a loop is O(log log n) if the loop variable is squared (raised to a constant power) on each iteration. If the loop variable is instead divided or multiplied by a constant amount, the complexity is O(log n).
Eg: in your case the value of k is as follows. The number in parentheses denotes how many times the loop has executed:
2 (0), 2^2 (1), 2^4 (2), 2^8 (3), 2^16 (4), 2^32 (5), 2^64 (6) ... until n is reached.
The number of iterations here is O(log log n).
For the sake of illustration, let's assume that n is 2^64. Now log(2^64) = 64 and log 64 = log(2^6) = 6. Hence the inner loop ran 6 times when n is 2^64.
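A small Python sketch (the helper name is mine) confirms the count for n = 2^64:

```python
import math

# Count inner-loop iterations when k is squared each pass: k = 2, 4, 16, 256, ...
def squaring_steps(n):
    k, steps = 2, 0
    while k < n:
        steps += 1
        k = k * k
    return steps

n = 2 ** 64
print(squaring_steps(n), int(math.log2(math.log2(n))))  # 6 and 6
```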
I think if the code were like this, it would be O(n log n):
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k *= c; // c is a constant bigger than 1
    }
    i++;
}
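Here's a quick Python check (the helper name is mine) of how many iterations the multiply-by-a-constant version performs per pass of the outer loop:

```python
# Count inner-loop iterations when k is multiplied by a constant c each pass.
def geometric_steps(n, c=2):
    k, steps = 2, 0
    while k < n:
        steps += 1
        k *= c
    return steps

n = 2 ** 20
print(geometric_steps(n))  # 19 iterations, i.e. ~log2(n)
```

With the outer loop running n times, the total is ~n log n, versus only ~n log log n for the squaring version.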
Okay, so let's break this down first:
Given any n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k = k * k;
    }
    i++;
}
while (i < n) will run n + 1 checks, but we'll round that off to n iterations.
Now here comes the fun part: the k < n loop will not run n times; it will run log log n times, because instead of incrementing k by 1 on each pass, we square it. Squaring gets k up to n in only log log n steps (you'll see this again when you study design and analysis of algorithms).
Combining the two, we get a total time of n · log log n. I hope you get it now.
