Algorithmic complexity of this nested loop

Is the code below O(n) or O(n * log n)?
for (int j = n, sum = 0; j > 0; j--)
    for (int k = j; k > 0; k--)
        sum++;
List of iterations:
j = 5: k = 5, 4, 3, 2, 1
j = 4: k = 4, 3, 2, 1
j = 3: k = 3, 2, 1
j = 2: k = 2, 1
j = 1: k = 1
We have 15 iterations in total.
But if it were O(n), there should be only about 5 iterations.
And if it were O(n * log n), there should be only around 11-12 iterations.

It's O(n^2). Why? The inner loop runs j times for each value of j from n down to 1, so the total number of iterations is n + (n-1) + ... + 1 = n(n+1)/2. For n = 5 that is indeed 15, and for n = 100 it is 5050, which is far away from 100 * log(100) (around 460) but matches n(n+1)/2 exactly.
According to Wikipedia:
Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. It is a member of a family of notations invented by Paul Bachmann, Edmund Landau, and others, collectively called Bachmann-Landau notation or asymptotic notation.


What is the time complexity for this loop

How can we find the time complexity for this loop
int c = 0;
long j = 1;
while (j < (long) n * n * n) { // n^3 (note: in Java, ^ is XOR, not power)
    c += 1;
    System.out.println(c);
    j = j * 4;
}
Since j is multiplied by 4 on every iteration, its successive values are:
1, 4, 4^2, ..., 4^k
For the loop condition to become false, we need:
4^k >= n^3
k = log(n^3) to the base 4
You can simplify it further to:
3 * log(n) to base 4, and remove the 3 as we do for constants:
k = log(n)
This should be the complexity of your loop.

Summation function: prove it is big oh and big theta

I don't understand how to solve this summation problem: proving that it is big oh of n^4 and big omega of n^4.
The problem is this:
f(n) = Σ_{i=1}^{n} Σ_{j=1}^{i} Σ_{k=1}^{j} k
I wrote the code in C++ for what I think the summation is saying.
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= j; k++)
            // something O(1)
I know that I need to prove it is big oh and big omega of n^4
Your code does not reflect the sum: the innermost part of the summation formula is k, while your code assumes a constant inner step ("something O(1)"). The code should be:
sum = 0
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= j; k++)
            for (int m = 1; m <= k; m++)
                sum++
The innermost loop looks a bit overkill, because it can be replaced with
sum += k
...but writing it this way lets you translate the problem to: how many times is sum++ executed in the code?
Imagine you have an array of the values 1, 2, ..., n, and that you should pick four numbers from it (the same number may be picked again), where the order of picking is not important. Then you can pick:
1, 1, 1, 1
2, 1, 1, 1
2, 2, 1, 1
2, 2, 2, 1
2, 2, 2, 2
3, 1, 1, 1
3, 2, 1, 1
...
...etc. You would not count {1, 2, 1, 1}, as that is one you already counted with {2, 1, 1, 1}: order is not distinguished. So we only count picks in which the chosen numbers are in non-increasing order.
Now notice how the four nested loops in the (corrected) code do exactly that: they iterate over such combinations, avoiding counting a set twice (by keeping i >= j >= k >= m).
So given that the inner task has constant time complexity, this problem boils down to: how many such combinations exist?
This is a Combination with repetitions. This is denoted as C((n, m)), where in our case m=4, so we count the number of 4-multisubsets, C((n, 4)) ("n multichoose 4"). This number is equivalent to
n(n+1)(n+2)(n+3)/4!
This is evidently O(n^4).
There is no way there can be fewer (or more) executions of the inner part of the nested loops, so this is also a lower bound: Ω(n^4)

What is the time complexity for this Algorithm?

for i = 1 to n do
j = 2
while j < i do
j = j * j
I think its time complexity is log(n!) = n * log(n),
but the solution says it is n * loglog(n), and I don't understand why.
In the explanation below, I assume that all arithmetic and comparison operations are O(1).
for i = 1 to n do
The below is repeated N times, which makes the n * part in the solution.
j = 2
while j < i do
j = j * j
The above calculates the first number of the following sequence that's >= i :
2 = 2^(2^0)
4 = 2^(2^1)
16 = 2^(2^2)
256 = 2^(2^3)
65536 = 2^(2^4)
...
So the only thing you need to do is find how many squarings it takes to reach i. After k squarings, j = 2^(2^k), and log(log(2^(2^k))) = log(2^k) = k, so the inner loop runs about log(log(i)) times.
Let's break it down and work from the inside out.
Imagine the following:
j = 2
while j < n do
j = j * 2
j goes 2, 4, 8, 16..., so if n doubles in size, it only takes roughly one more iteration for j to surpass it. That's basically the definition of logarithmic.
The inner loop in your case is a bit different:
j = 2
while j < n do
j = j * j
Now j goes 2, 4, 16, 256, 65536... and surpasses n even more easily. In the first case, j was growing exponentially per iteration; now it's growing doubly exponentially. But we're interested in the inverse: j surpasses n in log(log(n)) steps.
Then the outer loop just means you do that n times.

Find the time complexity of the algorithm?

I think it is log(log n), because the loop repeats log(log n) times...
j=1;
i=2;
while (i <= n) do {
B[j] = A[i];
j = j + 1;
i = i * i;
}
You are right, it is O(lg(lg n)) where lg stands for base 2 logarithm.
The reason being that the sequence of values of i is subject to the rule i = prev(i) * prev(i), which turns out to be 2, 2^2, 2^4, 2^8, ... for steps 1, 2, 3, 4, .... In other words, the value of i after k iterations is 2^{2^k}.
Thus, the loop will stop as soon as 2^{2^k} > n, i.e. k > lg(lg(n)). (Just take lg twice on both sides of the inequality; it remains valid because lg is an increasing function.)

Big O Time Complexity for this code

Given the following code:
for(int i = 1; i <= N; i++)
for(int j = 1; j <= N; j = j+i)
{
//Do something
}
I know that the outer loop runs N times, and that the inner loop runs approximately N/i times on the i-th outer iteration: ceil(N), ceil(N/2), ceil(N/3), and so on. From this rough calculation one can guess that the time complexity will be O(N log(N)).
How would I mathematically prove the same?
I know that for the ith iteration, j increments by i, so the inner loop runs about ceil(N/i) times.
The total number of iterations of the inner loop for each value of i will be
i = 1: j = 1, 2, 3 ..., n ---> total iterations = n
i = 2: j = 1, 3, 5 ..., n ---> total iterations = n/2 if 2 divides n or one less otherwise
i = 3: j = 1, 4, 7 ..., n ---> total iterations = n/3 if 3 divides n or one less otherwise
...
i = m: j = 1, 1 + m, ... , n ---> total iterations ~ n/m
...
i = n: j = 1 ---> total iterations = 1
So approximately the total iterations will be (n + n/2 + n/3 ... + 1).
That sum is the Harmonic Series, which has value approximately ln(n) + C, so the total number of iterations is approximately n * ln(n). Since all logarithms are related by a constant factor, the number of iterations is O(n log n).
