Asymptotic time complexity O(sqrt(n)log(n)n) - algorithm

for (int i = 1; i * i <= n; i++) {
    for (int j = n; j >= 1; j /= 4) {
        for (int k = 1; k <= j; k++) {
            f();
        }
    }
}
Why is the asymptotic complexity of this function O(n^{3/2})? I think it should be O(sqrt(n) log(n) n). Is that the same as O(sqrt(n) n)? Then it would be O(n^{3/2}).
The outer loop is O(sqrt(n)).
The first inner loop is O(log(n)).
The second inner loop is O(n).

The outer two loops (over i and j) have bounds that depend on n, but the inner loop (over k) is bounded by j, not n.
Note that the inner loop iterates j times; assuming f() is O(1), the cost of the inner loop is O(j).
Although the middle loop (over j) iterates O(log n) times, the actual value of j is a geometric series starting with n. Because this geometric series (n + n/4 + n/16 + ...) sums to 4/3*n, the cost of the middle loop is O(n).
Since the outer loop iterates sqrt(n) times, the total cost is O(n^(3/2)).
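This count can be checked empirically. The sketch below (assuming the middle loop was meant to update with j /= 4, and that f() is O(1)) counts how often f() would run; the helper name count_calls is made up for illustration:

```c
#include <assert.h>

/* Counts how many times f() would be called by the triple loop,
   assuming the middle-loop update was meant to be j /= 4. */
long count_calls(long n) {
    long count = 0;
    for (long i = 1; i * i <= n; i++)
        for (long j = n; j >= 1; j /= 4)
            for (long k = 1; k <= j; k++)
                count++;   /* stands in for f() */
    return count;
}
```

For n = 16, the outer loop runs 4 times and the middle loop contributes 16 + 4 + 1 = 21 iterations each time, so the total is 84, consistent with the sqrt(n) * O(n) analysis.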


Time complexity of dependent nested loops

I was trying to find the time complexity of this nested loop
for (i = 1; i <= n; i++) {
    for (j = 1; j <= n; j++) {
        n--;
        x++;
    }
}
If there wasn't a n--, it would be n*n, O(n^2), right?
But what if n reduces every time second loop runs?
What's the time complexity and big O of this nested loop?
If I take n = 5, x ends up as 4, i.e. the inner loop body runs 4 times in total.
The time complexity of the code is O(n): n is roughly halved on every iteration of the outer loop, because j counts up while n counts down, and the inner loop stops as soon as they cross.
So we have n/2 + n/4 + n/8 + n/16 + ... + n/2^k = O(n)
where k is the number of iterations of the outer loop (basically i).
Note that the time complexity is independent of x.
If there wasn't a n--, it would be n*n, O(n^2), right?
Yes
Another way to see it's O(n): You only enter the inner loop body if j <= n, and since j is positive, n must also be positive. But you decrease n every time, which you can only do O(n) times (where n is the starting value) and still have n positive.
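The O(n) bound is easy to confirm by running the loop and returning x. A minimal sketch (the wrapper name run_loop is made up here):

```c
#include <assert.h>

/* Runs the loop from the question and returns x, the total
   number of inner-body executions. */
int run_loop(int n) {
    int x = 0;
    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= n; j++) {
            n--;   /* shrinks the bound of both loops */
            x++;
        }
    }
    return x;
}
```

run_loop(5) returns 4, matching the trace in the question, and the result never exceeds the starting value of n.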

What can be the time complexity of this code?

I am kind of confused. I know that with i < n in the outer loop and j < i in the inner loop the complexity is O(n^2), but if we raise the limits from n and i to n^2 and i^2 respectively, does the complexity become O(n^4), or does it become cubic, O(n^3)?
for (long i = 1; i < n * n; i++)
{
    for (long j = 1; j < i * i; j++)
    {
        // some code
    }
}
Assuming that // some code takes O(1) operations, the time complexity is O(n^6). Since the inner loop takes i^2 - 1 iterations, you can use the sum-of-squares formula (or equivalently use Wolfram Alpha) to get

sum for i from 1 to n^2 - 1 of (i^2 - 1) = (n^2 - 1) * n^2 * (2n^2 - 1) / 6 - (n^2 - 1)

whose leading term is n^6 / 3, hence O(n^6). Raising the loop limits from n and i to n^2 and i^2 multiplies the exponents rather than adding to them.
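To sanity-check the O(n^6) claim, the sketch below counts the iterations directly and compares them with the closed form (the helper names are made up for illustration):

```c
#include <assert.h>

/* Exact iteration count of the nested loop from the question. */
long count_ops(long n) {
    long count = 0;
    for (long i = 1; i < n * n; i++)
        for (long j = 1; j < i * i; j++)
            count++;
    return count;
}

/* Closed form: sum_{i=1}^{m-1} (i^2 - 1) with m = n^2,
   using the sum-of-squares identity (m-1)m(2m-1)/6. */
long closed_form(long n) {
    long m = n * n;
    return (m - 1) * m * (2 * m - 1) / 6 - (m - 1);
}
```

For n = 2 both give 0 + 3 + 8 = 11; the leading term of the closed form is n^6 / 3.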

Big O of Nested Loop (int j = 0; j < i; j++)

for (int i = n; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        // statement
    }
}
Answer: O(N)
I know that the first loop, for (int i = n; i > 0; i /= 2), results in O(log N).
The second loop for (int j = 0; j < i; j++) depends on i and will iterate first i times then i / 2, i / 4, ... times. (where i depends on n)
I don't know the Big O for the second loop, but I thought the answer would be O(log N * something) where O(log N) is the outer loop and something is the inner loop?
How do you get O(N)?
The outer loop has a complexity of O(log n), because of the i /= 2. The inner loop is a little more tricky.
The inner loop does O(i) work, but i changes on every iteration of the outer loop, so you cannot simply multiply the two bounds. Count the total work instead: the inner loop first does n steps, then n/2 steps, then n/4 steps, and so on, until it does only 2 steps and finally 1 step. That is the geometric series n + n/2 + n/4 + ... (see https://en.wikipedia.org/wiki/1/2_%2B_1/4_%2B_1/8_%2B_1/16_%2B_⋯), which sums to less than 2n. Put differently: the outer loop runs log n times, so the inner loop does 2n / log n steps on average, i.e. O(n / log n) per outer iteration.
With the outer loop of O(log n) and the inner loop of O(n / log n) per iteration, you get O(log n * n / log n), which simplifies to O(n).
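The same argument in code: count the total number of inner iterations and check that it stays below 2n. A minimal sketch:

```c
#include <assert.h>

/* Total number of inner-loop iterations for the halving loop. */
long count_ops(long n) {
    long count = 0;
    for (long i = n; i > 0; i /= 2)
        for (long j = 0; j < i; j++)
            count++;
    return count;
}
```

When n is a power of two the count is exactly n + n/2 + ... + 1 = 2n - 1, which is why the total is O(n) rather than O(n log n).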

Complexity of two imbricated loops

I have this algorithm (in c code for convenience):
int algo(int *T, int size)
{
    int n = 0;
    for (int i = 1; i < size; i++) {
        for (int j = i; j < size; j++) {
            n += (T[i] * T[j]);
        }
    }
    return n;
}
What is this algorithm's time complexity?
My bet is that it is n * log(n), since we have two nested iterations, the first over size and the second over size - i, but I am not sure.
A formal proof of the complexity is welcome!
This is an O(N^2) algorithm.
The first iteration of the outer loop runs the inner loop N-1 times
The second iteration of the outer loop runs it N-2 times
The third iteration of the outer loop runs it N-3 times
...
The last iteration of the outer loop runs it 1 time
The total number of inner iterations is (N-1) + (N-2) + ... + 1, which is the sum of an arithmetic progression. The formula for that sum is N*(N-1)/2, which is O(N^2).
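The sum can be verified by instrumenting algo to count how often its inner statement executes and comparing against size*(size-1)/2 (the counter version below is an illustrative sketch):

```c
#include <assert.h>

/* Number of times the statement n += T[i] * T[j] would execute. */
long count_ops(int size) {
    long count = 0;
    for (int i = 1; i < size; i++)
        for (int j = i; j < size; j++)
            count++;
    return count;
}
```

For size = 5 this gives 4 + 3 + 2 + 1 = 10 = 5*4/2, matching the arithmetic-progression formula.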

What is the Big-O of a nested loop, where the outer loop is n^2

for (int i = 1; i < n * n; i++)
{
    for (int j = 1; j < i; j++)
    {
        s = s;
    }
}
Since the Big O of the outer loop is O(n^2), would it still be multiplied by the inner loop, making the total Big O notation n * n^2 -> O(n^3)?
In the outer loop, i takes values from 1 to n^2 - 1. For each of those values, the inner loop goes from 1 to i - 1, so it performs i - 1 operations: 0 for i = 1, 1 for i = 2, ..., n^2 - 2 for i = n^2 - 1.
So the total number of operations is the sum of (i - 1) for i from 1 to n^2 - 1. This is a well-known series with the closed form (n^2 - 1)(n^2 - 2)/2, and that is O(n^4).
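The O(n^4) claim can be checked empirically: the sketch below counts the iterations and compares them to the exact closed form (n^2 - 1)(n^2 - 2)/2:

```c
#include <assert.h>

/* Exact iteration count: the inner loop runs i - 1 times
   for each i from 1 to n^2 - 1. */
long count_ops(long n) {
    long count = 0;
    for (long i = 1; i < n * n; i++)
        for (long j = 1; j < i; j++)
            count++;
    return count;
}
```

For n = 2 the count is 0 + 1 + 2 = 3 = (4 - 1)(4 - 2)/2; the leading term of the closed form is n^4 / 2, i.e. O(n^4), not O(n^3).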
