I have this algorithm (in c code for convenience):
int algo(int *T, int size)
{
    int n = 0;
    for (int i = 1; i < size; i++) {
        for (int j = i; j < size; j++) {
            n += (T[i] * T[j]);
        }
    }
    return n;
}
What is this algorithm's time complexity?
My bet is that it is n * log(n), since we have two nested iterations, one over size and the second over size - i, but I am not sure.
A formal proof of the complexity is welcome!
This is an O(N^2) algorithm.
On the first iteration of the outer loop, the inner loop runs N-1 times.
On the second iteration of the outer loop, the inner loop runs N-2 times.
On the third iteration of the outer loop, the inner loop runs N-3 times.
...
On the last iteration of the outer loop, the inner loop runs 1 time.
The total number of iterations is (N-1)+(N-2)+...+1, which is the sum of an arithmetic progression. The formula for this sum is N*(N-1)/2, which is O(N^2).
I am kind of confused about this. I know that in the outer loop, if i < n, and in the inner loop, if j < i, then the complexity will be O(n^2). But if we increase the limits from n and i to n^2 and i^2 respectively, does the complexity double to O(n^4), or does it become cubic, O(n^3)?
for (long i = 1; i < n*n; i++)
{
    for (long j = 1; j < i * i; j++)
    {
        //some code
    }
}
Assuming that //some code takes O(1) operations, the time complexity is O(n^6). Since the inner loop takes i^2 - 1 iterations, you can use the sum of squares formula (or equivalently use Wolfram Alpha) to get sum_{i=1}^{n^2-1} (i^2 - 1) = (n^2-1)(n^2)(2n^2-1)/6 - (n^2-1), whose leading term is n^6/3, i.e. O(n^6).
For the loop below,
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < N; j++)
        sum++;
what is the time complexity, and how should I think about it? My guess is that the outer loop runs a total of log(N) times. The inner loop runs N times. Therefore, the time complexity should be N*log(N).
Am I correct?
Thanks in advance.
For the first loop, the number of iterations is about log2(N), as i is doubled every iteration.
For each iteration of the first loop, the second loop runs for N times.
Therefore overall time complexity = (log2(N) * N), where the function log2(x) = log(x) to the base 2.
for (int i = 1; i*i <= n; i = i+1) {
    for (int j = n; j >= 1; j /= 4) {
        for (int k = 1; k <= j; k = k+1) {
            f();
        }
    }
}
Why is the asymptotic complexity of this function O(n^{3/2})? I think it should be O(sqrt(n) * log(n) * n). Is that the same as O(sqrt(n) * n)? Then it would be O(n^{3/2}).
The outer loop is O(sqrt(n)).
The first inner loop is O(log(n)).
The second inner loop is O(n).
The outer two loops (over i and j) have bounds that depend on n, but the inner loop (over k) is bounded by j, not n.
Note that the inner loop iterates j times; assuming f() is O(1), the cost of the inner loop is O(j).
Although the middle loop (over j) iterates O(log n) times, the actual value of j is a geometric series starting with n. Because this geometric series (n + n/4 + n/16 + ...) sums to 4/3*n, the cost of the middle loop is O(n).
Since the outer loop iterates sqrt(n) times, this makes the total cost O(n^(3/2)).
Help me solve the time complexity of this method below here:
void method(int n, int[] array)
{
    int i = 0, j = 0;
    for (; i < n; ++i)
    {
        while (j < n && array[i] < array[j])
        {
            j++;
        }
    }
}
The runtime is O(n).
In some iterations of the outer loop the inner loop may advance j several times, and in others it may not advance at all, but in total there are at most n increments of j. Since this is independent of when (for which values of i) the increments happen, you can count O(n) for the outer loop plus O(n) for the (at most) n increments of j: O(n) + O(n) = O(n).
This is contrary to the typical 'loop inside loop' which would perform n iterations of the inner loop for every iteration of the outer loop and thus be O(n) * O(n) = O(n^2).
Given the following code:
for (int i = 0; i < n-1; ++i)
{
    for (int j = i+1; j < n; ++j)
    {
        // Do work.
    }
}
What is the Big-Oh value for it (over n)? I'm thinking it's O(N^2) but I'm not sure.
I did find a similar question here: complexity for nested loops
but it's not quite the same I think.
Yes, that's O(N^2). Pair up the iterations of the inner loop in the beginning and at the end of the outer loop, like this:
The inner loop will execute...
N-1 times on the first iteration of the outer loop, and 1 time on the last iteration
N-2 times on the second iteration of the outer loop, and 2 times on the second to last iteration
N-3 times on the third iteration of the outer loop, and 3 times on the third to last iteration
... and so on; you will have (N-1)/2 pairs like that; when N-1 is odd, the middle term N/2 is left unpaired.
You can see that each pair contributes a total of N iterations, and there are (N-1)/2 such pairs, for a total of N*(N-1)/2 iterations.
This matches the formula for the sum of an arithmetic progression with a common difference of 1.
It should be possible to check it this way.
int z = 0, n = 10; // try 20 etc
for (int i = 0; i < n-1; ++i)
{
    for (int j = i+1; j < n; ++j)
    {
        z++;
    }
}
Now, check the value of z.
With n = 10; z becomes 45
With n = 20; z becomes 190
With n = 40; z becomes 780
A doubling in n caused z to become ~4 times its value. Hence, it is approximately O(n^2).
Methodically, using Sigma notation (empirically verified above), you can obtain the exact number of iterations as well as the order of growth: sum_{i=0}^{n-2} sum_{j=i+1}^{n-1} 1 = sum_{i=0}^{n-2} (n-1-i) = (n-1) + (n-2) + ... + 1 = n*(n-1)/2, which is O(n^2).