I'm confused about the complexity of the following (the operation performed inside the inner loop is in constant time):
for(int i=0; i<n; i++)
for(int j=i; j<n; j++)
is this O(n^2) or O(n)? I figure O(n^2). Any ideas?
also the following makes me curious:
for(int i=0; i<n; i++)
for(j=0; j<i; j++)
Definitely O(n^2), of course. Summary explanation for both cases: 1 + 2 + ... + n is n(n+1)/2, that is, (n^2 + n)/2; in big-O we drop the lower-order n term and the constant factor 1/2, so we're left with n^2, which is O(n^2).
You are correct, those nested loops are still O(n^2). The actual number of operations is something close to (n^2)/2, which, after discarding the constant 1/2 factor, is O(n^2).
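If it helps to see the count concretely, here is a small sketch (the harness and the counter variable are mine, not from the question) that tallies the inner-loop iterations of the first snippet and compares them to n(n+1)/2:

#include <stdio.h>

int main(void) {
    int n = 10;
    long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i; j < n; j++)
            count++;                 /* stands in for the constant-time body */
    /* for n = 10 both values are 55: the sum n + (n-1) + ... + 1 */
    printf("count = %ld, n(n+1)/2 = %d\n", count, n * (n + 1) / 2);
    return 0;
}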
Related
for (int i = 1; i < n; i = i*2)
{
    for (int j = 1; j < i; j++) {
        printf("hello");
    }
}
Is the time complexity O(log n) or O(n log n)?
It's neither. It's O(n).
First, study the special cases where n = 2^k - 1, e.g. 1, 3, 7, 15, 31, 63, 127, 255, ....
Second, for any number n, prove that if 2^(k-1) <= n < 2^k then 2^k - 1 <= 2n.
That's all.
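To make the counting argument concrete, here is a small sketch (the counter is mine, standing in for the printf) that tallies how many times the inner body runs; the total stays below 2n:

#include <stdio.h>

int main(void) {
    int n = 1000;
    long count = 0;
    for (int i = 1; i < n; i = i * 2)
        for (int j = 1; j < i; j++)
            count++;                 /* stands in for printf("hello") */
    /* the total is 0 + 1 + 3 + 7 + ... < 2n, so the loop is O(n) */
    printf("count = %ld, 2n = %d\n", count, 2 * n);
    return 0;
}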
The inner loop runs i - 1 times and i doubles on each outer iteration, so the total work is bounded by 1 + 2 + 4 + ... + 2^k < 2n; the time complexity is O(n) [linear].
I am fairly familiar with simple time complexity regarding constant, linear, and quadratic time complexities. In simple code segments like:
int i = 0;
i + 1;
This is constant. So O(1). And in:
for (i = 0; i < N; i++)
This is linear: the body runs N times (the condition is checked N+1 times), and for Big O time complexities we drop the constant, so it's just O(N). In nested for loops:
for (i = 0; i < N; i++)
for (j = 0; j < N; j++)
I get how we multiply n+1 by n and reach a time complexity of O(N^2). My issue is with slightly more complex versions of this. So, for example:
S = 0;
for (i = 0; i < N; i++)
for (j = 0; j < N*N; j++)
S++;
In such a case, would I be multiplying n+1 by the inner for loop's time complexity, which I presume is n^2? So the time complexity would be O(n^3)?
Another example is:
S = 0;
for (i = 0; i < N; i++)
for (j = 0; j < i*i; j++)
for (k = 0; k < j; k++)
S++;
In this case, I expanded it and wrote it out and realized that the middle for loop seems to run at an n*n time complexity, and the innermost for loop at the pace of j, which is also n*n. So in that case, would I be multiplying (n+1) * n^2 * n^2, which would give me O(n^5)?
Also, I am still struggling to understand what kind of code has logarithmic time complexity. If someone could give me an algorithm or segment of code that performs at log(n) or n log(n) time complexity, and explain it, that would be much appreciated.
All of your answers are correct.
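If you want to sanity-check the counts, a quick sketch like this (my own test harness, not part of your code) tallies S for both fragments:

#include <stdio.h>

int main(void) {
    int N = 20;

    /* first fragment: the inner loop runs N*N times per outer iteration */
    long s1 = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N * N; j++)
            s1++;

    /* second fragment: the bounds grow with i and j rather than N */
    long s2 = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < i * i; j++)
            for (int k = 0; k < j; k++)
                s2++;

    /* s1 equals N^3 exactly; s2 grows on the order of N^5 (leading term N^5/10) */
    printf("S1 = %ld, N^3 = %ld\n", s1, (long)N * N * N);
    printf("S2 = %ld, N^5 = %ld\n", s2, (long)N * N * N * N * N);
    return 0;
}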
Logarithmic time complexity typically occurs when you're reducing the size of the problem by a constant factor on every iteration.
Here's an example:
for (int i = N; i > 0; i /= 2) { .. do something ... }
In this for-loop, we're dividing the problem size by 2 on every iteration. We'll need approximately log_2(n) iterations prior to terminating. Hence, the algorithm runs in O(log(n)) time.
Another common example is the binary search algorithm, which searches a sorted interval for a value. In this procedure, we remove half of the values on each iteration (once again, we're reducing the size of the problem by a constant factor of 2). Hence, the runtime is O(log(n)).
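For concreteness, here is a minimal iterative sketch of binary search over a sorted int array (the function name and signature are my own choice):

/* Returns the index of target in the sorted array a[0..n-1], or -1 if absent.
   Each iteration halves the remaining interval, so it runs in O(log n). */
int binary_search(const int *a, int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;        /* avoids overflow of lo + hi */
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;   /* discard the lower half */
        else                 hi = mid - 1;   /* discard the upper half */
    }
    return -1;
}

Every pass either returns or halves the interval [lo, hi], so at most about log_2(n) + 1 passes are needed.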
I've gone through some basic concepts of calculating time complexities. I would like to know the time complexity of the code that follows.
I think the time complexity would be O(log_3(n) * n^2). It may still be wrong; I want to know the exact answer and how to arrive at it. Thank you :)
#include <stdio.h>

void function(int n) {
    if (n <= 1) return;              /* base case */
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            printf("*");
    function(n - 3);
}
Two nested loops with n iterations each give O(n^2) work per call. The recursion decreases n by the constant 3, so the function is called n/3 + 1 = O(n) times. In total, that's O(n^2) * O(n) = O(n^3).
The logarithmic factor in your result would appear only if the function were called recursively with n/3 rather than n-3.
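Another way to see it (my own unrolling of the recurrence, with constant factors dropped): write T(n) for the total number of prints, so

T(n) = n^2 + T(n-3)
     = n^2 + (n-3)^2 + T(n-6)
     = n^2 + (n-3)^2 + (n-6)^2 + ...

There are roughly n/3 terms and each is at most n^2, so T(n) <= (n/3 + 1) * n^2 = O(n^3).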
Here is a problem in which we have to calculate the time complexity of the given function:
f(i) = 2*f(i+1) + 3*f(i+2)
for (int i = 0; i < n; i++)
    f[i] = 2*f[i+1]
What I think is that the complexity of this algorithm is O(2^n) + O(n), which ultimately is O(2^n).
Please correct me if I am wrong.
Firstly, all the information you need to work these out in future is here.
To answer your question: because you have not provided a definition of f(i) in terms of i itself, it is impossible to determine the actual complexity from what you have written above. However, in general, for loops like
for (i = 0; i < N; i++) {
sequence of statements
}
executes N times, so the sequence of statements also executes N times. If we assume the statements are O(1), the total time for the for loop is N * O(1), which is O(N) overall. In your case above, if I take the liberty of re-writing it as
f(0) = 0;
f(1) = 1;
f(i+2) = 2*f(i+1) + 3*f(i)

for (int i = 0; i + 2 < n; i++)
    f[i+2] = 2*f[i+1] + 3*f[i];
Then we have a well defined sequence of operations and it should be clear that the complexity for the n operations is, like the example I have given above, n * O(1), which is O(n).
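For completeness, a minimal runnable version of that rewrite might look like this (the value of n and the base values 0 and 1 are assumptions on my part); each entry costs O(1), so filling the whole table is O(n):

#include <stdio.h>

int main(void) {
    int n = 10;                              /* assumed problem size */
    long f[10];                              /* table of sequence values */
    f[0] = 0;                                /* assumed base cases */
    f[1] = 1;
    for (int i = 0; i + 2 < n; i++)
        f[i + 2] = 2 * f[i + 1] + 3 * f[i];  /* O(1) work per entry */
    for (int i = 0; i < n; i++)
        printf("f[%d] = %ld\n", i, f[i]);
    return 0;
}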
I hope this helps.
I understand how:
for (int i=0; i<n; i++)
This time complexity is O(n).
for (int i=0; i<n; i++)
for (int j=0; j<n; j++)
for (k=0; k<n; k++)
this is O(n^3) right?
i=1
do
//......
i++
while (i*2 < n)
Is this O(n)? Or is it exactly O(n/2)?
O(n/2) is just O(n) with a constant coefficient of 1/2. The coefficient could be 10 billion and it would still be O(n), not e.g. O(n^1.0001), which is a different complexity class.
The first one's complexity is O(n^3), correct.
The second one is O(c*n) for a constant c. No matter how large c is, by the definition of big-O the complexity is still O(n).
However, O-notation is considered harmful. See here.
The first one is O(n^3), you're right.
Your second algorithm is O(n/2) = O(c*n) = O(n); 1/2 is a constant, so we can safely discard it.
This fragment of code:
i=1
do
//......
i++
while (i*2 < n);
is equivalent to this one (except that the do-while always executes its body at least once):
for (i = 1; i < n/2; ++i) { /* ...... */ }
Clearly, this is O(n).
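If you want to convince yourself, a tiny counting sketch (the variable names are mine) shows the body runs about n/2 times:

#include <stdio.h>

int main(void) {
    int n = 100;
    int i = 1;
    long count = 0;
    do {
        count++;                 /* stands in for the loop body */
        i++;
    } while (i * 2 < n);
    /* for n = 100 this prints count = 49, i.e. about n/2 iterations */
    printf("count = %ld, n/2 = %d\n", count, n / 2);
    return 0;
}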