I've got the following code
1 for (i = 0; i < n; i++)
2     for (j = 0; j < i; j++)
3         result = result + 1;
I know the time complexity is O(n^2) but I'm having trouble calculating it the way we're supposed to as explained by the materials we were given.
The time complexity of a loop, according to said materials, is
T = T1 + n(T1 + T2)
where T1 is the loop condition and T2 is the instruction inside the loop. When applying it to the exercise I get:
T = T1 + n(T1 + T2-3)
  = T1 + n(T1 + T2 + (1 + 2 + 3 + ... + n)(T2 + T3)).
As T1, T2 and T3 are all O(1), we get that
T = n * (1+2+3+...+n)
= n * n * (n+1) / 2
= n^3.
But that is obviously wrong, so... what am I doing wrong?
Your derivation is wrong at the expansion of T2-3:
T2-3 = T2 + i * ( T2 + T3 )
< T2 + n * ( T2 + T3 )
= O(n)
You did not analyze the inner loop in isolation but took the outer loop's iterations into account a second time when writing down the summation, which is where the extra factor of n comes from.
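For completeness, here is a rough sketch of how the sum works out when you sum over the outer index i just once (using the same T1, T2, T3 notation as above):

T = T1 + sum for i = 0..n-1 of ( T1 + T2 + i*(T2 + T3) )
  = T1 + n*(T1 + T2) + (T2 + T3) * (0 + 1 + ... + (n-1))
  = T1 + n*(T1 + T2) + (T2 + T3) * n*(n-1)/2
  = O(n^2)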
Trying to analyze the code snippet below.
For the code below, can the time complexity be O(log n)? I am new to asymptotic analysis. The tutorial says it is O(sqrt(n)).
int p = 0;
for (int i = 1; p <= n; i++) {
    p = p + i;
}
Variable p is going to take the successive values 1, 1+2, 1+2+3, etc.
This sequence is called the sequence of triangular numbers; you can read more about it on Wikipedia or OEIS.
One thing to note is the formula:
1 + 2 + ... + i = i*(i+1)/2
Hence your code could be rewritten in the roughly equivalent form:
int p = 0;
for (int i = 1; p <= n; i++)
{
    p = i * (i + 1) / 2;
}
Or, getting rid of p entirely:
for (int i = 1; (i - 1) * i / 2 <= n; i++)
{
}
Hence your code runs while (i-1)*i <= 2n. You can make the approximation (i-1)*i ≈ i^2 to see that the loop runs for about sqrt(2n) operations.
If you are not satisfied with this approximation, you can solve the quadratic equation for i:
i^2 - i - 2n == 0
You will find that the loop runs while:
i <= (1 + sqrt(1 + 8n)) / 2 == 0.5 + sqrt(2n + 0.25)
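If you want to sanity-check the estimate numerically, a small counter along these lines (just a sketch; the sample value of n is an arbitrary choice) shows the iteration count tracking sqrt(2n):

#include <math.h>
#include <stdio.h>

int main(void) {
    int n = 1000000;            /* arbitrary sample value */
    int iterations = 0;
    int p = 0;
    for (int i = 1; p <= n; i++) {
        p = p + i;              /* p becomes the i-th triangular number */
        iterations++;
    }
    /* expect iterations to be close to sqrt(2n) */
    printf("iterations = %d, sqrt(2n) = %.1f\n", iterations, sqrt(2.0 * n));
    return 0;
}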
Trying to understand Big O and nested loops, I've been going through my notes and can't understand how the nested-loop part of this question works... I have an answer of 6 + 1.5n + n log n written down from lectures, but I don't understand how to get the n log n part.
Simple Statement;
Simple Statement;
Simple Statement;
Simple Statement;
for ( int i = 0; i < ( n / 2 ); i++ ) {
    Simple Statement;
    Simple Statement;
    Simple Statement;
}
Simple Statement;
Simple Statement;
for ( int i = 0; i < 2 * n; i++ ) {
    for ( int j = 0; j < n; j = 2 * j ) {
        Simple Statement;
        Simple Statement;
    }
}
My understanding is that the 6 comes from the six statements not inside a loop and the 1.5n comes from 3(n-1 + n-2 + ... + 1)/2, so if someone could help with the last part, or correct me if I'm wrong, it would be greatly appreciated.
The part I'm stuck on:
for ( int i = 0; i < 2 * n; i++ ) {
    for ( int j = 0; j < n; j = 2 * j ) {
        Simple Statement;
        Simple Statement;
    }
}
Well, I guess there's a typo in the question; the inner loop should be
// notice "j = 1", not "j = 0",
// otherwise you have an infinite loop, since 0 * 2 == 0
for (int j = 1; j < n; j = 2 * j )
In that case, the outer loop
for (int i = 0; i < 2 * n; i++)
contributes 2 * n iterations, while the inner one (notice j = 2 * j)
for (int j = 1; j < n; j = 2 * j)
contributes just log(n); finally, since the loops are nested, we multiply the complexities and get
O(n * log(n))
Iterating from 0 to 2*n results in a complexity of O(N). Iterating up to n with the step doubling each time results in a complexity of O(log(N)). Multiplying these two complexities gives a final complexity of O(N * log(N)).
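If it helps to see it concretely, here is a small counter (just a sketch; the sample n is an arbitrary choice) showing that the corrected inner loop runs about log2(n) times:

#include <math.h>
#include <stdio.h>

int main(void) {
    int n = 1000;                        /* arbitrary sample size */
    int inner = 0;
    for (int j = 1; j < n; j = 2 * j)    /* note j starts at 1, as discussed above */
        inner++;
    /* the outer loop just repeats this 2*n times, giving O(n * log n) overall */
    printf("inner iterations = %d, log2(n) = %.1f\n", inner, log2((double)n));
    return 0;
}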
I'm trying to study for an upcoming quiz about Big-O notation. I've got a few examples here, but they're giving me trouble. They seem a little too advanced for the basic examples you find online to be much help. Here are the problems I'm stuck on.
1.
for (i = 1; i <= n/2; i = i * 2) {
    sum = sum + product;
    for (j = 1; j < i*i*i; j = j + 2) {
        sum++;
        product += sum;
    }
}
For this one, the i = i * 2 in the outer loop implies O(log(n)), and I don't think the i <= n/2 condition changes anything because of how we ignore constants. So the outer loop stays O(log(n)). The inner loop's condition j < i*i*i confuses me because it's in terms of 'i' and not 'n'. Would the Big-O of this inner loop then be O(i^3)? And would the Big-O for the entire problem thus be O((i^3) * log(n))?
2.
for (i = n; i >= 1; i = i / 2) {
    sum = sum + product;
    for (j = 1; j < i*i; j = j + 2) {
        sum++;
        for (k = 1; k < i*i*j; k++)
            product *= i * j;
    }
}
For this one, the outermost loop implies O(log(n)). The middle loop implies, again I'm unsure, O(i^2)? And the innermost loop implies O(i^2 * j)? I've never seen examples like this before, so I'm almost guessing at this point. Would the Big-O notation for this problem be O(i^4 * n * j)?
3.
for (i = 1; i < n*n; i = i*2) {
    for (j = 0; j < i*i; j++) {
        sum++;
        for (k = i*j; k > 0; k = k - 2)
            product *= i * j;
    }
}
The outermost loop for this one has an n^2 condition, but also a logarithmic increment, so I think that cancels out to be just regular O(n). The middle loop is O(i^2), and the innermost loop is I think just O(n) and trying to trick you. So for this problem the Big-O notation would be O(n^2 * i^2)?
4.
int i = 1, j = 2;
while (i <= n) {
    sum += 1;
    i = i * j;
    j = j * 2;
}
For this one I did a few iterations to better see what was happening:
i = 1, j = 2
i = 2, j = 4
i = 8, j = 8
i = 64, j = 16
i = 1024, j = 32
So clearly, 'i' grows very quickly, and thus the condition is met very quickly. However I'm not sure just what kind of Big-O notation this is.
Any pointers or hints you can give are greatly appreciated, thanks guys.
You can't put i or j into the O-notation; everything has to be converted to n.
For the first one:
Let k = log2(i).
Then the inner loop runs 2^(3k)/2 = 2^(3k-1) times for each iteration of the outer loop.
k goes from 1 to roughly log2(n) (the i <= n/2 bound only affects the constant factor).
So the total number of iterations is the sum of 2^(3k-1) for k from 1 to log2(n), which is 4/7 * (n^3 - 1) according to Wolfram Alpha, and that is O(n^3).
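If you want to convince yourself empirically, a counter along these lines (a rough sketch; the sample sizes are arbitrary) shows the count growing by a factor of about 8 every time n doubles, as expected for O(n^3):

#include <stdio.h>

/* count how many times the inner loop body of problem 1 executes */
static long long count_ops(long long n) {
    long long count = 0;
    for (long long i = 1; i <= n / 2; i = i * 2)
        for (long long j = 1; j < i * i * i; j = j + 2)
            count++;
    return count;
}

int main(void) {
    /* doubling n should multiply the count by roughly 8 */
    for (long long n = 64; n <= 512; n *= 2)
        printf("n = %4lld  count = %10lld\n", n, count_ops(n));
    return 0;
}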
For the last one, after k iterations i = j1 * j2 * ... * jk, where jm = 2^m, so
i = 2^1 * 2^2 * ... * 2^k = 2^(1+2+...+k)
The loop stops once i exceeds n, i.e. roughly when
1 + 2 + 3 + ... + k = log2(n)
k*(k+1)/2 = log2(n)
which gives k = O(sqrt(log n)).
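A quick numeric check of the last one (again just a sketch; the choice of n and the use of doubles to avoid overflow are mine) shows the step count staying close to sqrt(2 * log2(n)), i.e. O(sqrt(log n)):

#include <math.h>
#include <stdio.h>

int main(void) {
    double n = 1e18;           /* arbitrary large n; doubles avoid integer overflow */
    double i = 1, j = 2;
    int steps = 0;
    while (i <= n) {
        i = i * j;             /* after k steps, i == 2^(k*(k+1)/2) */
        j = j * 2;
        steps++;
    }
    printf("steps = %d, sqrt(2 * log2(n)) = %.1f\n", steps, sqrt(2.0 * log2(n)));
    return 0;
}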
BTW, log(n^2) is 2 * log(n), not n.
This question is better asked on the Computer Science site than here.
For the following code fragment, what is the order of growth in terms of N?
int sum = 0;
for (int i = 1; i <= N; i = i*2)
    for (int j = 1; j <= N; j = j*2)
        for (int k = 1; k <= i; k++)
            sum++;
I have figured out that there is a lg N factor, but I am stuck on evaluating this part: lg N * (1 + 2 + 4 + 8 + 16 + ...). What will the last term of the sequence be? I need the last term to calculate the sum.
You have a geometric progression in your outer loops, so there is a closed form for the sum you need:
1 + 2 + 4 + ... + 2^N = 2^(N+1) - 1
To be precise, your sum is
1 + ... + 2^(floor(ld(N)))
with ld denoting the logarithm to base 2.
The outer two loops are independent of each other, while the innermost loop depends only on i. There is a single operation (an increment) in the innermost loop, which means that the number of visits to the innermost loop equals the summation result.
\sum_i=1..( floor(ld(N)) ) {
\sum_j=1..( floor(ld(N)) ) {
\sum_k=1..2^i { 1 }
}
}
// adjust innermost summation bounds
= \sum_i=1..( floor(ld(N)) ) {
\sum_j=1..( floor(ld(N)) ) {
-1 + \sum_k=0..2^i { 1 }
}
}
// swap outer summations and resolve innermost summation
= \sum_j=1..( floor(ld(N)) ) {
\sum_i=1..( floor(ld(N)) ) {
2^i
}
}
// resolve inner summation
= \sum_j=1..( floor(ld(N)) ) {
2^(floor(ld(N)) + 1) - 2
}
// resolve outer summation
= floor(ld(N)) * ( 2^(floor(ld(N)) + 1) - 2 )
≈ 2 * N * ld(N) - 2 * ld(N)
This amounts to O(N log N) (the second term in the expression vanishes asymptotically with respect to the first) in Big-Oh notation.
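As a sanity check, counting the increments directly (just a sketch; the sample values of N are arbitrary) shows sum growing in line with N * log2(N), up to a constant factor:

#include <math.h>
#include <stdio.h>

int main(void) {
    for (int N = 1024; N <= 8192; N *= 2) {
        long sum = 0;
        for (int i = 1; i <= N; i = i * 2)
            for (int j = 1; j <= N; j = j * 2)
                for (int k = 1; k <= i; k++)
                    sum++;
        /* constant factors aside, sum should scale like N * log2(N) */
        printf("N = %5d  sum = %8ld  N*log2(N) = %8.0f\n",
               N, sum, N * log2((double)N));
    }
    return 0;
}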
To my understanding, the outer loop will take log N steps, the next loop will also take log N steps, and the innermost loop will take at most N steps (although this is a very rough bound). In total, the loop has a runtime complexity of at most ((log N)^2)*N, which can probably be improved.
I'm working on a data structures course and I'm not sure how to proceed with this Big-O analysis:
sum = 0;
for (i = 1; i < n; i++)
    for (j = 1; j < i*i; j++)
        if (j % i == 0)
            for (k = 0; k < j; k++)
                sum++;
My initial idea is that this is O(n^3) after reduction, because the innermost loop will only run when j/i has no remainder and the multiplication rule is inapplicable. Is my reasoning correct here?
Let's ignore the outer loop for a second and analyze the rest in terms of i.
The mid loop runs i^2 times and invokes the inner loop whenever j % i == 0, that is, for j = i, 2i, 3i, ..., (i-1)*i, and each time the inner loop runs up to the relevant j. So the total running time of the inner loop is:
i + 2i + 3i + ... + (i-1)*i = i * (1 + 2 + ... + (i-1)) = i * [i*(i-1)/2]
The last equality comes from the formula for the sum of an arithmetic progression.
The above is O(i^3).
Repeat this for the outer loop, which runs from 1 to n, and you get a running time of O(n^4), since you actually have:
C*1^3 + C*2^3 + ... + C*(n-1)^3 = C*(1^3 + 2^3 + ... + (n-1)^3)
= C/4 * (n^4 - 2n^3 + n^2)
The last equality comes from the formula for the sum of cubes.
And the above is O(n^4), which is your complexity.
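If you want to check the O(n^4) result empirically, a direct counter (a rough sketch; the sample sizes are arbitrary) shows sum growing by a factor of roughly 16 each time n doubles:

#include <stdio.h>

/* count how many times sum++ executes in the fragment above */
static long long count_ops(int n) {
    long long sum = 0;
    for (int i = 1; i < n; i++)
        for (int j = 1; j < i * i; j++)
            if (j % i == 0)
                for (int k = 0; k < j; k++)
                    sum++;
    return sum;
}

int main(void) {
    /* doubling n should multiply the count by roughly 16 */
    for (int n = 25; n <= 200; n *= 2)
        printf("n = %4d  sum = %12lld\n", n, count_ops(n));
    return 0;
}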