Here is the fragment:
sum1 = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        sum1++;

sum2 = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= k; j++)
        sum2++;
Below is the answer:
2 assignment statements – O(1) each
1st nested loop – O(n^2)
2nd nested loop – O(n)
Running time complexity of the code fragment = O(1) + O(n^2) + O(1) + O(n) = O(n^2)
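A quick way to sanity-check those counts is to run the fragment and compare the two counters against n^2 and 2n (a minimal, self-contained C sketch; the value n = 1024 is just a hypothetical input size):

#include <stdio.h>

int main(void) {
    int n = 1024;                        /* hypothetical input size */
    long sum1 = 0, sum2 = 0;

    for (int i = 1; i <= n; i++)         /* first nested loop: n * n iterations */
        for (int j = 1; j <= n; j++)
            sum1++;

    for (int k = 1; k <= n; k *= 2)      /* second nested loop: 1 + 2 + 4 + ... + n iterations */
        for (int j = 1; j <= k; j++)
            sum2++;

    printf("sum1 = %ld (n^2 = %ld)\n", sum1, (long)n * n);      /* sum1 == n^2 */
    printf("sum2 = %ld (2n - 1 = %d)\n", sum2, 2 * n - 1);      /* sum2 == 2n - 1 when n is a power of two */
    return 0;
}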
But here is how I worked it out:
2 assignments: O(1).
First nested loop: O(n*n) = O(n^2).
Second nested loop:
The outer loop runs n times.
The inner loop will then be executed (1 + 2 + 3 + ... + (n-1) + n) times,
which gives n(n+1)/2 = O(n^2).
Total running time = O(n^2) + O(n^2) + O(1) = O(n^2).
And yes, I've done some research and came across the following:
In a loop, if the index jumps by an increasing amount in each iteration, the loop has complexity log n.
In that case I suppose the second loop will have complexity (n-1)/2 * log n, which would be O(n log n).
I'm really confused about whether the second loop should be O(n), O(n^2) or O(n log n).
HELP PLEASE
Since k doubles each time, your calculation is not correct. The inner loop counts should be (1 + 2 + 4 + ... + n/2 + n), because of
for (k = 1; k <= n; k *= 2)
That sum is at most 2n, so the answer's O(n) for the second loop is right (O(n log n) is also an upper bound, but it is not tight), and the whole fragment is still O(n^2).
Related
I'm trying to find the complexity of the following algorithm:
for (i = 1; i <= n; i++) {
    for (j = 1; j <= i; j++) {
        for (k = i; k <= j; k++) {
            // code
        }
    }
}
Since your k starts at i and goes up to j, your worst-case time complexity is O(n^2). Let's take an example and see. For i = 4, j goes from 1 to 4; for j = 1, 2 and 3 the k loop never executes (k starts at 4, which is already greater than j), and for j = 4 it executes exactly once. Therefore, for each value of j, the innermost loop runs in O(1) time. The outer two loops take O(n^2) time, and the //code inside the innermost loop runs in O(1) time. Therefore, the time complexity of this algorithm is O(n^2).
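To see concretely why the middle loop dominates, here is a small C sketch (with a hypothetical n = 1000) that counts how often the j loop iterates versus how often the innermost body actually runs:

#include <stdio.h>

int main(void) {
    int n = 1000;                  /* hypothetical input size */
    long j_iterations = 0, inner_body = 0;

    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= i; j++) {
            j_iterations++;        /* executed n(n+1)/2 times -> Theta(n^2) */
            for (int k = i; k <= j; k++)
                inner_body++;      /* executes only when j == i -> n times in total */
        }
    }

    printf("j-loop iterations:  %ld (n(n+1)/2 = %ld)\n", j_iterations, (long)n * (n + 1) / 2);
    printf("innermost body runs: %ld (n = %d)\n", inner_body, n);
    return 0;
}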
For part A of this question, I know the Big-O is n^2, because the outer loop can run at most (n-1) times and the inner loop can run at most n(n+1)/2 = n^2/2 + n/2 times in total; since we are calculating the Big-O, we only keep the dominant term, hence we have (n * n) = O(n^2).
But for part B, I know the array is A[1..n] = {1, 1, 4, 7, 10, ..., 3(n-2)+1}.
From my understanding, the outer loop has at least (n-1) iterations and the inner loop has at least (n/2) iterations. So we have (n * n/2) = cn^2 = n^2; is this correct?
According to the answer sheet there are at least n^2/4 iterations, which is Big-Omega(n^2). I just don't understand how they get n^2/4 and not n^2/2. Can someone explain how to do part B in detail please? Thanks.
You are correct: the best-case time complexity of the bizzare() procedure is Big-Omega(n^2/2), assuming the inner loop gets executed for all i.
Look at it this way:
Let n = A.size(),
so for i=2 the inner loop will run at least once,
for i=3 the inner loop will run at least twice,
for i=4 it runs at least three times, and so on.
So the total best-case complexity is Big-Omega(sum of the first n-1 natural numbers) = Big-Omega(n(n-1)/2) = Big-Omega(n^2). Also note that Big-Omega(n^2/2) = Big-Omega(n^2/4). If you take the average number of outer-loop iterations times the average number of inner-loop iterations, that gives you about n^2/4 iterations on average, assuming the data is uniformly distributed so that half the elements go to the if block and half go to the else block. The constant really doesn't matter.
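The arithmetic behind those two figures, written out in LaTeX (a sketch under the same assumptions as above: the inner loop executes for every i, and the data splits evenly between the if and else branches):

\Omega\left(\sum_{i=2}^{n}(i-1)\right) = \Omega\left(\frac{n(n-1)}{2}\right) = \Omega(n^2)

\frac{n}{2}\cdot\frac{n}{2} = \frac{n^2}{4} = \Omega(n^2) \quad \text{(average outer iterations times average inner iterations)}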
If I have the following code in .NET:
for i = 0 to n
    for j = 0 to n
        m = i * j
    next j
next i
so I have done the following complexity analysis:
Is this correct? Also, in which case could a double loop give O(n) complexity?
In the simplest case, the complexity of two nested loops is equal to the complexity of one loop multiplied by the complexity of the other. The demonstration is almost identical to what you posted.
Therefore, if you want the complexity of the two nested loops to be O(n), you could make one of the loops execute in constant time:
for i = 0 to 10
    for j = 0 to n
        m = i * j
    next j
next i
or you could have each of them execute in O(n^(1/2)), i.e. O(sqrt(n)):
for i = 0 to sqrt(n)
    for j = 0 to sqrt(n)
        m = i * j
    next j
next i
or other variations of the sort.
A particularly interesting case is when the number of iterations in the inner loop depends on the counter of the outer loop.
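A minimal C sketch of that pattern (with a placeholder n): the inner bound depends on the outer counter i, yet the two loops together still do only O(n) work.

#include <stdio.h>

int main(void) {
    int n = 1000;            /* placeholder input size */
    long work = 0;

    for (int i = 0; i <= n; i++)
        for (int j = 0; j < i % 2; j++)   /* inner bound depends on i: 0 iterations when i is even, 1 when odd */
            work++;

    printf("work = %ld (about n/2 = %d)\n", work, n / 2);   /* O(n) overall */
    return 0;
}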
Can you explain to me how to find the time complexity of this?
sum = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= k; j++)
        sum++;
So, I know the outer loop has time complexity O(log n), but since the number of iterations of the inner loop depends on the value of the outer loop's counter, the complexity of this algorithm is not simply O(n log n).
The book says it is O(n).
I really don't understand how it is O(n). Can someone please explain it?
I'll be really grateful if you could go into the details btw :D
A mathematical solution would help me understand better.
Just see how many times the inner loop runs:
1 + 2 + 4 + 8 + 16 + ... + n
Note that if n = 32, then this sum = 31 + 32 = 63, which is about 2n.
This is because the sum of all the terms except the last term is almost equal to the last term.
Hence the overall complexity = O(n).
EDIT:
The geometric series sum (http://www.mathsisfun.com/algebra/sequences-sums-geometric.html) is of the order of:
(2^(log n + 1) - 1)/(2 - 1) = 2n - 1.
The outer loop executes log2(n) times, so it is O(log2 n).
The inner loop executes k times for each iteration of the outer loop, and after each outer iteration k is doubled (k becomes k*2).
So the total number of inner loop iterations
= 1 + 2 + 4 + 8 + ... + 2^(log2 n)
= 2^0 + 2^1 + 2^2 + ... + 2^(log2 n)   (geometric series)
= (2^(log2 n + 1) - 1)/(2 - 1)
= 2n - 1
= O(n)
so the inner loop contributes O(n) in total.
So the total time complexity is O(n), as O(n + log2 n) = O(n).
UPDATE: It is also O(n log n), because n log n >> n for large values of n, but that bound is not asymptotically tight; you can say it is o(n log n) [little-o].
I believe you should proceed as follows to formally obtain your algorithm's order of growth, using mathematics (sigma notation):
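A sketch of that derivation (assuming, for simplicity, that n is a power of two, so the outer counter k takes the values 2^0, 2^1, ..., 2^(log2 n)):

T(n) = \sum_{i=0}^{\log_2 n} \sum_{j=1}^{2^i} 1
     = \sum_{i=0}^{\log_2 n} 2^i
     = 2^{\log_2 n + 1} - 1
     = 2n - 1
     = \Theta(n)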
Our prof and various materials say Summation(n) = n(n+1)/2 and hence is theta(n^2). But intuitively, we just need one loop to find the sum of the first n terms! So it has to be theta(n). I'm wondering what I am missing here?!
All of these answers are misunderstanding the problem, just like the original question: the point is not to measure the runtime complexity of an algorithm for summing integers, it's about how to reason about the complexity of an algorithm that takes i steps during pass i, for i in 1..n. Consider insertion sort: on step i, the output list is already about i elements long, so it takes on the order of i steps (on average) to perform the insert. What is the complexity of insertion sort? It's the sum of all of those steps, i.e. the sum of i for i in 1..n. That sum is n(n+1)/2, which has an n^2 in it, thus insertion sort is O(n^2).
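A small C sketch of that reasoning (the input array is hypothetical): pass i shifts up to i elements of the sorted prefix, so a step counter ends up close to the summation value.

#include <stdio.h>

int main(void) {
    int a[] = {9, 7, 5, 3, 1, 8, 6, 4, 2, 0};   /* hypothetical input */
    int n = sizeof a / sizeof a[0];
    long steps = 0;

    /* Insertion sort: pass i inserts a[i] into the sorted prefix a[0..i-1],
       which can take up to i shifts. */
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];
            j--;
            steps++;                             /* one shift per step */
        }
        a[j + 1] = key;
    }

    /* In the worst case steps = 1 + 2 + ... + (n-1) = n(n-1)/2, hence O(n^2). */
    printf("steps = %ld, worst case n(n-1)/2 = %d\n", steps, n * (n - 1) / 2);
    return 0;
}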
The running time of this code is Θ(1) (assuming addition, subtraction and multiplication are constant-time operations):
result = n*(n + 1)/2 // This statement executes once
The running time of the following pseudocode, which is what you described, is indeed Θ(n):
result = 0
for i from 1 up to n:
    result = result + i    // This statement executes exactly n times
Here is another way to compute it which has a running time of Θ(n²):
result = 0
for i from 1 up to n:
    for j from i up to n:
        result = result + 1    // This statement executes exactly n*(n + 1)/2 times
All three of those code blocks compute the sum of the natural numbers from 1 to n.
This Θ(n²) loop is probably the type you are being asked to analyse. Whenever you have a loop of the form:
for i from 1 up to n:
    for j from i up to n:
        // Some statements that run in constant time
You have a running time complexity of Θ(n²), because those statements execute exactly summation(n) times.
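Written out in sigma notation, the number of times those constant-time statements execute is:

\sum_{i=1}^{n} \sum_{j=i}^{n} 1 = \sum_{i=1}^{n} (n - i + 1) = \sum_{k=1}^{n} k = \frac{n(n+1)}{2} = \Theta(n^2)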
I think the problem is that you're incorrectly assuming that the summation formula has time complexity theta(n^2).
The formula has an n^2 in it, but it doesn't require a number of computations or amount of time proportional to n^2.
Summing everything up to n in a loop would be theta(n), as you say, because you would have to iterate through the loop n times.
However, calculating the result of the equation n(n+1)/2 would just be theta(1) as it's a single calculation that is performed once regardless of how big n is.
Summation(n) = n(n+1)/2 refers to the sum of the numbers from 1 to n. It is a closed-form mathematical formula and can be evaluated without a loop, in O(1) time. If you iterate over an array to sum all its values, that is an O(n) algorithm.