The running time of a for/while loop is the number of iterations times the running time of the statements in the body.
Sum=0
for (i=0; i< n; i++)
For (j=0;j
Please help me, guys. Everyone is saying its complexity is O(n^5) and I'm stuck.
"Sum=0 for (i=0; i< n; i++) For (j=0;j " is not finished so we can only guess whats following. In case its Sum=0 for(i=0; i< n; i++) For (j=0;j < n;j++) then its complexity is n^2 since you are doing the second loop n times and the second loop goes through n iterations so you get n^2. Just to clarify a bit in the case of : for(i=0; i< n; i++) For ( j=0 ;j < y ; j++ ) , the complexity would be n * y.
I'm having some trouble counting the operations in these short bits of code. Can someone explain how to count operations in code? I understand big-oh; I just need some insight on how to count and express it in big-oh.
My counting is below.
1.
sum = 0; --> 1 operation
for( i = 0; i < n; i++ ) --> 1 + (n+1) + n = 2n + 2 (init once, test n+1 times, increment n times)
sum++; --> n operations
total = 1 + (2n + 2) + n = 3n + 3, which is O(n)
2.
sum = 0;
for( i = 0; i < n2; i++)
    for( j = 0; j < i; j++)
        sum++;
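For the second snippet, assuming n2 means n*n, the same style of counting applies: for each value of i the inner loop body runs i times, so sum++ executes 0 + 1 + 2 + ... + (n*n - 1) = (n*n)(n*n - 1)/2 times, which is O(n^4).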
I have this question as follows:
int sum = 0;
for(int i = 0; i*i<N; i++)
    for(int j=0; j*j<4*N; j++)
        for(int k=0; k<N*N; k++)
            sum++;
How do I find the order of growth of the worst-case running time of the code above? Please explain step by step.
Assuming N is your parameter
for(int i = 0; i*i < N; i++) -
O(sqrt(N))
for(int j=0; j*j<4*N; j++) -
O(sqrt(N))
for(int k=0; k < N*N; k++) -
O(N^2)
The body of the innermost loop is constant time.
Each loop is nested inside the previous one, so you multiply the counts: sqrt(N) * sqrt(N) * N^2 = N^3, giving O(N^3) overall.
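If you want to sanity-check this, here is a small C sketch (my own, not from the question) that counts the sum++ executions for a few values of N; the count grows like N^3, in fact roughly 2*N^3, because the j loop actually runs about 2*sqrt(N) times:

#include <stdio.h>

int main(void) {
    /* Count how many times sum++ executes for a few N and compare to N^3. */
    for (long N = 4; N <= 256; N *= 4) {
        long long sum = 0;
        for (long i = 0; i * i < N; i++)
            for (long j = 0; j * j < 4 * N; j++)
                for (long k = 0; k < N * N; k++)
                    sum++;
        printf("N = %4ld: sum = %10lld, N^3 = %10lld, ratio = %.2f\n",
               N, sum, (long long)N * N * N, (double)sum / ((double)N * N * N));
    }
    return 0;
}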
I am working out a function:
total = 0;
for (i = 0; i < N; i++){
    for (j = 0; j < i*i; j++){
        if (j % i == 0){
            for (k = 0; k < j; k++){
                total++;
            }
        }
    }
}
Is the Big-O for this N^4 or N^5 when you break it down? I am not sure how to handle the % operator or the running time of that inner loop.
A roughly equivalent code would be
total = 0;
for (i = 1; i <= N; i++)
    for (j = 1; j <= i*i; j += i)
        for (k = 1; k <= j; k++)
            total++;
by restricting j to those values that are actually divisible by i. Shifting the range of each variable by one avoids the issue of having i = 0.
Rewriting again gives
total = 0;
for (i = 1; i <= N; i++)
    for (j = 1; j <= i; j += 1)
        for (k = 1; k <= j*j; k++)
            total++;
The j loop iterates the same number of times, but instead of ranging over the square numbers directly, we simply iterate over the plain integers and shift the multiplication into the k loop. From this it should be a little easier to prove that total is incremented O(N^4) times: the k loop is entered O(N^2) times (once per (i, j) pair), and each time it iterates over O(N^2) values.
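As a quick check (this counting program is my own sketch, not part of the question), here is a small C program that counts the increments in the original nest and in the second rewrite; both grow like N^4, just with different constant factors:

#include <stdio.h>

int main(void) {
    for (long N = 4; N <= 64; N *= 2) {
        long long original = 0, rewritten = 0;

        /* Original nest: for i == 0 the j loop is empty, so j % i is never evaluated. */
        for (long i = 0; i < N; i++)
            for (long j = 0; j < i * i; j++)
                if (j % i == 0)
                    for (long k = 0; k < j; k++)
                        original++;

        /* Second rewrite from above. */
        for (long i = 1; i <= N; i++)
            for (long j = 1; j <= i; j++)
                for (long k = 1; k <= j * j; k++)
                    rewritten++;

        printf("N = %2ld: original = %10lld, rewritten = %10lld, N^4 = %10lld\n",
               N, original, rewritten, (long long)N * N * N * N);
    }
    return 0;
}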
for(I = 0; I < n; I++)
    for(j = I; j < n; j++)
        for(k = I; k < n; k++)
            statement;
The outer loop runs n times.
The 2nd loop runs (n - I) times, i.e. n(n-1)/2 times in total.
The 3rd loop runs (n - I) times, i.e. n(n-1)/2 times in total.
So the statement will run (n(n-1)/2)^2 times.
Is this correct?
You can count like this to check whether it is right or not:
int Cnt = 1; // initialization
for(I = 0; I < n; I++)
    for(j = I; j < n; j++)
        for(k = I; k < n; k++, Cnt++)
            printf("This is the %dth time\n", Cnt);
It is O(n^3), because the exact count is a cubic polynomial in n, and
O(n^3 + AnyConst*n^2 + AnyOtherConst*n + ThirdConst) = O(n^3).
Big-O notation estimates asymptotic behavior as n goes to infinity, so only the fastest-growing term matters.
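If you want the exact count rather than the printout: for each value of I the two inner loops each run (n - I) times, so the statement executes n^2 + (n-1)^2 + ... + 1^2 = n(n+1)(2n+1)/6 times, a cubic in n, hence O(n^3). (The (n(n-1)/2)^2 figure in the question over-counts: both inner loops depend on the same I, so you cannot simply multiply their grand totals.)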
I recently started playing with algorithms from this Princeton course and I observed the following pattern:
O(N)
double max = a[0];
for (int i = 1; i < N; i++)
    if (a[i] > max) max = a[i];
O(N^2)
for (int i = 0; i < N; i++)
    for (int j = i+1; j < N; j++)
        if (a[i] + a[j] == 0)
            cnt++;
O(N^3)
for (int i = 0; i < N; i++)
    for (int j = i+1; j < N; j++)
        for (int k = j+1; k < N; k++)
            if (a[i] + a[j] + a[k] == 0)
                cnt++;
The common pattern here is that as the loop nesting grows, the exponent also increases.
Is it safe to assume that if I have 20 nested for loops my complexity would be O(N^20)?
PS: Note that 20 is just a random number I picked, and yes, if you nest 20 for loops in your code, there is clearly something wrong with you.
It depends on what the loops do. For example, if I change the bound of the 2nd loop so it does just 3 iterations, like this:
for (int i = 0; i < N; i++)
    for (int j = i; j < i+3; j++)
        if (a[i] + a[j] == 0)
            cnt++;
we get back to O(N), because the inner loop now does a constant number of iterations.
The key is whether the number of iterations in the loop is related to N and increases linearly as N does.
Here is another example, where the 2nd loop goes up to N^2:
for (int i = 0; i < N; i++)
    for (int j = i; j < N*N; j++)
        if (a[i] + a[j] == 0)
            cnt++;
This would be O(N^3).
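To spell out the counting: for each of the N values of i, the inner loop runs N*N - i times, so the total is (N^2 - 0) + (N^2 - 1) + ... + (N^2 - (N-1)) = N^3 - N(N-1)/2, and the dominant N^3 term gives O(N^3).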
Yes, if the length of the loop is proportional to N and the loops are nested within each other like you described.
In your specific pattern, yes. But it is not safe to assume that in general. You need to check whether the number of iterations in each loop is O(n) regardless of the state of all the enclosing loops. Only after you have verified that this is the case can you conclude that the complexity is O(n^(loop nesting level)).
Yes. Even though each inner loop iterates over a shorter interval, Big-O notation describes behavior as N increases towards infinity, and since all of your loops' lengths still grow proportionally to N, such an algorithm would indeed have time complexity O(N^20).
I strongly recommend that you understand why a doubly nested loop with each loop running from 0 to N is O(N^2). Use summations to evaluate the number of steps involved in the for loops; after dropping constants and lower-order terms, you will get the Big-Oh of that algorithm.
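As a worked example of that advice, applied to the O(N^2) snippet from the question (j starting at i+1): the inner loop runs N-1-i times for each i, so the total number of iterations is (N-1) + (N-2) + ... + 1 + 0 = N(N-1)/2 = N^2/2 - N/2. Dropping the constant factor and the lower-order N term leaves O(N^2).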