Big O Remainder/Modulus Loops - for-loop

I am working out a function:
total = 0;
for (i = 0; i < N; i++) {
    for (j = 0; j < i*i; j++) {
        if (j % i == 0) {
            for (k = 0; k < j; k++) {
                total++;
            }
        }
    }
}
Is the Big O for this N^4 or N^5 when you break it down? I am not sure how to handle the % sign and the running time of that inner loop.

A roughly equivalent code would be
total = 0;
for (i = 1; i <= N; i++)
    for (j = 1; j <= i*i; j += i)
        for (k = 1; k <= j; k++)
            total++;
by restricting j to those values that are actually divisible by i. Shifting the range of each variable by one avoids the issue of having i = 0.
Rewriting again gives
total = 0;
for (i = 1; i <= N; i++)
    for (j = 1; j <= i; j += 1)
        for (k = 1; k <= j*j; k++)
            total++;
The j loop iterates the same number of times, but instead of stepping j through the multiples of i directly, we iterate over plain integers and shift the multiplication into the k loop's bound. From this, it should be a little easier to prove that total is incremented O(N^4) times: the inner k loop is entered O(N^2) times, and each time it iterates over O(N^2) values.
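For what it's worth, the count for that last version can also be written out as a summation (a sketch, using the standard closed forms for sums of squares and cubes):
total = Σ_{i=1}^{N} Σ_{j=1}^{i} j^2 = Σ_{i=1}^{N} i(i+1)(2i+1)/6 ≈ Σ_{i=1}^{N} i^3/3 ≈ N^4/12
which is Θ(N^4), in line with the O(N^4) bound above.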

Time complexity question 3 loops + if statement

I have some trouble finding the time complexity of the code below. I figured that the if statement will run approximately n times; however, I could not manage to describe it mathematically. Thanks in advance.
int sum = 0;
for (int i = 1; i < n; i++) {
    for (int j = 1; j < i*i; j++) {
        if (j % i == 0) {
            for (int k = 0; k < j; k++) {
                sum++;
            }
        }
    }
}
Outer loop
Well, it's clear that it's O(n) because i is bounded by n.
Inner loops
If we take a look at the second loop alone, then it looks as follows:
...
for (int j = 1; j < i*i; j++) {
...
j is bounded by i*i or simply n^2.
However, the innermost loop won't be executed for every j, but only for the js that are divisible by i, because that's what the constraint j % i == 0 means. Since j ranges up to i*i, there are only about i such values of j. Each time the innermost loop does run, it performs at most j < i^2 iterations, so the work done in the inner loops is bounded by i * i^2 = i^3, or simply n^3.
Result
Hence, the overall time complexity is O(n^4).
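If you want to sanity-check that bound empirically, here is a small sketch (the count_ops helper is just for illustration) that counts how often sum++ runs and compares it with n^4:

#include <stdio.h>

/* Count how many times the innermost statement runs for a given n. */
static long long count_ops(int n) {
    long long count = 0;
    for (int i = 1; i < n; i++)
        for (int j = 1; j < i * i; j++)
            if (j % i == 0)
                for (int k = 0; k < j; k++)
                    count++;
    return count;
}

int main(void) {
    /* The ratio count / n^4 should approach roughly 1/8 as n grows,
       consistent with the O(n^4) bound. */
    for (int n = 50; n <= 200; n *= 2) {
        long long c = count_ops(n);
        printf("n = %3d  count = %12lld  count/n^4 = %.4f\n",
               n, c, (double)c / ((double)n * n * n * n));
    }
    return 0;
}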

Calculate the big-O and big-Omega of the following piece of code

I was asked to find the big-O and big-Omega of the code below, given that the function f is O(log(n)) and Ω(1), and the function g is O(n) and Ω((log(n))^2):
for (int i = n; i >= 0; i /= 2)
    if (f(i) <= g(i))
        for (int j = f(i)*f(i); j < n; j++)
            f(j);
The big problem I have is that I don't know how to incorporate the complexity of the functions into the calculation. I know how to calculate the complexity of loops that look like this:
for (int i = 0; i < n*2; i++) {
    ....
}
or like this
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
    }
}
Thank you in advance.
This is what I've tried:
for (int i = n; i >= 0; i /= 2)              // this is approximately O(log(n))
    if (f(i) <= g(i))                        // because O(log(n)) <= O(n), this line is O(n)
        for (int j = f(i)*f(i); j < n; j++)  // O(n*log(n))
            f(j);                            // O(log(n))
So by my calculation I get O(log(n) * n * n * log(n) * log(n)) = O(n^2 * log^3(n))
This is a tricky question, because the loop execution depends on the values returned by the functions f and g. However, remember that you need to estimate the worst case, so you need to assume two things:
f(i) <= g(i) is always true, so the internal loop always executes;
the internal loop starts from 0, because that is the minimal value you can get as a result of squaring f(i).
So, your piece of code becomes much simpler:
for (int i = n; i >= 0; i /= 2)
{
    f(i);
    g(i);
    f(i);
    f(i);
    for (int j = 0; j < n; j++)
        f(j);
}
I think you can take over from here.
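For reference, a rough worst-case tally under those assumptions (just a sketch of the final step, assuming the halving loop is meant to stop once i reaches 0): the outer loop runs about log(n) times; each pass evaluates g(i) in O(n) and f(i) in O(log(n)), and the inner loop does n iterations of f(j) at O(log(n)) each, i.e. O(n*log(n)) per pass. That gives roughly
O(log(n) * (n + n*log(n))) = O(n*log^2(n))
for the big-O; the big-Omega follows the same pattern, using the lower bounds Ω(1) and Ω((log(n))^2) and the case where the if condition never holds.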

What is the running time T(n) of this algorithm?

What is the running time T(n) of a program implementing this algorithm? What is the total time, T(n) ≈ c_op * C(n), where c_op is the cost of one basic operation and C(n) is the number of times it is executed?
sum = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        sum++;
for (k = 0; k < n; k++)
    A[k] = k;
Nested loops
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        sum++;
bring
n - iterations of the outer loop
(n + 1) / 2 - iterations of the inner loop, on average
n * (n + 1) / 2 == 0.5 * (n^2 + n) == O(n^2)
operations. You can implement a better O(n) routine:
sum = n > 0 ? n * (n + 1) / 2 : 0;
for (k = 0; k < n; k++)
    A[k] = k;
You reach the instruction sum++; n(n+1)/2 times and the instruction A[k]=k; n times.
The total would be T(n)=(n^2+3n)/2.
If you want an exact analysis, it looks like the following (working from the inside out):
T(n) = Σ_{i=1}^{n} Σ_{j=1}^{i} c1 + Σ_{k=0}^{n-1} c2 = c1 * n(n+1)/2 + c2 * n
where c1, c2 are constants (the costs of executing sum++ and A[k] = k once, respectively).
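If you'd like to check the (n^2 + 3n)/2 total concretely, here is a small sketch that counts both instructions directly (the ops counter is just for illustration):

#include <stdio.h>

int main(void) {
    const int n = 10;
    int A[10];
    int sum = 0;
    long long ops = 0;   /* counts executions of sum++ and A[k] = k */

    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++) {
            sum++;
            ops++;       /* one execution of sum++ */
        }
    for (int k = 0; k < n; k++) {
        A[k] = k;
        ops++;           /* one execution of A[k] = k */
    }

    /* For n = 10 this prints ops = 65 and (n^2 + 3n)/2 = 65. */
    printf("sum = %d, ops = %lld, (n^2 + 3n)/2 = %d\n",
           sum, ops, (n * n + 3 * n) / 2);
    return 0;
}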

Complexity of a triple for loop

for (I = 0; I < n; I++)
    for (j = I; j < n; j++)
        for (k = I; k < n; k++)
            statement;
outer loop runs n times.
2nd loop runs (n - I) times = n(n-1)/2 times.
3rd loop runs (n- I) times = n(n-1)/2 times.
so statement will run (n(n-1)/2)^2 times.
Is this correct?
You can count like this to check whether it is right or not:
int Cnt = 1; // initialization
for (I = 0; I < n; I++)
    for (j = I; j < n; j++)
        for (k = I; k < n; k++, Cnt++)
            printf("This is the %dth time\n", Cnt);
It is O(n^3), because
O(n^3 + AnyConst*n^2 + AnyOtherConst*n + ThirdConst) = O(n^3)
O notation estimates asymptotic behavior as n goes to infinity; therefore, only the fastest-growing component matters.
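To back this up with a summation (a sketch using the standard sum-of-squares formula): for a fixed I, each of the two inner loops runs (n - I) times, so the statement executes
Σ_{I=0}^{n-1} (n - I)^2 = Σ_{m=1}^{n} m^2 = n(n+1)(2n+1)/6
times, which is Θ(n^3). The (n(n-1)/2)^2 guess over-counts because it multiplies the two loops' grand totals instead of summing (n - I)^2 over I.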

Algorithmic complexity of o(n)

I recently started playing with algorithms from this Princeton course, and I observed the following pattern:
O(N)
double max = a[0];
for (int i = 1; i < N; i++)
    if (a[i] > max) max = a[i];
O(N^2)
for (int i = 0; i < N; i++)
    for (int j = i+1; j < N; j++)
        if (a[i] + a[j] == 0)
            cnt++;
O(N^3)
for (int i = 0; i < N; i++)
    for (int j = i+1; j < N; j++)
        for (int k = j+1; k < N; k++)
            if (a[i] + a[j] + a[k] == 0)
                cnt++;
The common pattern here is that as the loop nesting grows, the exponent also increases.
Is it safe to assume that if I have 20 nested for loops, my complexity would be O(N^20)?
PS: Note that 20 is just a random number I picked, and yes if you nest 20 for loops in your code there is clearly something wrong with you.
It depends on what the loops do. For example, if I change the bound of the 2nd loop so that it does just 3 iterations, like this:
for (int i = 0; i < N; i++)
    for (int j = i; j < i+3; j++)
        if (a[i] + a[j] == 0)
            cnt++;
we get back to O(N)
The key is whether the number of iterations in the loop is related to N and increases linearly as N does.
Here is another example where the 2nd loop goes up to N^2:
for (int i = 0; i < N; i++)
    for (int j = i; j < N*N; j++)
        if (a[i] + a[j] == 0)
            cnt++;
This would be O(N^3).
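A rough tally backing that up (a sketch): for each of the N values of i, the j loop runs about N^2 - i ≈ N^2 times, so
Σ_{i=0}^{N-1} (N^2 - i) ≈ N * N^2 = N^3
iterations in total.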
Yes, if the length of each loop is proportional to N and the loops are nested within each other as you described.
In your specific pattern, yes. But it is not safe to assume that in general. You need to check whether the number of iterations in each loop is O(n) regardless of the state of all the enclosing loops. Only after you have verified that this is the case can you conclude that the complexity is O(n^(loop nesting level)).
Yes. Even if you decrease the interval of iteration, Big-O notation describes behavior as N increases towards infinity, and since all of your loops' lengths grow proportionally to N, such an algorithm would indeed have time complexity O(N^20).
I strongly recommend that you understand why a doubly nested loop with each loop running from 0 to N is O(N^2). Use summations to evaluate the number of steps in the for loops; after dropping constants and lower-order terms, you will get the Big-Oh of the algorithm.
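Applied to the O(N^2) example above, that approach looks like this (a sketch):
Σ_{i=0}^{N-1} Σ_{j=i+1}^{N-1} 1 = Σ_{i=0}^{N-1} (N - 1 - i) = N(N-1)/2
and dropping the constant factor and the lower-order term leaves O(N^2).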
