What is the running time T(n) of a program implementing this algorithm, i.e. what is the total time?
T(n) ≈ c_op * C(n),
where c_op is the execution time of the algorithm's basic operation and C(n) is the number of times that operation is executed.
sum = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        sum++;
for (k = 0; k < n; k++)
    A[k] = k;
The nested loops
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        sum++;
bring
n iterations of the outer loop
(n + 1) / 2 iterations of the inner loop on average
n * (n + 1) / 2 == 0.5 * (n^2 + n) == O(n^2)
operations in total. You can implement a better O(n) routine:
sum = n > 0 ? n * (n + 1) / 2 : 0;
for (k = 0; k < n; k++)
    A[k] = k;
You reach the instruction sum++; n(n+1)/2 times and the instruction A[k]=k; n times.
The total would be T(n)=(n^2+3n)/2.
If you want an exact analysis, working from the inside out gives
T(n) = c1 * n(n+1)/2 + c2 * n,
where c1 and c2 are the constant costs of one sum++ and one A[k] = k, respectively.
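As a rough sanity check (a sketch of my own; the class and variable names are just illustrative), you can count how often each instruction actually runs and compare against the formulas above:

public class CountOps {
    public static void main(String[] args) {
        int n = 100;
        int[] A = new int[n];
        long sumOps = 0, fillOps = 0;   // counters for the two basic operations

        int sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++) { sum++; sumOps++; }      // count executions of sum++
        for (int k = 0; k < n; k++) { A[k] = k; fillOps++; }       // count executions of A[k] = k

        System.out.println(sumOps == (long) n * (n + 1) / 2);              // expect true: n(n+1)/2
        System.out.println(fillOps == n);                                  // expect true: n
        System.out.println(sumOps + fillOps == (long) (n * n + 3 * n) / 2); // expect true: (n^2+3n)/2
    }
}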
Related
I understand that the innermost for loop is Θ(log n)
and the two outermost for loops are Θ(n^2) because they form an arithmetic sum. The if-statement is my main problem. Does anyone know how to solve this?
int tally = 0;
for (int i = 1; i < n; i++)
{
    for (int j = i; j < n; j++)
    {
        if (j % i == 0)
        {
            for (int k = 1; k < n; k *= 2)
            {
                tally++;
            }
        }
    }
}
Edit:
Now I noticed the loop order: i before j.
In this case, for a given value of i, j varies from i to n, so there are about n/i successful if-conditions.
The program therefore enters the innermost loop about
n/1 + n/2 + n/3 + ... + n/n
times. This is n times the harmonic sum, which grows like n*ln(n).
Each entry into the innermost loop performs about log(n) iterations, so the innermost statement executes about n*log^2(n) times.
As you wrote, the two outermost loops provide O(n^2) complexity, so the overall complexity is O(n^2 + n*log^2(n)); the first term dominates the second, and the overall complexity is quadratic.
int tally = 0;
for (int i = 1; i < n; i++)
{
    // N TIMES
    for (int j = i; j < n; j++)
    {
        // N*N/2 TIMES
        if (j % i == 0)
        {
            // N*logN TIMES
            for (int k = 1; k < n; k *= 2)
            {
                // N*logN*logN TIMES
                tally++;
            }
        }
    }
}
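As a rough empirical check (my own sketch; names and test sizes are arbitrary), you can count the successful if-conditions and compare them with n*ln(n):

public class DivisorCount {
    public static void main(String[] args) {
        for (int n : new int[]{1_000, 4_000, 16_000}) {
            long hits = 0;                       // number of successful if-conditions
            for (int i = 1; i < n; i++)
                for (int j = i; j < n; j++)
                    if (j % i == 0)
                        hits++;
            // The ratio should sit close to 1 (it approaches 1 slowly because of lower-order terms).
            System.out.printf("n=%d  hits=%d  hits/(n ln n)=%.3f%n", n, hits, hits / (n * Math.log(n)));
        }
    }
}

Multiplying that count by the ~log2(n) iterations of the k loop gives the n*log^2(n) estimate above.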
Old answer (wrong)
This complexity is linked with the sum of the sigma_0(n) function (the number-of-divisors function), represented as sequence A006218 (the Dirichlet divisor problem).
We can see that an approximation for the sum of the divisor counts of all values up to n is
n * ( log(n) + 2*gamma - 1 ) + O(sqrt(n))
so the average number of successful if-conditions for loop counter j is ~log(j).
I have an assignment I am not sure about; I have to calculate the time complexity of the following code:
int a[][] = new int[m][n];        //O(1)
int w = 0;                        //O(1)
for (int i = 0; i < m; i++)       //O(n)
    for (int j = 0; j < n; j++)   //O(n)
        if (a[i][j] % 2 == 0)     //O(logn)
            w++;                  //O(1)
So from my O estimations I add them up:
O(1) + O(1) + O(n) * ( O(n) * ( O(log n) + O(1) / 2 ) )
O(1) + O(1) + O(n) * ( O(n log n) + O(n) / 2 )
O(1) + O(1) + ( O(n^2 log n) + O(n^2) / 2 )
= O(n^2 log n)
I'm not sure if my train of thought is correct, could somebody help?
for (int i = 0; i < m; i++)       //O(m)
    for (int j = 0; j < n; j++)   //O(n)
        if (a[i][j] % 2 == 0)     //O(1)
            w++;                  //O(1)
So the total complexity in terms of big-O is:
O(m) * O(n) * ( O(1) + O(1) ) = O(m) * O(n) = O(m*n).
for (int i = 0; i < m; i++) //O(m)
{
for (int j = 0; j <n; j++) //O(n)
{
// your code
}
}
So the i loop will run m times, and for each of those the j loop will run n times.
In total the body executes m*n times, which is the time complexity: O(m*n).
When m = n, this is O(n^2).
Your logic is close except...
int a[][] = new int[m][n];        //O(1)
int w = 0;                        //O(1)
for (int i = 0; i < m; i++)       //O(m)
    for (int j = 0; j < n; j++)   //O(n)
        if (a[i][j] % 2 == 0)     //O(1)
            w++;                  //O(1)
Your if statement embedded in your second for loop is simply referencing an element in an array and doing a basic comparison. This is of time complexity O(1). Also, typically you would not consider initializing variables in a time complexity problem.
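If you want to convince yourself empirically, something like the following sketch (my own illustration; m and n are arbitrary sizes) counts how many times the if-check runs and compares it with m*n:

public class GridCount {
    public static void main(String[] args) {
        int m = 37, n = 53;                 // arbitrary sizes
        int[][] a = new int[m][n];
        int w = 0;
        long checks = 0;                    // how many times the if-condition is evaluated
        for (int i = 0; i < m; i++)
            for (int j = 0; j < n; j++) {
                checks++;
                if (a[i][j] % 2 == 0)
                    w++;
            }
        System.out.println(checks == (long) m * n);   // expect true
    }
}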
I tried to solve the following exercise:
What is the order of growth of the worst case running time of the following code fragment
as a function of N?
int sum = 0;
for (int i = 1; i <= N; i++)
    for (int j = 1; j <= i*i; j++)
        for (int k = 1; k <= j*j; k++)
            sum++;
and I found that the complexity is O(N^4); however, the correct answer is:
The answer is: N^7.
For a given value of i, the body of the innermost loop is executed 1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ 1/3 i^6 times. Summing up over all values of i yields ~ 1/21 N^7.
I would like some help to understand this answer and the correct way to calculate complexity in this case.
EDIT: In particular, I don't understand this statement:
1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ 1/3 i^6
Because for me, 1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ i^4
EDIT:
I'll add a bit of explanation to clear up your confusion about the quote in your question. Let's consider a fixed value of i and focus on the innermost two loops:
for (int j = 1; j <= i*i; j++)
    for (int k = 1; k <= j*j; k++)
        sum++;
How many times is the j-loop iterated? The answer is i^2 times. On each of those iterations, the k-loop is iterated j^2 times, which is different for each outer iteration because the value of j increases from 1 all the way to i^2.
When j = 1, the k-loop iterates 1^2 times. When j = 2, the k-loop iterates 2^2 times. When j=3, 3^2 times. Tallying up the total number of the iterations of the k-loop over all values of j, you have 1^2 + 2^2 + 3^2 + ... + (i^2)^2, since j runs between 1 and i^2. Hopefully that clarifies how you arrive at the following statement:
For a given value of i, the body of the innermost loop is executed 1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ 1/3 i^6 times.
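To see where the 1/3 i^6 comes from (this step is my own addition, not part of the quoted answer), apply the standard sum-of-squares formula with m = i^2:
1^2 + 2^2 + ... + m^2 = m(m + 1)(2m + 1)/6 ~ m^3/3 = (i^2)^3/3 = i^6/3.
Summing i^6/3 over i = 1..N then gives roughly N^7/21, matching the quoted ~ 1/21 N^7.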
The total number of iterations can be expressed in sum form. The innermost loop has exactly j^2 iterations for each (varying) value of j, the middle loop has i^2 iterations for each value of i, and the outermost loop has N iterations. More neatly, the exact number of iterations is:
sum over i = 1..N of ( sum over j = 1..i^2 of j^2 )
Multiplying through, you'll find this is a 7th-order polynomial in N, so it is apparent why this is O(N^7).
In case you doubt the answer above is correct, simply run your own code and compare the value of sum you get with the formula provided above:
var sum = 0;
var N = 10;
for (var i = 1; i <= N; i++)
    for (var j = 1; j <= i*i; j++)
        for (var k = 1; k <= j*j; k++)
            sum++;
function T(N) { return (1/420)*N*(1+N)*(1+2*N)*(8+11*N+21*N*N+20*N*N*N+10*N*N*N*N); }
console.log(sum===T(N));
Here's a demo: http://jsfiddle.net/wby9deax/. No matter what value of N you put in the answer will be correct (note: be careful with large values for N, it will probably freeze up your browser, since the number of iterations grows very rapidly).
int sum = 0;
for (int i = 1; i <= N; i++)             -- O(N^1)
    for (int j = 1; j <= i*i; j++)       -- O(N^2)
        for (int k = 1; k <= j*j; k++)   -- O(N^4)
            sum++;
Since they're nested (and since they're all linear), you get O(N^1 × N^2 × N^4) = O(N^(1+2+4)) = O(N^7)
EDIT: In particular, I don't understand this statement:
1^2 + 2^2 + 3^2 + ... + (i^2)^2 ~ 1/3 i^6
Keep in mind that there are i^2 terms hiding in the "…" part, many more than a handful.
because there will be
N^1 iterations - first for
N^2 iterations - second for
N^4 iterations - third for
and N^1 * N^2 * N^4 = N^7
I think it is a good idea to substitute the loop variables (i, j and k) with their maximum values in terms of N.
for (int i = 1; i <= N; i++)             //<- i = N
    for (int j = 1; j <= i*i; j++)       //<- i*i = N*N
        for (int k = 1; k <= j*j; k++)   //<- j*j = (i*i)*(i*i) = N*N*N*N
In the first loop the number of iterations will be N; that's the simple part.
In the second loop the number of iterations will be N*N, and in the third N*N*N*N.
So finally, the number of iterations will be N (first loop), times N*N (second), times N*N*N*N (third), so N*(N*N)*(N*N*N*N) = N^7.
I am working out a function:
total = 0;
for (i = 0; i < N; i++) {
    for (j = 0; j < i*i; j++) {
        if (j % i == 0) {
            for (k = 0; k < j; k++) {
                total++;
            }
        }
    }
}
Is the Big O number for this N^4 or N^5 when you break it down? I am not sure how to handle the % sign and the run time of that inner loop.
A roughly equivalent code would be
total = 0;
for (i = 1; i <= N; i++)
    for (j = i; j <= i*i; j += i)
        for (k = 1; k <= j; k++)
            total++;
by restricting j to those values that are actually divisible by i. Shifting the range of each variable by one avoids the issue of having i = 0.
Rewriting again gives
total = 0;
for (i = 1; i <= N; i++)
    for (j = 1; j <= i; j += 1)
        for (k = 1; k <= j*j; k++)
            total++;
The j loop iterates the same number of times, but instead of ranging over the multiples of i directly, we iterate over the plain integers 1..i and shift the multiplication into the bound of the k loop. From this, it should be a little easier to prove that total is incremented O(N^4) times: the inner k loop is entered O(N^2) times, and each time iterates over at most O(N^2) values.
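A quick way to convince yourself of the N^4 growth (my own sketch; class name and test sizes are arbitrary) is to count total for N and 2N and check that the ratio is roughly 2^4 = 16:

public class QuarticGrowth {
    // Counts how many times total++ runs in the original fragment for a given N.
    static long count(int N) {
        long total = 0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < i * i; j++)
                if (i != 0 && j % i == 0)   // when i == 0 the j loop never runs, so the guard is only for safety
                    for (int k = 0; k < j; k++)
                        total++;
        return total;
    }

    public static void main(String[] args) {
        for (int N : new int[]{20, 40, 80}) {
            long small = count(N), big = count(2 * N);
            // If total grows like N^4, doubling N should multiply it by roughly 16.
            System.out.printf("N=%d  ratio=%.2f%n", N, (double) big / small);
        }
    }
}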
I have a series of questions on which I need feedback and answers. I will comment with what I think; this is not a homework assignment but rather preparation for my exam.
My main problem is determining the number of iterations of a loop in different cases. How would I go about figuring that out?
Evaluate running time.
Q2.
for(int i = 0; i <= n; i++)             // runs n times
    for(int j = 1; j <= i * i; j++)     // same reasoning as 1. n^2
        if (j % i == 0)
            for(int k = 0; k < j; k++)  // runs n^2 times? <- same reasoning as above.
                sum++;
Correct answer: N × N^2 × N = O(N^4)
For the following questions, I do not have the correct answers.
Q3. a)
int x=0; //constant
for(int i=4*n; i>=1; i--) //runs n times, disregard the constant
x=x+2*i;
My Answer: O(n)
b) Assume for simplicity that n = 3^k
int z=0;
int x=0;
for (int i=1; i<=n; i=i*3){ // runs n/3 times? how does it affect the final answer?
z = z+5;
z++;
x = 2*x;
}
My Answer: O(n)
c) Assume for simplicity that n = k^2,
int y=0;
for(int j=1; j*j<=n; j++) //runs O(logn)? j <= (n)^1/2
y++; //constant
My Answer: O(logn)
d)
int b=0; //constant
for(int i=n; i>0; i--) //n times
for(int j=0; j<i; j++) // runs n+ n-1 +...+ 1. O(n^2)
b=b+5;
My Answer: O(n^3)
(e)
int y=1;
int j=0;
for(j=1; j<=2n; j=j+2) //runs n times
y=y+i;
int s=0;
for(i=1; i<=j; i++) // runs n times
s++;
My Answer: O(n)
(f)
int b=0;
for(int i=0; i<n; i++) //runs n times
for(int j=0; j<i*n; j++) //runs n^2 x n times?
b=b+5;
My Answer: O(n^4)
(g) Assume for simplicity that n = 3^k, for some positive integer k.
int x=0;
for(int i=1; i<=n; i=i*3){ //runs 1, 3, 9, 27...for values of i.
if(i%2 != 0) //will always be true for values above
for(int j=0; j<i; j++) // runs n times
x++;
}
My Answer: O(n × log_3 n)?
(h) Assume for simplicity that n = k^2, for some positive integer k.
int t=0;
for(int i=1; i<=n; i++)              //runs n times
    for(int j=0; j*j<4*n; j++)       //runs O(logn)
        for(int k=1; k*k<=9*n; k++)  //runs O(logn)
            t++;
My Answer: n × log n × log n = O(n log^2 n)
(i) Assume for simplicity that n = 2^s, for some positive integer s.
int a = 0;
int k = n*n;
while(k > 1) //runs n^2
{
for (int j=0; j<n*n; j++) //runs n^2
{ a++; }
k = k/2;
}
My Answer: O(n^4)
(j)
int i=0, j=0, y=0, s=0;
for(j=0; j<n+1; j++) //runs n times
y=y+j; //y equals n(n+1)/2 ~ O(n^2)
for(i=1; i<=y; i++) // runs n^2 times
s++;
My Answer: O(n^3)
(k)
int i=1, z=0;
while( z < n*(n+1)/2 ){ //arithmetic series, runs n times
z+=i; i++;
}
My Answer: O(n)
(m) Assume for simplicity that n = 2^s, for some positive integer s.
int a = 0;
int k = n*n*n;
while(k > 1) //runs O(logn) complexity
{
for (int j=0; j<k; j++) //runs n^3 times
{ a--; }
k = k/2;
}
My Answer: O(n^3 log n)
Question 4
a) True - since it's bounded below by n^2
b) False - f(n) not strictly smaller than g(n)
c) True
d) True - bounded by n^10
e) False - f(n) not strictly smaller than g(n)
f) True
g) True
h) False - since it does not equal O(n log n)
i) True
j) not sure
k) not sure
l) not sure - how should I even attempt these?
Let's go through these one at a time.
Part (a)
int x=0; //constant
for(int i=4*n; i>=1; i--) //runs n times, disregard the constant
x=x+2*i;
My Answer: O(n)
Yep! That's correct. The loop runs O(n) times and does O(1) work per iteration.
Part (b)
int z=0;
int x=0;
for (int i=1; i<=n; i=i*3){ // runs n/3 times? how does it affect the final answer?
z = z+5;
z++;
x = 2*x;
}
My Answer: O(n)
Not quite. Think about the values of i as the loop progresses. It will take on the series of values 1, 3, 9, 27, 81, 243, ..., 3^k. Since i is tripling on each iteration, it takes on successive powers of three.
The loop clearly only does O(1) work per iteration, so the main question here is how many total iterations there will be. The loop will stop when i > n. If we let k be some arbitrary iteration of the loop, the value of i on iteration k will be 3^k. The loop stops when 3^k > n, which happens when k > log_3 n. Therefore, the number of iterations is only O(log n), so the total complexity is O(log n).
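A tiny sketch (mine, not part of the answer; the test values are arbitrary powers of three) that counts the iterations of the tripling loop and compares them with log_3 n:

public class TriplingLoop {
    public static void main(String[] args) {
        for (int n : new int[]{27, 729, 59_049}) {   // n = 3^k, as the problem assumes
            int iterations = 0;
            for (int i = 1; i <= n; i = i * 3)
                iterations++;
            // For n = 3^k the loop runs exactly log_3(n) + 1 times.
            System.out.printf("n=%d  iterations=%d  log_3(n)=%.0f%n", n, iterations, Math.log(n) / Math.log(3));
        }
    }
}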
Part (c)
int y=0;
for(int j=1; j*j<=n; j++) //runs O(logn)? j <= (n)^1/2
y++; //constant
My Answer: O(logn)
Not quite. Notice that j is still growing linearly, but the loop runs as long as j^2 ≤ n. This means that as soon as j exceeds √n, the loop will stop. Therefore, there will only be O(√n) iterations of the loop, and since each one does O(1) work, the total work done is O(√n).
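For a concrete example (my own illustration): with n = 100, the condition j*j <= n holds exactly for j = 1, 2, ..., 10, so the loop body runs 10 = √100 times.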
Part (d)
int b=0; //constant
for(int i=n; i>0; i--) //n times
for(int j=0; j<i; j++) // runs n+ n-1 +...+ 1. O(n^2)
b=b+5;
My Answer: O(n^3)
Not quite. You're actually doubly-counting a lot of the work you need to do. You're correct that the inner loop will run n + (n-1) + (n-2) + ... + 1 times, which is O(n^2) times, but you're already summing up across all iterations of the outer loop. You don't need to multiply that value by O(n) one more time. The most accurate answer would be O(n^2).
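Spelling out the closed form (my own addition): n + (n-1) + ... + 1 = n(n+1)/2, and that n(n+1)/2 already counts every execution of b = b + 5 across the whole nest, so no extra factor of n should be applied.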
Part (e)
int y=1;
int j=0;
for(j=1; j<=2n; j=j+2) //runs n times
y=y+i;
int s=0;
for(i=1; i<=j; i++) // runs n times
s++;
My Answer: O(n)
Yep! Exactly right.
Part (f)
int b=0;
for(int i=0; i<n; i++) //runs n times
for(int j=0; j<i*n; j++) //runs n^2 x n times?
b=b+5;
My Answer: O(n^4)
Again, I believe you're overcounting. The inner loop will run 0 + n + 2n + 3n + 4n + ... + n(n-1) = n(0 + 1 + 2 + ... + (n - 1)) times, so the total work done is O(n^3). You shouldn't multiply by the number of times the outer loop runs because you're already summing up across all iterations. The most accurate runtime would be O(n^3).
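Writing out the closed form (my own addition): n(0 + 1 + ... + (n-1)) = n * n(n-1)/2, which is O(n^3) with no further factor of n needed.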
Part (g)
int x=0;
for(int i=1; i<=n; i=i*3){ //runs 1, 3, 9, 27...for values of i.
if(i%2 != 0) //will always be true for values above
for(int j=0; j<i; j++) // runs n times
x++;
}
My Answer: O(n × log_3 n)?
The outer loop here will indeed run O(log n) times, but let's see how much work the inner loop does. You're correct that the if statement always evaluates to true. This means that the inner loop will do 1 + 3 + 9 + 27 + ... + 3^(log_3 n) work. This summation, however, works out to (3^(log_3 n + 1) - 1) / 2 = (3n - 1) / 2. Therefore, the total work done here is just O(n).
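As a quick numeric check (my own): for n = 27 the inner loop does 1 + 3 + 9 + 27 = 40 units of work, and (3*27 - 1)/2 = 80/2 = 40, as the formula predicts.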
Part (h)
int t=0;
for(int i=1; i<=n; i++)              //runs n times
    for(int j=0; j*j<4*n; j++)       //runs O(logn)
        for(int k=1; k*k<=9*n; k++)  //runs O(logn)
            t++;
My Answer: n × log n × log n = O(n log^2 n)
Not quite. Look at the second loop. This actually runs O(√n) times using the same logic as one of the earlier parts. That third inner loop also runs O(√n) times, and so the total work done will be O(n^2).
Part (i)
int a = 0;
int k = n*n;
while(k > 1) //runs n^2
{
for (int j=0; j<n*n; j++) //runs n^2
{ a++; }
k = k/2;
}
My Answer: O(n^4)
Not quite. The outer loop starts with k initialized to n^2, but notice that k is halved on each iteration. This means that the number of iterations of the outer loop will be log(n^2) = 2 log n = O(log n), so the outer loop runs only O(log n) times. That inner loop does do O(n^2) work, so the total runtime is O(n^2 log n).
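A small sketch (mine; n is an arbitrary power of two, matching the n = 2^s assumption) that counts both loops:

public class HalvingOuter {
    public static void main(String[] args) {
        int n = 1 << 10;                // n = 2^10
        long a = 0;
        int outer = 0;                  // iterations of the while loop
        int k = n * n;
        while (k > 1) {
            outer++;
            for (int j = 0; j < n * n; j++)
                a++;
            k = k / 2;
        }
        // outer should be log2(n^2) = 2*log2(n) = 20, and a should equal outer * n^2.
        System.out.println(outer + " " + (a == (long) outer * n * n));
    }
}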
Part (j)
int i=0, j=0, y=0, s=0;
for(j=0; j<n+1; j++) //runs n times
y=y+j; //y equals n(n+1)/2 ~ O(n^2)
for(i=1; i<=y; i++) // runs n^2 times
s++;
My Answer: O(n^3)
Close, but not quite! The first loop runs in time O(n) and by the time it's done, the value of y is Θ(n^2). This means that the second loop runs for time Θ(n^2), so the total time spent is Θ(n^2).
Part (k)
int i=1, z=0;
while( z < n*(n+1)/2 )//arithmetic series, runs n times
{
z+=i; i++;
}
My Answer: O(n)
That's correct!
Part (l)
That's odd, there is no part (l).
Part (m)
int a = 0;
int k = n*n*n;
while(k > 1) //runs O(logn) complexity
{
for (int j=0; j<k; j++) //runs n^3 times
{ a--; }
k = k/2;
}
My Answer: O(n^3 log n)
Close, but not quite. You're right that the outer loop runs O(log n) times and that the inner loop does O(n^3) work on the first iteration. However, look at the number of iterations of the inner loop more closely:
n^3 + n^3/2 + n^3/4 + n^3/8 + ...
= n^3 (1 + 1/2 + 1/4 + 1/8 + ...)
≤ 2n^3
So the total work done here is actually only O(n^3), even though there are log n iterations.
Question 4
Your answers are all correct except for these:
f) True
This is actually false. The expression on the left is
(3/2)n^(3/2) + 5n^2 + lg n
which is not Ω(n^2 √n) = Ω(n^(5/2)).
For (j), note that log n^6 = 6 log n. Does that help?
For (k), ask whether both sides are O and Ω of one another. What do you find?
For (l), use the fact that a^(log_b c) = c^(log_b a). Does that help?
Hope this helps!