order of growth for loop worst case running time - algorithm

I am having trouble with this. The inner loop depends on the outer loop and from trying out values of n, the loop runs 1+2+4...+sqrt(n) times. Any help would be greatly appreciated!
int sum = 0;
for (int k = 1; k*k <= n; k = k*2)
for (int j = 0; j < k; j++)
sum++;

If K is the largest power of 2 with K*K <= n, then your sum is 1 + 2 + 4 + 8 + ... + K = 2K - 1.
K is clearly less than or equal to sqrt(n), but it is also greater than sqrt(n)/4 (because if it were not, then 2K*2K would be less than or equal to n, contradicting the fact that K is the largest power of 2 with K*K <= n).
So sqrt(n)/4 < K <= sqrt(n), and your runtime 2K - 1 lies between sqrt(n)/2 - 1 and 2*sqrt(n) - 1, and thus the complexity is Θ(sqrt(n)).
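If you want to check this empirically, here is a minimal Java sketch (the class and method names are mine, not from the question) that counts the increments and compares them against sqrt(n):

public class SqrtGrowth {
    // Counts how many times sum++ runs for a given n (same loops as above).
    static long count(long n) {
        long sum = 0;
        for (long k = 1; k * k <= n; k *= 2)
            for (long j = 0; j < k; j++)
                sum++;
        return sum;
    }

    public static void main(String[] args) {
        // count(n) equals 2K - 1 for the largest power of two K with K*K <= n,
        // so count(n) / sqrt(n) should stay roughly between 1 and 2.
        for (long n = 1_000; n <= 1_000_000_000L; n *= 10)
            System.out.printf("n=%d count=%d ratio=%.3f%n",
                    n, count(n), count(n) / Math.sqrt(n));
    }
}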

Related

What is the time complexity of the following nested dependent loops

for (int i = 2; i < N; i++)
for (int j = 1; j < N; j = j * i)
sum += 1;
I got a summation of logᵢ(N) terms from the outer loop.
Can we generalize it further?
The inner loop runs about logᵢ(N) times, so the total is roughly the sum of logᵢ(N) for i from 2 to N. Using the change-of-base identity logᵢ(N) = log N / log i, we can take log N out as a common factor, and the summation becomes one of 1/log i. Approximating this summation by the integral of 1/log x (the logarithmic integral), it is asymptotically O(N/log N), per Wikipedia. Multiplying back the factor of log N that we took out gives a final result of O(N).
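To see the O(N) bound in practice, here is a small sketch (class and method names are mine) that brute-force counts the sum += 1 operations and divides by N; the ratio stays bounded by a small constant:

public class LogBaseSum {
    // Brute-force count of sum += 1 for the nested loops above.
    static long count(int N) {
        long sum = 0;
        for (int i = 2; i < N; i++)
            for (long j = 1; j < N; j *= i)
                sum += 1;
        return sum;
    }

    public static void main(String[] args) {
        // Each outer iteration contributes about 1 + log(N)/log(i) inner steps;
        // summing 1/log(i) behaves like N/log(N), so the total stays O(N)
        // and count(N) / N remains bounded.
        for (int N = 1_000; N <= 10_000_000; N *= 10)
            System.out.printf("N=%d count=%d count/N=%.3f%n",
                    N, count(N), (double) count(N) / N);
    }
}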

Big O Notation for two code fragments

I have two fragments of code and an explanation of which Big O category each falls into. However, try as I might, I can't reconcile the explanation with what I come up with, either by inspecting the code or by doing sample runs.
The first:
long count = 0;
long n = 1000;
long i, j, k;
for(i = 0; i < n; i++)
for (j = 0; j < i * i; j++)
for (k = 0; k < j; k++)
count++;
Sample runs of this consistently give me N^4, but the answer I've been given is "j can be as large as i^2, which could be as large as N^2. k can be as large as j, which is N^2. The running time is thus proportional to N * N^2 * N^2, which is O(N^5)"
Second snippet:
long i, j, k;
long n = 1000;
long count = 0;
for (i = 1; i < n; i++)
for (j = 1; j < i * i; j++)
if (j % i == 0)
for (k = 0; k < j; k++)
count++;
For this the notes say "The if statement is executed at most N^3 times, by previous arguments, but it is true only O(N^2) times (because it is true exactly i times for each i). Thus the innermost loop is only executed O(N^2) times. Each time through, it takes O(j) = O(N^2) time, for a total of O(N^4)"
For this one the notes seem accurate enough about the N^4 (although I keep getting a result of about N^4 / 10). However, I don't follow the claim that the modulo test is true only i times for each i; it seems to enter that loop far less often.
So the question is can anyone clarify what I'm not understanding?
For the first one:
sum from i = 0 to n-1 of
sum from j = 0 to i*i-1 of
sum from k = 0 to j-1 of
1
We know the sum of 1 m times is equal to m, so we can reduce this to
sum from i = 0 to n-1 of
sum from j = 0 to i*i-1 of
j
We know the sum 1 + 2 + ... + m = m * (m + 1) / 2, so we can reduce further:
sum from i = 1 to n-1 of
(i * i - 1) * i * i / 2 = (1/2) * (i * i * i * i - i * i)
We can make this easier by taking the (1/2) outside the summation and then splitting up the i * i * i * i and i * i terms; the resulting summations are less well-known than the one for i alone, but it does turn out to be Theta(n^5), hence O(n^5). To get an intuitive feel for why, note that the difference f(n+1) - f(n) = (1/2)(n^4 - n^2) is on the order of n^4; if f were a continuous function and this difference its derivative, the order of f would be one higher.
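As a sanity check on this reduction, the following sketch (class and method names are my own) brute-forces the first fragment and compares it with the closed-form sum of (i^4 - i^2)/2; the two agree exactly, and doubling n multiplies the count by roughly 2^5 = 32:

public class FragmentOneCheck {
    // Brute-force count of count++ in the first fragment.
    static long bruteForce(long n) {
        long count = 0;
        for (long i = 0; i < n; i++)
            for (long j = 0; j < i * i; j++)
                for (long k = 0; k < j; k++)
                    count++;
        return count;
    }

    // Closed form of the reduction: sum over i of (i^4 - i^2) / 2.
    static long closedForm(long n) {
        long total = 0;
        for (long i = 0; i < n; i++)
            total += (i * i * i * i - i * i) / 2;  // always an integer
        return total;
    }

    public static void main(String[] args) {
        // The two counts agree exactly; doubling n multiplies them by about 32.
        for (long n : new long[]{20, 40, 80})
            System.out.printf("n=%d brute=%d closed=%d%n",
                    n, bruteForce(n), closedForm(n));
    }
}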
For the second case:
sum from i = 0 to n-1 of
sum from j = 0 to i-1 of
sum from k = 0 to i*j-1 of
1
Note that, for the purposes of the innermost loop, j triggers it for only i different values in the original code: 0, i, 2i, ..., (i-1)i. After re-indexing, the inner loop runs i times the new counter value j, i.e. i*j iterations. We shift the multiplication inside like this to avoid introducing a "step" notation, so we can keep using the usual summation formulas.
sum from i = 0 to n-1 of
sum from j = 0 to i-1 of
i*j
sum from i = 0 to n-1 of
i * (1/2) * i * (i - 1) = (1/2)(i * i * i - i * i)
Again, we can do the math in full, or use the same intuition as before to (correctly) surmise that this turns out to be Theta(n^4).
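The same kind of check works for the second fragment: the sketch below (names are my own) compares the brute-force count against the sum of (i^3 - i^2)/2, and doubling n multiplies the count by roughly 2^4 = 16:

public class FragmentTwoCheck {
    // Brute-force count of count++ in the second fragment (with the j % i test).
    static long bruteForce(long n) {
        long count = 0;
        for (long i = 1; i < n; i++)
            for (long j = 1; j < i * i; j++)
                if (j % i == 0)
                    for (long k = 0; k < j; k++)
                        count++;
        return count;
    }

    // Closed form of the reduction: the inner loop only fires for
    // j = i, 2i, ..., (i-1)i, contributing i * (1 + 2 + ... + (i-1)) = (i^3 - i^2)/2.
    static long closedForm(long n) {
        long total = 0;
        for (long i = 1; i < n; i++)
            total += (i * i * i - i * i) / 2;  // always an integer
        return total;
    }

    public static void main(String[] args) {
        // The counts agree exactly; doubling n multiplies them by about 16.
        for (long n : new long[]{50, 100, 200})
            System.out.printf("n=%d brute=%d closed=%d%n",
                    n, bruteForce(n), closedForm(n));
    }
}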

Algorithmic order of growth of the code

I am attending a course online and stuck at the following question. What is the order of growth of the worst case running time of the following code fragment as a function of N?
int sum = 0;
for (int i = 1; i <= N*N; i++)
for (int j = 1; j <= i; j++)
for (int k = 1; k <= j; k++)
sum++;
I thought it was of the order of N^4, but it seems that answer is wrong. Can you please explain?
It is of order O(N^6). Note that it is not true that every loop simply adds a factor of N to the complexity. Consider the following example:
int sum = 0;
for (int i = 1; i <= M; i++)
for (int j = 1; j <= i; j++)
for (int k = 1; k <= j; k++)
sum++;
You should be able to see easily that this is of order O(M^3); substituting M = N^2 then gives the answer. The key point is that each loop here is bounded by O(N^2) iterations, not O(N).
Let's denote n = N^2. Then sum++ executes once for every triple with 1 <= k <= j <= i <= n, which is approximately n^3/6 triples. Thus, the runtime is O(n^3) = O(N^6).
Explanation: ignoring for a moment the cases where k == j, j == i, or k == i, each set of three distinct values {a1, a2, a3} can be arranged in 6 ways:
(a1,a2,a3)
(a1,a3,a2)
(a2,a1,a3)
(a2,a3,a1)
(a3,a2,a1)
(a3,a1,a2)
Overall there are n^3 triples, and only one of the 6 orderings satisfies k <= j <= i.
One run of the inner loop increments sum exactly j times.
One run of the middle loop invokes the inner loop exactly i times, with values of j between 1 and i (inclusive). So it increments sum exactly 1 + 2 + 3 + ... + i times, which is i(i+1)/2 by the well-known "triangular numbers" formula.
The outer loop invokes the middle loop exactly N^2 times (let us denote it as M), with values of i between 1 and M (inclusive). So it increments sum exactly 1 + 3 + 6 + ... + M(M+1)/2 times, which is M(M+1)(M+2)/6 by the not-so-well-known "tetrahedral numbers" formula (http://en.wikipedia.org/wiki/Tetrahedral_number).
All in all, the final value of sum is N^2(N^2+1)(N^2+2)/6.
Thinking in asymptotic terms, the inner loop is O(j), the middle one O(i^2) (by summation) and the outer one O(M^3) (by summation), i.e. O(N^6).
Also see Faulhaber's formula, which shows that the sum of i^k for i = 1 to N is O(N^(k+1)) (http://en.wikipedia.org/wiki/Faulhaber%27s_formula).
Any given run of the innermost (k) loop has a time proportional to j, but we've got to do one of those for each of j = 1 through j = i, and that sum 1 + 2 + … + i grows like i^2. So for any given i we've got an O(i^2) running time, but of course we've got to deal with i = 1 through i = N^2. The sum of i^2 for i = 1 through N^2 happens to grow like N^6.
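If you want to confirm the tetrahedral-number count, here is a small sketch (class and method names are mine) comparing a brute-force run against M(M+1)(M+2)/6 with M = N^2:

public class TetrahedralCheck {
    // Brute-force count of sum++ for the triple loop above.
    static long bruteForce(long N) {
        long sum = 0;
        for (long i = 1; i <= N * N; i++)
            for (long j = 1; j <= i; j++)
                for (long k = 1; k <= j; k++)
                    sum++;
        return sum;
    }

    // Tetrahedral-number formula with M = N^2.
    static long formula(long N) {
        long M = N * N;
        return M * (M + 1) * (M + 2) / 6;
    }

    public static void main(String[] args) {
        // The two values agree exactly; the leading term is N^6 / 6, i.e. Theta(N^6).
        for (long N : new long[]{5, 10, 20})
            System.out.printf("N=%d brute=%d formula=%d%n",
                    N, bruteForce(N), formula(N));
    }
}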

Iterative function

Stuck with my HW - I need to work out the complexity.
time=0;
for (i=n; i>=1; i = sqrt(i))
for (j=1; j<=i; j++)
time++;
What I did - the outer loop variable goes like this:
i = n, n^(1/2), n^(1/4), ..., 1
so then we get:
n^(1/2^k) = 1
but if I take the log of both sides, one side becomes 0... what should I do?
I suppose there is a typo somewhere because otherwise it's Θ(∞) if the input n is not smaller than 1. (For i == 1, the update i = sqrt(i) doesn't change i, so that's an infinite loop.)
So let us suppose it's actually
time = 0;
for (i = n; i > 1; i = sqrt(i))
for (j = 1; j <= i; j++)
time++;
Then, to get the complexity of nested loops, you need to sum the complexity of the inner loop for each iteration of the outer loop. Here, the inner loop runs i times, obviously, so we need to sum the values i runs through in the outer loop. These values are n, n^0.5, n^0.25, ..., n^(1/2^k), where k is characterised by
n^(1/2^(k+1)) < 2 <= n^(1/2^k)
or, equivalently,
2^(2^k) <= n < 2^(2^(k+1))
2^k <= lg n < 2^(k+1)
k <= lg (lg n) < k+1
k = floor(lg(lg n))
Now it remains to estimate the sum from above and below to get the Θ of the algorithm. This estimate is very easy if you start writing down the sums for a few (large) values of n.
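For a concrete check of the corrected fragment, the sketch below (class and method names are my own) counts the inner-loop steps and divides by n; the first outer iteration alone contributes n steps and the rest add only lower-order terms, so the ratio tends to 1:

public class SqrtShrinkCheck {
    // Counts the inner-loop steps of the corrected fragment (condition i > 1).
    static long count(long n) {
        long time = 0;
        for (long i = n; i > 1; i = (long) Math.sqrt(i))
            for (long j = 1; j <= i; j++)
                time++;
        return time;
    }

    public static void main(String[] args) {
        // The first outer iteration contributes n steps; the rest add only about
        // sqrt(n) + n^(1/4) + ... more, so count(n)/n tends to 1, i.e. Theta(n).
        for (long n = 10_000; n <= 100_000_000L; n *= 100)
            System.out.printf("n=%d count=%d count/n=%.4f%n",
                    n, count(n), (double) count(n) / n);
    }
}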

Asymptotic analysis of three interdependent nested for loops

The code fragment I am to analyse is below:
int sum = 0;
for (int i = 0; i < n; i++)
for (int j = 0; j < i * i; j++)
for (int k = 0; k < j; k++)
sum++;
I know that the first loop is O(n) but that's about as far as I've gotten. I think that the second loop may be O(n^2) but the more I think about it the less sense it makes. Any guidance would be much appreciated.
The first loop executes n times, with i taking the values 0 through n-1. For each such i, the second loop executes i*i times. That means the second loop body runs 0*0 + 1*1 + 2*2 + ... + (n-1)*(n-1) times in total, which is at most 1*1 + 2*2 + ... + n*n.
This is a summation of squares, and the formula for it is well known. Hence the second loop body executes at most (n(1 + n)(1 + 2n))/6 times.
Thus, in total there are at most (n(1 + n)(1 + 2n))/6 values taken by j, and for each particular value of j the third loop executes exactly j times. Calculating exactly how many times the third loop executes in total would be tedious. Luckily, all you really need is an upper bound on the order of the function.
You know that each of those at most (n(1 + n)(1 + 2n))/6 values of j is less than n^2, so each run of the third loop executes fewer than n^2 times. Therefore the operation sum++ executes fewer than [(n(1 + n)(1 + 2n))/6] * n^2 times. After some quick mental math, that amounts to a polynomial of maximal degree 5, therefore your program is O(n^5).
int sum = 0;
for (int i = 0; i < n; i++) // Let's call this N
for (int j = 0; j < i * i; j++) // Let's call this M
for (int k = 0; k < j; k++) // Let's call this K
sum++;
N is the number of steps of the entire program, M is the number of steps the two inner loops take, and K is the number of steps the innermost loop takes.
It is easy to see that K = j; the innermost loop takes j steps.
Then M = Sum(j=0, i^2, K) = Sum(j=0, i^2, j)
(The first parameter is the iterator, the second is the upper bound, and the last is what we are adding. Strictly the upper bound should be i^2 - 1, since the loop condition is j < i * i, but the off-by-one does not affect the order of growth.)
This is now a sum of consecutive integers from 0 up to i^2, so we can apply the formula m(m + 1)/2:
M = Sum(j=0, i^2, j) = ((i^2 + 1) * (i^2))/2 = (i^4 + i^2)/2
N = Sum(i=0, n, M) = 1/2 * ( Sum(i=0, n, (i^4)) + Sum(i=0, n, (i^2)) )
Both of these have well-known closed forms, and after a little algebra you get:
N = (n^5)/10 + (n^4)/4 + (n^3)/3 + (n^2)/4 + n/15
Up to the off-by-one adjustments noted above, this is the number of steps the loops take; for the O notation, note that n^5 is the fastest-growing term, so the solution is O(n^5).
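As a rough check on this polynomial, the sketch below (class and method names are my own) compares it with the exact count Sum(i=0, n-1, (i^4 - i^2)/2); the two agree in the dominant n^5/10 term, with the small gap coming from the off-by-one bounds noted above:

public class StepCountCheck {
    // Exact number of sum++ executions: Sum(i=0, n-1, (i^4 - i^2) / 2).
    static long exactCount(long n) {
        long total = 0;
        for (long i = 0; i < n; i++)
            total += (i * i * i * i - i * i) / 2;
        return total;
    }

    // The polynomial derived above (which used inclusive upper bounds).
    static double polynomial(double n) {
        return Math.pow(n, 5) / 10 + Math.pow(n, 4) / 4
                + Math.pow(n, 3) / 3 + Math.pow(n, 2) / 4 + n / 15;
    }

    public static void main(String[] args) {
        // The ratio approaches 1: both expressions share the n^5/10 leading term,
        // and the difference consists only of lower-order (off-by-one) terms.
        for (long n : new long[]{100, 1_000, 5_000})
            System.out.printf("n=%d exact=%d polynomial=%.0f ratio=%.4f%n",
                    n, exactCount(n), polynomial(n), exactCount(n) / polynomial(n));
    }
}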
If you proceed methodically using Sigma notation, you will arrive at the same closed-form count and the same Θ(n^5) result.
Try counting how many times each loop body is executed. The middle loop runs
0*0 times when i == 0
1*1 times when i == 1
2*2 times when i == 2
...
(n-1)*(n-1) ≈ n^2 times when i == n-1.
So the middle loop alone is driven O(n^2) times for each value of i; summing over i and stacking the innermost loop on top is what pushes the total to O(n^5).
