Runtime of for loop

What's the runtime for this nested for loop in big O notation?
for (i = 1 to k)
{
    for (j = i+1 to k)
    {}
}
It's smaller than O(k^2) but I can't figure it out.

Your question is closely related to the series sum S(k) = 0 + 1 + 2 + ... + (k-2) + (k-1). It can be shown that S(k) = (k*(k-1))/2 = (k*k)/2 - k/2. [How? Pair the terms: S(k) = {0+(k-1)} + {1+(k-2)} + {2+(k-3)} + ...; each pair sums to k-1, and there are k/2 such pairs.]
Therefore, is the algorithmic order smaller than O(k*k)? Remember that constant coefficients like 1/2 do not influence the big O notation.
Question: So it's equivalent to replacing j = i+1 to k with j = 1 to k?
Answer: Right. This is tricky, so let's think it through. For i == 1, how many times does the inner loop's action run? Answer: it runs k-1 times. Again, for i == 2, how many times does the inner loop's action run? Answer: it runs k-2 times. Ultimately, for i == k, how many times does the inner loop's action run? Answer: it runs zero times. Therefore, over all values of i, how many times does the inner loop's action run? Answer: (k-1) + (k-2) + ... + 0, which is just the aforementioned sum S(k).
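To convince yourself of the count, here is a minimal Python sketch (the helper name count_inner is made up for this check) that runs the nested loop literally and compares the number of times the inner body executes with k*(k-1)/2:

def count_inner(k):
    # for i in 1..k: for j in i+1..k: <body>
    count = 0
    for i in range(1, k + 1):
        for j in range(i + 1, k + 1):
            count += 1
    return count

for k in (1, 2, 5, 10, 100):
    assert count_inner(k) == k * (k - 1) // 2

So the body runs S(k) = k*(k-1)/2 times, which is still Θ(k^2) despite the factor of 1/2.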


How can I calculate the time function T(n) of the following code?

x = 0;
for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= n; j++) {
        x++;
        n--;
    }
}
By testing the code, I found that the nested for loop runs ⌈n/2⌉ times per iteration of the outer for loop.
But I don't know how to formulate this with sigma notation. I would really appreciate it if anyone could help me with this.
You can express T(n) with the recurrence T(n) = T(n-2) + 1, so the expected time complexity is O(n/2) => O(n).
Edit: the recurrence T(n) = T(n-2) + 1 holds because each pass of the inner loop increments j and decrements n simultaneously, which is exactly the same as advancing j by 2; so going from an input of n-2 to an input of n adds exactly one more iteration of the loop.
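Here is a minimal Python sketch (trace_loops is a made-up name) that reproduces the observation from the question: it records how many times the inner loop runs on each pass of the outer loop, which is ⌈n/2⌉ for the n current at that pass, and shows the total work stays linear in the original n:

def trace_loops(n):
    # Mirrors: for i in 1..n: for j in 1..n: x += 1; n -= 1
    x, i = 0, 1
    inner_counts = []
    while i <= n:
        j, count = 1, 0
        while j <= n:
            x += 1
            n -= 1
            j += 1
            count += 1
        inner_counts.append(count)  # equals ceil(n/2) for the n seen at this pass
        i += 1
    return x, inner_counts

print(trace_loops(20))  # (18, [10, 5, 3]) -- total x stays below n, i.e. O(n)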
Let's compute the exact value of x.
TL;DR: x(N) = N - [N/2^i], where i is the lowest number satisfying the condition (i+1)·2^i > N. As Mariano Demarchi said, T(N) = O(N).
First we will check how the variables change after each pass of the inner loop. Suppose (n, i, x) are the values between lines 2 and 3 of the code (just before the inner loop):
How many iterations will happen? Each iteration increases j and decreases n, so the distance between them shrinks by two. The starting distance is n-1, and the final one, after the loop, is 0 (if n is odd) or -1 (if n is even). Thus if n = 2k the answer is k, otherwise it is k+1. So the inner loop makes [(n+1)/2] = d iterations.
Thus x increases by d, n becomes n-d and i becomes i+1:
(n, i, x) -> (n-d, i+1, x+d), or equivalently ([n/2], i+1, x + [(n+1)/2])
Now concentrate on the values of the n and i variables in the outer loop:
They change like this: (n, i) -> ([n/2], i+1)
The stop condition is [N/2^i] < i+1, which is equivalent to (i+1)·2^i > N. Of course, we need the minimal i satisfying the condition.
So, i is the first number satisfying the condition, and we do NOT sum further:
x = [(N+1)/2] + [([N/2]+1)/2] + [([N/2^2]+1)/2] + ... + [([N/2^(i-1)]+1)/2]
Each term here equals [N/2^t] - [N/2^(t+1)], so the series telescopes to N - [N/2^i], roughly N·(1 - 1/2^i). In particular, if N is a power of 2 minus 1, every term is an exact power of two and we can see it easily.
So, this code returns exactly the same value in O(log(N)) time.
// finding i
unsigned long long i = 0;
while ((i + 1) * (1ULL << i) <= n)
    ++i;
// finding x
x = n - (n >> i);
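As a sanity check, here is a small Python sketch (the names brute_x and closed_x are made up for this check) that compares a literal simulation of the loops with the O(log N) closed form above:

def brute_x(n):
    # Run the original nested loops literally and return x.
    x, i = 0, 1
    while i <= n:
        j = 1
        while j <= n:
            x += 1
            n -= 1
            j += 1
        i += 1
    return x

def closed_x(n):
    # x(n) = n - floor(n / 2**i), where i is the smallest integer
    # with (i + 1) * 2**i > n.
    i = 0
    while (i + 1) * (1 << i) <= n:
        i += 1
    return n - (n >> i)

assert all(brute_x(n) == closed_x(n) for n in range(1, 500))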
In the inner loop, since n decrements at the same time as j increments, n drops below j roughly halfway through the gap between their initial values, that is, after about (n-1)/2 iterations.
That's why your tests show that the inner loop runs ⌈n/2⌉ times on each iteration of the outer loop.
The outer loop then stops at the i that satisfies n/2^i = i-1; this is what determines its stopping condition.
T(n) = n/2 + T(n/2)
     = n/2 + n/4 + T(n/4)
     = n (1/2 + 1/4 + ... + 1/(2^i))
The geometric series in parentheses converges to 1, so T(n) converges to n and the algorithm is O(n).

Confused about the time complexity of the following function. A good explanation would be helpful

If the first loop runs for n+1 times and the second loop runs for n(n+1) times, how many times will the third loop run? It has some relation with the second loop (n^2+1?), I guess, but how about with the first one?
somefunction(n) {
    c = 0
    for (i = 1 to n*n)
        for (j = 1 to n)
            for (k = 1 to 2*j)
                c = c + 1
    return c
}
The first loop has O(n**2) iterations.
The second loop has O(n) iterations.
The third loop has O(n) iterations as well, since j is steadily increasing towards n.
(It's a little easier to see if you sum up the number of times c = c + 1 executes for the two inner loops combined. The inner loop runs 2 times for j = 1, 4 for j = 2, ..., and 2*n times for j = n. 2 + 4 + .. + 2*n = O(n**2).)
You can then (loosely speaking) multiply the three values together to get a total bound of O(n**4).
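As a quick numeric check, here is a Python sketch of the same function (a direct transcription, not anyone's reference implementation): the exact count works out to n^4 + n^3, consistent with the O(n**4) bound, since the two inner loops together contribute 2 + 4 + ... + 2*n = n(n+1) and the outer loop repeats that n*n times.

def somefunction(n):
    c = 0
    for i in range(1, n * n + 1):
        for j in range(1, n + 1):
            for k in range(1, 2 * j + 1):
                c = c + 1
    return c

for n in range(1, 7):
    assert somefunction(n) == n**4 + n**3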

Calculate the complexity of the code below

I feel that even in the worst case the condition is true only twice, when j = i and when j = i^2, and then the loop runs for an extra i + i^2 times.
So in the worst case, the sum of the inner two loops is theta(i^2) + i + i^2, which is equal to theta(i^2) itself;
summing theta(i^2) over the outer loop gives theta(n^3).
So, is the answer theta(n^3)?
I would say that the overall performance is theta(n^4). Here is your pseudo-code, given in text format:
for (i = 1 to n) do
    for (j = 1 to i^2) do
        if (j % i == 0) then
            for (k = 1 to j) do
                sum = sum + 1
Appreciate first that the j % i == 0 condition will only be true when j is a multiple of i. For a given i this occurs only i times, so the innermost for loop is reached only i times from the for loop in j. The innermost for loop requires about i^2 steps when j is near the end of its range, but only about i steps when j is near the start. So the overall performance here should be somewhere between O(n^3) and O(n^4), but theta(n^4) should be valid.
For fixed i, the i integers 1 ≤ j ≤ i^2 such that j % i = 0 are {i, 2i, ..., i^2}. It follows that the inner loop is executed i times, with arguments i*m for 1 ≤ m ≤ i, and the guard is executed i^2 times. Thus, the complexity function T(n) ∈ Θ(n^4) is given by:
T(n) = ∑[i=1,n] (∑[j=1,i^2] 1 + ∑[m=1,i] ∑[k=1,i*m] 1)
     = ∑[i=1,n] ∑[j=1,i^2] 1 + ∑[i=1,n] ∑[m=1,i] ∑[k=1,i*m] 1
     = n^3/3 + n^2/2 + n/6 + ∑[i=1,n] ∑[m=1,i] ∑[k=1,i*m] 1
     = n^3/3 + n^2/2 + n/6 + n^4/8 + 5n^3/12 + 3n^2/8 + n/12
     = n^4/8 + 3n^3/4 + 7n^2/8 + n/4
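This closed form can be checked numerically with a short Python sketch (count_ops is a made-up name); it counts exactly what T(n) counts here, namely the j-loop guard evaluations plus the iterations of the innermost k loop:

def count_ops(n):
    # T(n) = sum over i of ( i^2 guard checks + sum over m of i*m inner iterations )
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, i * i + 1):
            ops += 1              # the j % i == 0 test
            if j % i == 0:
                ops += j          # the k loop runs j times
    return ops

for n in range(1, 10):
    # multiply the formula by 8 to stay in integers
    assert 8 * count_ops(n) == n**4 + 6 * n**3 + 7 * n**2 + 2 * n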

Algorithm complexity for this function

def mystery(L):
    sum1 = 0
    sum2 = 0
    bound = 1
    while bound <= len(L):
        i = 0
        while i < bound:
            j = 0
            while j < len(L):
                if L[j] > L[i]:
                    sum1 = sum1 + L[j]
                j = j + 2
            j = 1
            while j < len(L):
                sum2 = sum2 + L[j]
                j = j*2
            i = i + 1
        bound = bound * 2
    return sum1 + sum2
I am having trouble finding the complexity of this function. I got to the i loop and don't know what to do.
It's a bit tricky to sort out how many times the middle level while loop runs. The outer loop increases bound by a factor of two on each pass (up to len(L)), which means the i loop will run O(bound) times per pass for O(log(N)) passes (where N is len(L)). The tricky part is how to add up the bound values, since they're changing on each pass.
I think the easiest way to figure out the sum is to start with the largest bound, just before the loop quits. First, let's assume that N (aka len(L)) is a power of 2. Then the last bound value will be exactly equal to N. The next smaller one (used on the next-to-last iteration) will be N/2, and the one after that will be N/4. Their sum will be:
N + N/2 + N/4 + N/8 + ... + 1
If we factor out N from each term, we'll get:
N*(1 + 1/2 + 1/4 + 1/8 + ... + 1/N)
You should recognize the sum in the parentheses; it's a simple geometric series (the sum of the powers of 1/2), which comes up pretty often in mathematics and analysis. If the sum went on forever, it would add up to exactly 2. Since we're quitting a bit early, it will be less than two by an amount equal to the last term (1/N). When we multiply the N back in, we find the whole thing runs 2*N - 1 times, so the loop is O(N).
The same Big-O bound works when N is not exactly a power of 2, since the values we added up in the analysis above will each serve as the upper bound for one of the actual bound values we will see in the loop.
So, the i loop runs O(N) times.
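Here is a minimal Python sketch to confirm that total for powers of two (total_bound is a made-up helper that adds up the bound values the i loop sees):

def total_bound(n):
    # Total number of times the body of the i loop runs,
    # i.e. the sum of all bound values used by the outer loop.
    total, bound = 0, 1
    while bound <= n:
        total += bound
        bound *= 2
    return total

for n in (1, 2, 4, 8, 16, 1024):
    assert total_bound(n) == 2 * n - 1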

Basic Algorithm Analysis and Summation Notation

So for a homework assignment we had to count the number of steps in a piece of code. Here it is:
int sum = 0;
for (int i = 1; i <= n*n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= 6; k++)
            sum++;
My prof (I think) explained that the number of operations in the 2nd line could be found using summation notation, like so:
(∑[i=1,n^2] i) × 4 + 3
which would be 1/2(n^4 + n^2) × 4 + 3 = 2n^4 + 2n^2 + 3
but from just looking at the line, I would think it would be something like 4n^4 + 2 (my prof said 4n^4 + 3; I'm not sure where the third operation is though...)
Am I doing the summation notation wrong here? It made sense to me to do summation notation for nested for loops, but I don't know why it would work for a for loop by itself.
Thanks.
Actually even your prof's result is wrong. The exact result is 3n^4 + 3n^2.
To obtain that result, simply consider:
sum = ∑[i=1,n^2] ∑[j=1,i] ∑[k=1,6] 1 = ∑[i=1,n^2] 6i = 6 · (n^2(n^2+1))/2 = 3n^4 + 3n^2
All the passages are pretty simple (the third equality is immediate if you consider the formula for the sum of the first m natural numbers, with m = n^2).
I guess both you and your professor are wrong. According to my calculation (I might be wrong too) it should be 3n^4+3n^2.
The outermost loop will run n^2 times. Taking this into consideration, the inner loop will run 1 time on the first iteration, 2 on the second, and so on up to n^2 on the last, i.e. j runs from 1 to i for i = 1, 2, 3, 4, ..., n^2. If we sum the series 1 + 2 + 3 + ... + n^2, this becomes (n^2(n^2+1))/2.
So over the n^2 iterations of the outer loop, the inner loop will execute (n^2(n^2+1))/2 times. The innermost loop executes six times for every iteration of the second loop, so multiplying (n^2(n^2+1))/2 by 6 evaluates to 3n^4 + 3n^2.
To check the answer, let's take an example. Say n=5: run your algorithm and print the sum; this will give 1950. Now substitute n=5 into the evaluated expression, 3(5^4) + 3(5^2), and this again evaluates to 1950.
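In Python, that check (a minimal sketch, not anyone's reference code) generalises to any small n:

def count_sum(n):
    s = 0
    for i in range(1, n * n + 1):
        for j in range(1, i + 1):
            for k in range(1, 7):
                s += 1
    return s

for n in range(1, 8):
    assert count_sum(n) == 3 * n**4 + 3 * n**2

print(count_sum(5))  # 1950, matching 3(5^4) + 3(5^2)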
What you need to calculate is this:
S = sum(i in 1..n^2) sum(j in 1..i) sum(k in 1..6) 1
Now, the innermost sum is obviously 6, hence we have
S = sum(i in 1..n^2) sum(j in 1..i) 6
= 6 sum(i in 1..n^2) sum(j in 1..i) 1
The remaining inner sum, sum(j in 1..i) 1, is just i, giving
S = 6 sum(i in 1..n^2) i
The sum there is just our old friend, the sum of the first n^2 numbers, which you should know is n^2(n^2 + 1)/2.
Drop in the formula, expand a little, tidy with a broom, and you should get your answer.
Cheers!
