Calculate the code complexity of the code below - computation-theory

I feel that, even in the worst case, the condition is true only twice, when j = i or j = i^2, so the loop runs for an extra i + i^2 times.
In the worst case, the sum of the two inner loops is then Θ(i^2) + i + i^2, which is Θ(i^2) itself.
Summing Θ(i^2) over the outer loop gives Θ(n^3).
So, is the answer Θ(n^3)?

I would say that the overall performance is Θ(n^4). Here is your pseudo-code, given in text format:
for (i = 1 to n) do
    for (j = 1 to i^2) do
        if (j % i == 0) then
            for (k = 1 to j) do
                sum = sum + 1
Appreciate first that the j % i == 0 condition will only be true when j is a multiple of i. For a given i this happens i times, so up to n times when i is near n; the final inner for loop is therefore only reached that often from the for loop in j. The final for loop requires about n^2 steps when j is near the end of its range, but only about n steps when j is near the start. So the overall performance lies somewhere between O(n^3) and O(n^4), and Θ(n^4) turns out to be valid.

For fixed i, the i integers 1 ≤ j ≤ i^2 such that j % i == 0 are {i, 2i, ..., i^2}. It follows that the inner loop is executed i times, with arguments i*m for 1 ≤ m ≤ i, and that the guard is executed i^2 times. Thus, the complexity function T(n) ∈ Θ(n^4) is given by:
T(n) = ∑[i=1,n] (∑[j=1,i^2] 1 + ∑[m=1,i] ∑[k=1,i*m] 1)
     = ∑[i=1,n] ∑[j=1,i^2] 1 + ∑[i=1,n] ∑[m=1,i] ∑[k=1,i*m] 1
     = n^3/3 + n^2/2 + n/6 + ∑[i=1,n] ∑[m=1,i] ∑[k=1,i*m] 1
     = n^3/3 + n^2/2 + n/6 + n^4/8 + 5n^3/12 + 3n^2/8 + n/12
     = n^4/8 + 3n^3/4 + 7n^2/8 + n/4
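As a sanity check (not part of either derivation above), a short Python sketch can count the guard evaluations and inner-loop iterations empirically and compare them against the closed form; the helper name count_ops is mine:

def count_ops(n):
    # Count guard evaluations plus innermost-loop iterations,
    # mirroring the pseudo-code above.
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, i * i + 1):
            ops += 1                # the j % i == 0 guard is evaluated
            if j % i == 0:
                ops += j            # innermost loop runs j times
    return ops

n = 20
print(count_ops(n))                            # 26355
print((n**4 + 6*n**3 + 7*n**2 + 2*n) // 8)     # closed form, same value: 26355

Both lines print the same number, matching n^4/8 + 3n^3/4 + 7n^2/8 + n/4.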

Related

How to find the big theta?

Here's a code segment I'm trying to find the big-theta for:
i = 1
while i ≤ n do          # loops Θ(n) times
    A[i] = i
    i = i + 1
for j ← 1 to n do       # loops Θ(n) times
    i = j
    while i ≤ n do      # loops n times at worst (j = 1), 1 time at best (j = n)
        A[i] = i
        i = i + j
So, given that the inner while loop is a summation from 1 to n, its big-theta is Θ(n^2). Does that mean the big theta is Θ(n^2) for the entire code?
The first while loop plus the inner while loop should be Θ(n) + Θ(n^2), which should just equal Θ(n^2).
Thanks!
for j = 1 to n step 1
    for i = j to n step j
        # constant time op
The double loop is O(n⋅log(n)) because the number of iterations of the inner loop falls off inversely with j. Counting the total number of iterations gives:
floor(n/1) + floor(n/2) + ... + floor(n/n) <= n⋅(1/1 + 1/2 + ... + 1/n) ∼ n⋅log(n)
The partial sums of the harmonic series have logarithmic behavior asymptotically, so the above shows that the double loop is O(n⋅log(n)). That can be strengthened to Θ(n⋅log(n)) with a math argument involving the Dirichlet Divisor Problem.
[ EDIT ] For an alternative derivation of the lower bound that establishes the Θ(n⋅log(n)) asymptote, it is enough to use the < part of the x - 1 < floor(x) <= x inequality, avoiding the more elaborate math (linked above) that gives the exact expression.
floor(n/1) + floor(n/2) + ... + floor(n/n) > (n/1 - 1) + (n/2 - 1) + ... + (n/n - 1)
= n⋅(1/1 + 1/2 + ... + 1/n) - n
∼ n⋅log(n) - n
∼ n⋅log(n)
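If you want to see the harmonic behavior concretely, here is a small Python sketch (my own, not from the answer) that counts the actual number of inner-loop iterations and compares it with n⋅log(n):

import math

def inner_iterations(n):
    # Total iterations of: for j = 1 to n, for i = j to n step j.
    # Each inner loop contributes floor(n / j) iterations.
    return sum(len(range(j, n + 1, j)) for j in range(1, n + 1))

n = 100_000
print(inner_iterations(n))       # actual count: about 1.17 million for this n
print(round(n * math.log(n)))    # n⋅log(n) estimate: about 1.15 million

The two values differ only by lower-order terms, consistent with the Θ(n⋅log(n)) bound.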

Most appropriate run-time formula in terms of primitive operations needed for input of size n

For the following block of code, select the most appropriate run-time formula in terms of primitive operations needed for input of size n:
When resolving from the inside out, I get:
inner loop = 3n + 1
main loop + inner loop = 3 + (3n + 1) + log n = 4 + 3n + log n
extra steps + all loops = 4 + n(4 + 3n + log n) = 4 + 4n + 3n^2 + n·log n
This is the code to analyze:
def rate(n):
    total = 0
    i = 1
    while i < n:
        j = 0
        while j < n:
            total = i * j + total
            j = j + 1
        i = i * 2
    return total
and the answer is supposed to be --> f(n) = 4 + 4log(n) + log(n)*(3n)
I am actually coming up with O(NlgN) here for the overall running time. Appreciate that the inner loop in j is not dependent on the outer loop in i. The following should be true:
The outer loop in i is O(lgN), because i is doubling at each iteration, which is exponential behavior.
The inner loop in j is O(N), because j cycles from 0 to N at each iteration, regardless of the value of i.
We may therefore multiply together these complexities to get the overall complexity.
Note that for N of arbitrarily large size, your expression:
4 + 4log(n) + log(n)*(3n)
reduces to NlgN.
def rate(n):
    total = 0
    i = 1
    while i < n:        # this outer loop runs O(log(n)) times
        j = 0
        while j < n:    # this inner loop runs O(n) times per outer iteration
            total = i * j + total
            j = j + 1
        i = i * 2
    return total
Hence, the total runtime complexity of your implementation in big-O is O(log(n)) * O(n) = O(n log(n)).
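To make the two counts tangible, here is the same function instrumented with loop counters (an illustrative sketch; the counter names are mine):

import math

def rate_counted(n):
    # Same logic as rate(n), with iteration counters added.
    total = 0
    outer = inner = 0
    i = 1
    while i < n:
        outer += 1
        j = 0
        while j < n:
            inner += 1
            total = i * j + total
            j = j + 1
        i = i * 2
    return total, outer, inner

n = 1024
_, outer, inner = rate_counted(n)
print(outer, math.ceil(math.log2(n)))   # 10 10  -> outer loop runs log2(n) times
print(inner, outer * n)                 # 10240 10240 -> n inner iterations per outer pass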

What is the time complexity for given snippet?

for i = 1 to n do
    for j = 1 to i do
        for k = 1 to j do
What is its time complexity in terms of 'n'?
The inner-most loop will obviously run j times. Assuming that it contains operations worth 1 time unit, this will be:
T_inner(j) = j
The middle loop will run i times, i.e.
T_middle(i) = Sum {j from 1 to i} T_inner(j)
= Sum {j from 1 to i} j
= i/2 * (1 + i)
Finally:
T_outer(n) = Sum {i from 1 to n} T_middle(i)
= Sum {i from 1 to n} (i/2 * (1 + i))
= 1/6 * n * (1 + n) * (2 + n)
= 1/6 n^3 + 1/2 n^2 + 1/3 n
And this is obviously O(n^3).
Note: This only counts the operations in the inner most block. It neglects the operations necessary to perform the loop. But if you include those, you will see that the time complexity is the same.
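A brute-force count in Python (a quick sketch of my own) confirms the closed form 1/6 n (1 + n) (2 + n):

def triple_loop_count(n):
    # Count how many times the innermost statement would execute.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            for k in range(1, j + 1):
                count += 1
    return count

n = 50
print(triple_loop_count(n))          # 22100
print(n * (n + 1) * (n + 2) // 6)    # 22100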

Time complexity of the algorithm?

I think its time complexity is O(n^2) because of the nested loops. How can I explain that? This is the algorithm:
FindSum(array, n, t)
    i := 0
    found := 0
    array := quick_sort(array, 0, n - 1)
    while i < n - 2
        j = i + 1
        k = n - 1
        while k > j
            sum = array[i] + array[j] + array[k]
            if sum == t
                found += 1
                k -= 1
                j += 1
            else if sum > t
                k -= 1
            else
                j += 1
        i += 1
Yes, the complexity is indeed O(n^2).
The inner loop runs anywhere between (k - j)/2 = (n - i - 2)/2 and k - j = n - i - 2 iterations, since each iteration advances j, k, or both.
Summing it up for all possible values of i from 0 to n-2 gives you:
T = n-0-2 + n-1-2 + n-2-2 + ... + n-(n-2)-2
= n-2 + n-3 + ... + 0
This is an arithmetic progression that sums to (n - 1)(n - 2)/2, which is quadratic. Note that the extra factor of 2 (for the "best" case of the inner loop) does not change the time complexity in terms of big-O notation.
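For reference, here is a runnable Python version of FindSum (my translation of the pseudo-code; quick_sort is replaced by Python's built-in sorted):

def find_sum(array, t):
    # Count triples with array[i] + array[j] + array[k] == t using
    # sort + two pointers, as in the pseudo-code above.
    array = sorted(array)      # O(n log n), dominated by the O(n^2) scan below
    n = len(array)
    found = 0
    for i in range(n - 2):
        j, k = i + 1, n - 1
        while k > j:           # two pointers close in: at most n - i - 2 steps
            s = array[i] + array[j] + array[k]
            if s == t:
                found += 1
                k -= 1
                j += 1
            elif s > t:
                k -= 1
            else:
                j += 1
    return found

print(find_sum([1, 2, 3, 4, 5], 9))   # 2: (1, 3, 5) and (2, 3, 4)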

Big O runtime for this algorithm?

Here's the pseudocode:
Baz(A) {
    big = −∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
}
So line 3 will be O(n) (n being the length of the array, A)
I'm not sure what line 4 would be...I know it decreases by 1 each time it is run, because i will increase.
and I can't get line 6 without getting line 4...
All help is appreciated, thanks in advance.
Let us first understand how the first two for loops work:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
The first for loop runs from 1 to n (the length of array A), and the number of iterations of the second for loop depends on the value of i. So when i = 1, the second for loop runs n times; when i increments to 2, it runs (n - 1) times; and so on, down to 1.
So your second for loop will run as follows:
n + (n - 1) + (n - 2) + (n - 3) + .... + 1 times...
You can use the formula sum(1 to N) = N(N + 1)/2, which gives (N^2 + N)/2. So we have big-O for these two loops as
O(n^2) (big-O of n squared)
Now let us consider the third loop as well...
Your third for loop looks like this
for k = j to j + i - 1
But this effectively means
for k = 0 to i - 1
(adding/subtracting j just shifts the range of values; the number of iterations does not change, since the difference of the bounds stays the same).
So the third loop runs i times per iteration of the second loop: once for each of the n iterations of the second loop when i = 1, twice for each of its (n - 1) iterations when i = 2, and so on.
So you get:
n + 2(n - 1) + 3(n - 2) + 4(n - 3) + ... + n·1
= n·(1 + 2 + 3 + 4 + ... + n) - (a nonnegative correction term, so dropping it gives an upper bound)
<= n · n(n + 1)/2
= O(N^3)
(The exact sum works out to n(n + 1)(n + 2)/6, as the next answer shows.)
So your time complexity will be O(N^3) (big-O of n cubed).
Hope this helps!
Methodically, you can follow the steps using Sigma Notation:
Baz(A):
    big = −∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
For big-O, you need to look at the worst-case scenario. The easiest way to find the big-O is to look at the dominant parts of the algorithm, which are usually loops or recursion.
So we have this part of the algorithm, consisting of the loops:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
        for k = j to j + i - 1
            sum = sum + A(k)
We have,
SUM { SUM { i } for j = 1 to n-i+1 } for i = 1 to n
= 1/6 n (n+1) (n+2)
= (1/6 n^2 + 1/6 n)(n + 2)
= 1/6 n^3 + 2/6 n^2 + 1/6 n^2 + 2/6 n
= 1/6 n^3 + 3/6 n^2 + 2/6 n
= 1/6 n^3 + 1/2 n^2 + 1/3 n
T(n) ~ O(n^3)
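As an empirical cross-check (a sketch of my own, not from either answer), Baz can be instrumented to count the innermost additions, which should equal n(n + 1)(n + 2)/6:

def baz_counted(A):
    # Returns (max window sum, number of innermost iterations).
    big = float("-inf")
    steps = 0
    n = len(A)
    for i in range(1, n + 1):            # window length i
        for j in range(1, n - i + 2):    # window start j (1-based)
            s = 0
            for k in range(j, j + i):    # sum A[k] over the window
                s += A[k - 1]            # convert 1-based k to 0-based index
                steps += 1
            if s > big:
                big = s
    return big, steps

A = list(range(1, 31))
big, steps = baz_counted(A)
n = len(A)
print(steps, n * (n + 1) * (n + 2) // 6)   # 4960 4960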
