I am struggling with this question and would like some help, thank you.
Determine the big O running time of the method myMethod() by counting the
approximate number of operations it performs. Show all details of your answer.
Note: the symbol % represents the modulus operator, that is, the remainder when one
number is divided by another.
static int myMethod(A[], n) {
    count ← 0
    for i ← 0 to n − 1 {
        for j ← i to n − 1 {
            count ← count + A[j]
            k ← 1
            while (k < n + 2) {
                if (k % 2 == 0) {
                    count ← count + 1
                }
                k++
            }
        }
    }
    return count
}
The outer for loop with the variable i runs n times.
The inner for loop with the variable j runs n − i times for each iteration of the outer loop, so in aggregate the inner loop runs n + (n−1) + (n−2) + ... + 1 times, which equals n(n+1)/2.
The while loop inside the inner for loop runs n + 1 times (k goes from 1 to n + 1) for each iteration of the inner for loop.
This makes the while loop body run (n(n+1)/2) · (n+1) times in total, which is ((n^2 + n)/2)(n + 1) = (n^3 + 2n^2 + n)/2.
Dropping the lower-degree terms and the constant factor, we get O(n^3).
You could also have argued that n + (n−1) + ... + 1 is O(n^2), and multiplying by the O(n) while loop gives O(n^3); that would have been more intuitive and faster.
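As a sanity check, here is a sketch in Python (my own port of the pseudocode above; `my_method_ops` counts only the iterations of the innermost while loop):

```python
def my_method_ops(n):
    """Count innermost while-loop iterations of myMethod for input size n."""
    ops = 0
    for i in range(n):             # outer loop: n iterations
        for j in range(i, n):      # inner loop: n - i iterations
            k = 1
            while k < n + 2:       # while loop: n + 1 iterations (k = 1 .. n+1)
                ops += 1
                k += 1
    return ops

# Matches the closed form derived above: (n(n+1)/2) * (n+1)
for n in range(1, 20):
    assert my_method_ops(n) == (n * (n + 1) // 2) * (n + 1)
```

The cubic growth of this count is exactly what the O(n^3) answer predicts.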
Related
I feel that even in the worst case the condition is true only twice, when j = i or j = i^2, and then the loop runs for an extra i + i^2 steps.
In the worst case, if we take the sum of the inner two loops, it is theta(i^2) + i + i^2, which is equal to theta(i^2) itself.
Summing theta(i^2) over the outer loop gives theta(n^3).
So, is the answer theta(n^3)?
I would say that the overall performance is theta(n^4). Here is your pseudo-code, given in text format:
for (i = 1 to n) do
    for (j = 1 to i^2) do
        if (j % i == 0) then
            for (k = 1 to j) do
                sum = sum + 1
Appreciate first that the j % i == 0 condition is only true when j is a multiple of i; for a fixed i this happens i times, at j = i, 2i, ..., i^2. So for the largest outer iteration, i = n, the final inner for loop is entered n times by the for loop in j. That final for loop requires about n^2 steps when j is near the end of its range, but only about n steps near the start of the range. So the overall performance sits somewhere between O(n^3) and O(n^4), and theta(n^4) turns out to be valid.
For fixed i, the i integers 1 ≤ j ≤ i^2 such that j % i == 0 are {i, 2i, ..., i^2}. It follows that the innermost loop is executed i times, with arguments i·m for 1 ≤ m ≤ i, and the guard is evaluated i^2 times. Thus the complexity function T(n) ∈ Θ(n^4) is given by:
T(n) = Σ[i=1,n] (Σ[j=1,i^2] 1 + Σ[m=1,i] Σ[k=1,i·m] 1)
     = Σ[i=1,n] Σ[j=1,i^2] 1 + Σ[i=1,n] Σ[m=1,i] Σ[k=1,i·m] 1
     = n^3/3 + n^2/2 + n/6 + Σ[i=1,n] Σ[m=1,i] Σ[k=1,i·m] 1
     = n^3/3 + n^2/2 + n/6 + n^4/8 + 5n^3/12 + 3n^2/8 + n/12
     = n^4/8 + 3n^3/4 + 7n^2/8 + n/4
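The closed form can be checked empirically (a Python sketch of my own; `guarded_ops` charges one unit per guard evaluation and one per innermost-loop iteration, matching the terms of T(n)):

```python
def guarded_ops(n):
    """Count guard evaluations plus innermost-loop iterations, as in T(n)."""
    total = 0
    for i in range(1, n + 1):
        for j in range(1, i * i + 1):
            total += 1                     # one j % i == 0 guard evaluation
            if j % i == 0:
                for k in range(1, j + 1):
                    total += 1             # one innermost-loop iteration
    return total

# T(n) = n^4/8 + 3n^3/4 + 7n^2/8 + n/4 = (n^4 + 6n^3 + 7n^2 + 2n) / 8
for n in range(1, 12):
    assert guarded_ops(n) == (n**4 + 6 * n**3 + 7 * n**2 + 2 * n) // 8
```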
So I got this algorithm and I need to calculate its time complexity.
It goes like this:
for i = 1 to n do
    k = i
    while (k <= n) do
        FLIP(A[k])
        k = k + i
where A is an array of booleans, and FLIP simply flips the current value, so it is O(1).
Now I understand that the inner while loop should run
n/1 + n/2 + n/3 + ... + n/n
times in total, if I'm correct, but is there a formula for such a sum?
I'm pretty confused here.
The more exact computation is T(n) = Σ[i=1,n] (n − i)/i (because k starts from i). Hence the final sum is n + n/2 + ... + n/n − n = n(1 + 1/2 + ... + 1/n) − n, approximately. We know that 1 + 1/2 + ... + 1/n = H(n), the n-th harmonic number, and H(n) = Θ(log n). Hence T(n) = Θ(n log n). The −n has no effect on the asymptotic computational cost, as n = o(n log n).
Let's say we want to calculate the sum
n + n/2 + n/3 + ... + n/n
= n (1 + 1/2 + 1/3 + ... + 1/n)
The bracketed series (1 + 1/2 + 1/3 + ... + 1/n) is the well-known harmonic series, and I'm afraid there is no closed-form formula for its partial sums.
The given problem boils down to calculating this harmonic sum.
Although the sum can't be evaluated exactly, you can still bound it asymptotically: the n-th partial sum is O(log n).
Hence the answer to the above problem is O(n log n).
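The Θ(n log n) claim is easy to check numerically (a Python sketch of my own; `flip_ops` counts FLIP calls exactly as the pseudocode performs them):

```python
import math

def flip_ops(n):
    """Count FLIP calls: for each i, k takes values i, 2i, 3i, ... <= n."""
    ops = 0
    for i in range(1, n + 1):
        k = i
        while k <= n:
            ops += 1
            k += i
    return ops

for n in (10, 100, 1000):
    # Exact count: sum of floor(n / i) over i = 1..n
    assert flip_ops(n) == sum(n // i for i in range(1, n + 1))
    # Grows like n * H(n), i.e. Theta(n log n)
    assert abs(flip_ops(n) - n * math.log(n)) < n
```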
I was going through this question to calculate time complexity.
int fun(int n)
{
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}
My first impression was O(n log n) but the answer is O(n). Please help me understand why it is O(n).
The inner loop does n iterations, then n/2, then n/4, etc. So the total number of inner loop iterations is:
n + n/2 + n/4 + n/8 + ... + 1
<= n * (1 + 1/2 + 1/4 + 1/8 + ...)
= 2n
(See Geometric series), and therefore is O(n).
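This argument can be checked directly (a Python port of the C function above, written by me for illustration):

```python
def fun(n):
    """Python port of fun(); count ends up n + n//2 + n//4 + ... + 1."""
    count = 0
    i = n
    while i > 0:
        for _ in range(i):
            count += 1
        i //= 2
    return count

assert fun(8) == 8 + 4 + 2 + 1           # the halving sequence for n = 8
for n in range(1, 1000):
    assert fun(n) < 2 * n                # bounded by the geometric series 2n
```

The second assertion is exactly the geometric-series bound: the total work never exceeds 2n, hence O(n).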
For an input integer n:
for i = n, the j loop runs n times,
for i = n/2, the j loop runs n/2 times,
and so on,
until i = 1, where the j loop runs 1 time.
So,
T(n) = n + n/2 + n/4 + n/8 + ... + 1
T(n) = n(1 + 1/2 + 1/4 + 1/8 + ... + 1/n)    ...(1)
Let 1/n = 1/2^k, so k = log n.
Now, using the geometric-series sum in eq. (1):
T(n) = n((1 - 1/2^k) / (1 - 1/2))
T(n) = 2n(1 - 1/2^k)
Using k = log n:
T(n) = 2n(1 - 1/2^(log n)) = 2n(1 - 1/n)
For large n, 1/n tends to 0,
so T(n) ≈ 2n,
so T(n) = O(n).
For an input integer n, the innermost statement of fun() is executed n + n/2 + n/4 + ... + 1 times, so the time complexity T(n) can be written as
T(n) = O(n + n/2 + n/4 + ... + 1) = O(n). The final value of count is also n + n/2 + n/4 + ... + 1.
The outer loop iterates O(log n) times, but that can be ignored, since the O(n) total work of the inner loop dominates.
Let's build a table of i and j values, where n = 8 (the outer loop runs while i > 0, the inner loop while j < i):

i | j
8 | [0, 7]
4 | [0, 3]
2 | [0, 1]
1 | [0, 0]

The outer loop on "i" runs log n times.
Now check the inner loop on "j" (for a range [a, b], the iteration count is b - a + 1):
[0, 7] -> b - a + 1 -> 8 -> n
[0, 3] -> b - a + 1 -> 4 -> n/2
[0, 1] -> b - a + 1 -> 2 -> n/4
[0, 0] -> b - a + 1 -> 1 -> n/n
So the inner-loop counts form the series n + n/2 + n/4 + ... + 1. If we extend it to infinitely many terms it only gets larger, and if you look closely it is a GP, n[1 + 1/2 + 1/4 + ...], with ratio r = 1/2 < 1.
The bracketed series sums to 2 (a standard geometric-series result),
which gives
n[1 + 1/2 + 1/4 + ...] -> 2n
so log n (outer loop) + 2n (inner-loop total) => O(n)
// n > 0
i ← 0
while (i < n)
    j ← 0
    while (j < power(2, i))
        j ← j + 1
    done
    i ← i + 1
done
Is the overall complexity O(n log n)? The inner while loop has the condition j < 2^i, where 2^0, 2^1, 2^2, ... = 1, 2, 4, 8, 16, 32, 64, 128, ... etc. So for 2^i < n we get i < log n?
And the outer loop looks to be simply O(n).
Multiply both loop complexities for O(n log n); confirm please? Thanks in advance.
It's O(2^n)
For the outer loop, the number of iterations is n, so the inner loop executes for every value of i from 0 to n-1.
The number of iterations of the inner loop each time is 2^i, so the total number of iterations for the entire program is:
2^0 + 2^1 + 2^2 + 2^3 + 2^4 + 2^5 + ... +2^(n-1)
This sum is equal to 2^n - 1. Because 2^n is so large compared to 1, we can drop the 1 in the big-O notation, giving us O(2^n).
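A quick empirical check of this sum (a Python sketch of my own, mirroring the pseudocode above):

```python
def inner_iterations(n):
    """Count iterations of the inner while loop of the pseudocode above."""
    ops = 0
    i = 0
    while i < n:
        j = 0
        while j < 2 ** i:
            j += 1
            ops += 1
        i += 1
    return ops

# Geometric sum: 2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1
for n in range(1, 16):
    assert inner_iterations(n) == 2 ** n - 1
```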
Using a formal methodology through Sigma notation:
It looks like O(2^n); the inner loop contributes a sum of 2^i from i = 0 to i = n-1, which sums to 2^n - 1 operations.
i ← 0
while (i < n)
    j ← 0
    while (j < power(2, i))
        j ← j + 1
    done
    i ← i + 1
done
Time Complexity
Time = 2^0 + (2^0 + 2^1) + ... + (2^0 + ... + 2^(n-1))
Time = n * 2^0 + (n-1) * 2^1 + ... + (n-(n-1)) * 2^(n-1)
Time = SUM { (n-k) * 2^k | k = 0..(n-1) }
Time = 2^(n+1) - n - 2
Time ~ O(2^(n+1)) = O(2^n)
Here's the pseudocode:
Baz(A) {
    big = −∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
}
So line 3 will be O(n) (n being the length of the array, A)
I'm not sure what line 4 would be...I know it decreases by 1 each time it is run, because i will increase.
and I can't get line 6 without getting line 4...
All help is appreciated, thanks in advance.
Let us first understand how the first two for loops work:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
The first for loop runs from 1 to n (the length of array A), and the number of iterations of the second for loop depends on the value of i. So when i = 1 the second for loop runs n times; when i increments to 2 it runs (n - 1) times; and so on down to 1 time.
So in aggregate the second for loop runs:
n + (n - 1) + (n - 2) + (n - 3) + ... + 1 times.
You can use the formula sum(1 to N) = N * (N + 1) / 2, which gives (N^2 + N)/2. So we have Big-Oh for these two loops as
O(n^2) (Big-Oh of n squared)
Now let us also consider the third loop.
Your third for loop looks like this:
for k = j to j + i - 1
But this actually means:
for k = 0 to i - 1 (you are just shifting the range of values by adding j, but the number of iterations, i, does not change, since the difference between the bounds stays the same)
So the third loop runs 1 time (the value of i) for each of the n iterations of the second loop, then 2 times for each of its (n - 1) iterations, and so on.
So you get:
n + 2(n-1) + 3(n-2) + 4(n-3) + ... + n·1
= SUM { i * (n - i + 1) | i = 1..n }
<= n * (1 + 2 + 3 + ... + n)    (each term i(n - i + 1) is at most i * n)
= n * n(n+1)/2
= O(n^3)
(The exact value of the sum is n(n+1)(n+2)/6, which is indeed Theta(n^3).)
So your time complexity will be O(n^3) (Big-Oh of n cubed).
Hope this helps!
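To make the pseudocode concrete, here is a sketch of Baz in Python (my own port; it computes the maximum subarray sum by brute force, with the 1-based A(k) indexing shifted to 0-based):

```python
def baz(A):
    """Port of Baz: brute-force maximum subarray sum over all windows."""
    big = float("-inf")
    n = len(A)
    for i in range(1, n + 1):              # i = window length
        for j in range(1, n - i + 2):      # j = 1-based window start
            s = 0
            for k in range(j, j + i):      # sum A[j .. j+i-1], 1-based
                s += A[k - 1]              # shift to 0-based indexing
            if s > big:
                big = s
    return big

assert baz([1, -2, 3, 4, -1]) == 7         # best window is [3, 4]
```

The triple nesting makes the O(n^3) behavior visible at a glance.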
Methodically, you can follow the steps using Sigma Notation:
Baz(A):
    big = −∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
For Big-O, you need to look at the worst-case scenario.
The easiest way to find the Big-O is to look at the most important parts of the algorithm; these are usually its loops or recursion.
So we focus on the part of the algorithm consisting of the loops:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
        for k = j to j + i - 1
            sum = sum + A(k)
We have,
SUM { SUM { i } for j = 1 to n-i+1 } for i = 1 to n
= 1/6 n (n+1) (n+2)
= (1/6 n^2 + 1/6 n) (n + 2)
= 1/6 n^3 + 2/6 n^2 + 1/6 n^2 + 2/6 n
= 1/6 n^3 + 3/6 n^2 + 2/6 n
= 1/6 n^3 + 1/2 n^2 + 1/3 n
T(n) ~ O(n^3)
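The double sum and its closed form can be verified with a short Python check (my own sketch):

```python
# Check SUM { SUM { i } for j = 1 to n-i+1 } for i = 1 to n against the
# closed form n(n+1)(n+2)/6 used above.
for n in range(1, 50):
    total = sum(i * (n - i + 1) for i in range(1, n + 1))
    assert total == n * (n + 1) * (n + 2) // 6
```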