Time complexity of fun()? - algorithm

I was going through this question on calculating time complexity.
int fun(int n)
{
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}
My first impression was O(n log n) but the answer is O(n). Please help me understand why it is O(n).

The inner loop does n iterations on the first pass of the outer loop, then n/2, then n/4, and so on. So the total number of inner-loop iterations is:
n + n/2 + n/4 + n/8 + ... + 1
<= n * (1 + 1/2 + 1/4 + 1/8 + ...)
= 2n
(see Geometric series), and therefore the function is O(n).
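If you want to sanity-check the bound empirically, here is a small C harness (the main() driver is my addition, not part of the original question) that runs fun() for growing n and prints the returned count next to 2n. Since count is exactly the number of inner-loop iterations, it should always stay below 2n:

#include <stdio.h>

int fun(int n)
{
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}

int main(void)
{
    /* count equals the total number of inner-loop iterations,
       so it should stay strictly below 2n for every n */
    for (int n = 1; n <= 1000000; n *= 10)
        printf("n = %8d  count = %8d  2n = %8d\n", n, fun(n), 2 * n);
    return 0;
}

The count column hugs 2n from below, matching the geometric-series bound.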

For an input integer n,
for i = n, the j loop runs n times,
then for i = n/2, the j loop runs n/2 times,
and so on,
until i = 1, where the j loop runs 1 time.
So,
T(n) = (n + n/2 + n/4 + n/8 + ...... + 1)
T(n) = n(1 + 1/2 + 1/4 + 1/8 + ..... + 1/n) .....(1)
Let 1/n = 1/2^k,
so k = log n.
Now, using the sum of a geometric series in eq. (1):
T(n) = n((1 - (1/2)^(k+1)) / (1 - 1/2))
T(n) = 2n(1 - (1/2)^(k+1))
Using k = log n:
T(n) = 2n(1 - (1/2)^(log n + 1))
Now, for large values of n, log n tends to infinity and (1/2)^(log n + 1) tends to 0.
So T(n) -> 2n,
so T(n) = O(n).

For an input integer n, the innermost statement of fun() is executed n + n/2 + n/4 + ... + 1 times, so the time complexity T(n) can be written as
T(n) = O(n + n/2 + n/4 + ... + 1) = O(n). The final value of count is also n + n/2 + n/4 + ... + 1.
The outermost loop by itself iterates O(log n) times, but this is dominated by the O(n) total, so it can be ignored.

Let's build a table of i and j values for n = 8
(i runs while i > 0; j runs while j < i):

i | j
--+------
8 | [0,7]
4 | [0,3]
2 | [0,1]
1 | [0,0]

The outer loop on "i" runs log n times.
Now check the inner loop on "j":
[0, 7] -> b - a + 1 -> 8 -> n
[0, 3] -> b - a + 1 -> 4 -> n/2
[0, 1] -> b - a + 1 -> 2 -> n/4
[0, 0] -> b - a + 1 -> 1 -> n/n
So the total inner work is n + n/2 + n/4 + ... + 1, a geometric progression, not log n equal-sized terms. Factoring out n gives n[1 + 1/2 + 1/4 + .......], a GP with ratio r < 1 whose infinite sum is 2 (see any geometric-series proof), which is
n[1 + 1/2 + 1/4 + .......] -> 2n
For n = 8 you can check directly: 8 + 4 + 2 + 1 = 15 < 2 * 8 = 16.
So the total is log n (outer-loop overhead) + 2n (inner work) => O(n).

Related

Determine the big O running time

I am struggling with this question and would like some help, thank you.
Determine the big O running time of the method myMethod() by counting the
approximate number of operations it performs. Show all details of your answer.
Note: the symbol % represents the modulus operator, that is, the remainder of one
number divided by another.
๐‘ ๐‘ก๐‘Ž๐‘ก๐‘–๐‘ ๐‘–๐‘›๐‘ก ๐‘š๐‘ฆ๐‘€๐‘’๐‘กโ„Ž๐‘œ๐‘‘(๐ด[], ๐‘›) {
๐‘๐‘œ๐‘ข๐‘›๐‘ก โ† 0
๐‘“๐‘œ๐‘Ÿ ๐‘– โ† 0 ๐‘ก๐‘œ ๐‘› โˆ’ 1 {
๐‘“๐‘œ๐‘Ÿ ๐‘— โ† ๐‘– ๐‘ก๐‘œ ๐‘› โˆ’ 1 {
๐‘๐‘œ๐‘ข๐‘›๐‘ก โ† ๐‘๐‘œ๐‘ข๐‘›๐‘ก+ ๐ด[๐‘—]
๐‘˜ โ† 1
๐‘คโ„Ž๐‘–๐‘™๐‘’ (๐‘˜ < ๐‘› + 2) {
๐‘–๐‘“ (๐‘—%2 == 0) {
๐‘๐‘œ๐‘ข๐‘›๐‘ก = ๐‘๐‘œ๐‘ข๐‘›๐‘ก + 1
}
๐‘˜ + +
}
}
}
๐‘Ÿ๐‘’๐‘ก๐‘ข๐‘Ÿ๐‘› ๐‘๐‘œ๐‘ข๐‘›๐‘ก
}
The outer for loop with the variable i runs n times.
The inner for loop with the variable j runs n - i times for each iteration of the outer loop. This would make the inner loop run n + n-1 + n-2 +...+ 1 times in aggregation which is the equivalent of n * (n+1) / 2.
The while loop inside the inner for loop runs n + 1 times for each iteration of the inner for loop (k goes from 1 up to n + 1).
This makes the while loop body run (n * (n+1) / 2) * (n + 1) times in total. This produces ((n^2 + n) / 2) * (n + 1) = (n^3 + n^2 + n^2 + n) / 2 = (n^3 + 2n^2 + n) / 2.
Dropping lower degrees of n and constant factors, we get O(n^3).
You could also have argued that n + (n-1) + ... + 1 is O(n^2), and multiplying by a linear inner operation gives O(n^3), which is more intuitive and faster.
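To double-check the cubic count, here is one possible C rendering of the pseudocode (a sketch: the ops out-parameter and the driver are my instrumentation, not part of the original method) that tallies executions of the while-loop body and compares them with (n^3 + 2n^2 + n)/2:

#include <stdio.h>

/* C rendering of the pseudocode; the ops out-parameter is added
   instrumentation counting executions of the while-loop body */
static int myMethod(const int A[], int n, long long *ops)
{
    int count = 0;
    for (int i = 0; i <= n - 1; i++) {
        for (int j = i; j <= n - 1; j++) {
            count = count + A[j];
            int k = 1;
            while (k < n + 2) {        /* n + 1 iterations */
                if (j % 2 == 0)
                    count = count + 1;
                k++;
                (*ops)++;
            }
        }
    }
    return count;
}

int main(void)
{
    int A[160] = {0};                  /* contents don't affect ops */
    for (int n = 10; n <= 160; n *= 2) {
        long long ops = 0;
        myMethod(A, n, &ops);
        printf("n = %3d  ops = %9lld  (n^3 + 2n^2 + n)/2 = %9lld\n",
               n, ops, ((long long)n * n * n + 2LL * n * n + n) / 2);
    }
    return 0;
}

The two columns match exactly, confirming the O(n^3) count.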

time complexity for this algorithm

for (i = 1; i <= n; i *= 2)
    for (j = 1; j <= i; j++)
        sum++;
I think its time complexity is O(n) (1 + 2 + 4 + ... = (1 * (2^(log2(n) + 1) - 1)) / (2 - 1), the sum of a geometric sequence). Is that right? I'm confused about my answer.
i changes in these steps: 1, 2, 2^2, ..., 2^log(n), and in each iteration of the outer loop the inner loop runs i times. Hence, the time complexity T(n) is 1 + 2 + 2^2 + ... + 2^log(n), which is equal to 2^(log(n) + 1) - 1. Therefore, T(n) = Theta(n).
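A quick empirical check (my own harness, not from the question): count the sum++ executions directly. The total is 2^(floor(log2(n)) + 1) - 1, which always lies between n and 2n, so the loop is Theta(n):

#include <stdio.h>

int main(void)
{
    long long sum = 0;
    for (long long n = 1; n <= 1000000; n *= 10) {
        long long ticks = 0;
        for (long long i = 1; i <= n; i *= 2)
            for (long long j = 1; j <= i; j++) {
                sum++;                 /* the original statement */
                ticks++;
            }
        /* ticks = 2^(floor(log2 n) + 1) - 1, so n <= ticks < 2n */
        printf("n = %8lld  ticks = %8lld  2n = %8lld\n", n, ticks, 2 * n);
    }
    return 0;
}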

How to find the big theta?

Here's some code segment I'm trying to find the big-theta for:
i = 1
while i ≤ n do              # loops Θ(n) times
    A[i] = i
    i = i + 1
for j ← 1 to n do           # loops Θ(n) times
    i = j
    while i ≤ n do          # loops n times at worst when j = 1, 1 time at best when j = n
        A[i] = i
        i = i + j
So given that the inner while loop will be a summation from 1 to n, the big theta is Θ(n^2)? So does that mean the big theta is Θ(n^2) for the entire code?
The first while loop plus the nested loops should be Θ(n) + Θ(n^2), which should just equal Θ(n^2).
Thanks!
for j = 1 to n step 1
    for i = j to n step j
        # constant time op
The double loop is O(n⋅log(n)) because the number of iterations in the inner loop falls inversely with j. Counting the total number of iterations gives:
floor(n/1) + floor(n/2) + ... + floor(n/n) <= n⋅(1/1 + 1/2 + ... + 1/n) ∼ n⋅log(n)
The partial sums of the harmonic series have logarithmic behavior asymptotically, so the above shows that the double loop is O(n⋅log(n)). That can be strengthened to Θ(n⋅log(n)) with a math argument involving the Dirichlet Divisor Problem.
[ EDIT ] For an alternative derivation of the lower bound that establishes the Θ(n⋅log(n)) asymptote, it is enough to use the < part of the x - 1 < floor(x) <= x inequality, avoiding the more elaborate math (linked above) that gives the exact expression.
floor(n/1) + floor(n/2) + ... + floor(n/n) > (n/1 - 1) + (n/2 - 1) + ... + (n/n - 1)
= n⋅(1/1 + 1/2 + ... + 1/n) - n
∼ n⋅log(n) - n
∼ n⋅log(n)
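You can watch the harmonic behavior numerically with a small C harness (my addition; it just counts the iterations of the step-j double loop above and compares them with n⋅ln(n)):

#include <stdio.h>
#include <math.h>

int main(void)
{
    for (long long n = 10; n <= 1000000; n *= 10) {
        long long ticks = 0;
        for (long long j = 1; j <= n; j++)
            for (long long i = j; i <= n; i += j)  /* floor(n/j) iterations */
                ticks++;
        double nlogn = (double)n * log((double)n);
        printf("n = %8lld  ticks = %9lld  n*ln(n) = %12.0f  ratio = %.3f\n",
               n, ticks, nlogn, (double)ticks / nlogn);
    }
    return 0;
}

The ratio column drifts toward 1 as n grows, consistent with the Θ(n⋅log(n)) asymptote.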

Determining Big-O of a While Loop

// n > 0
i ← 0
while (i < n)
    j ← 0
    while (j < power(2,i))
        j ← j + 1
    done
    i ← i + 1
done
Is the overall complexity O(n log(n))? The inner while loop's bound is 2^i, so it runs 2^0, 2^1, 2^2, ... = 1, 2, 4, 8, 16, 32, 64, 128, ... times, and for 2^i < n we get i < log(n).
And the outer loop looks to be simply O(n).
Multiplying both loop complexities gives O(n log(n)); can you confirm? Thanks in advance.
It's O(2^n)
For the outer loop, the number of iterations is n, so the inner loop executes for every value of i from 0 to n-1.
The number of iterations of the inner loop each time is 2^i, so the total number of iterations for the entire program is:
2^0 + 2^1 + 2^2 + 2^3 + 2^4 + 2^5 + ... +2^(n-1)
This sum is equal to 2^n - 1. Because 2^n is so large compared to 1, we can drop the 1 in the big-O notation, giving us O(2^n).
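Here is a direct C transcription of the pseudocode (my harness; power(2, i) is rendered as a bit shift, which assumes integer powers of two) that counts the inner-loop iterations and checks them against 2^n - 1:

#include <stdio.h>

int main(void)
{
    for (int n = 1; n <= 20; n++) {
        long long ticks = 0;
        int i = 0;
        while (i < n) {
            long long j = 0;
            while (j < (1LL << i)) {   /* j < power(2, i) */
                j++;
                ticks++;
            }
            i++;
        }
        /* the total should be exactly 2^n - 1 */
        printf("n = %2d  ticks = %8lld  2^n - 1 = %8lld\n",
               n, ticks, (1LL << n) - 1);
    }
    return 0;
}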
Using a formal methodology through Sigma notation:
It looks like O(2^n); the inner loop contributes a sum of 2^i from i = 0 to i = n-1, which comes to 2^n - 1 operations.
i โ† 0
while (i < n)
j โ† 0
while (j < power(2,i))
j โ† j + 1
done
i โ† i + 1
done
Time Complexity
Time = 2^0 + 2^1 + 2^2 + ... + 2^(n-1)
Time = SUM { 2^k | k = 0..(n-1) }
Time = 2^n - 1
Time ~ O(2^n)

Big O runtime for this algorithm?

Here's the pseudocode:
Baz(A) {
    big = −∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
}
So line 3 will be O(n) (n being the length of the array A).
I'm not sure what line 4 would be... I know it runs one fewer time on each pass, because i increases.
and I can't get line 6 without getting line 4...
All help is appreciated, thanks in advance.
Let us first understand how the first two for loops work:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
The first for loop runs from 1 to n (the length of array A), and the trip count of the second for loop depends on the value of i. So when i = 1, the second for loop runs n times. When i increments to 2, the second for loop runs (n - 1) times, and so it goes on, down to 1.
So the second for loop runs as follows:
n + (n - 1) + (n - 2) + (n - 3) + .... + 1 times...
You can use the formula sum(1 to N) = N * (N + 1) / 2, which gives (N^2 + N)/2. So the Big Oh for these two loops is
O(n^2) (Big Oh of n square)
Now let us consider the third loop as well...
Your third for loop looks like this:
for k = j to j + i - 1
But this actually means:
for k = 0 to i - 1 (you are just shifting the range of values by adding/subtracting j, but the number of times the loop runs does not change, since the difference between the bounds stays the same)
So the third loop runs i = 1 time for each of the n iterations of the second loop, then i = 2 times for each of its (n - 1) iterations, and so on.
So you get:
n + 2(n-1) + 3(n-2) + 4(n-3) + ...
= n + 2n - 2 + 3n - 6 + 4n - 12 + ...
<= n(1 + 2 + 3 + 4 + ... + n) (dropping the subtracted terms only increases the sum)
= N * (N(N+1)/2)
= O(N^3)
So your time complexity will be N^3 (Big Oh of n cube).
Hope this helps!
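If you want to verify the count empirically, here is one possible C version of Baz (a sketch: the ops out-parameter, the 0-based indexing of A, and the driver are my additions). The exact innermost count is n(n+1)(n+2)/6, as the sigma-notation answer below also derives, which is O(n^3):

#include <stdio.h>
#include <math.h>

/* C version of Baz; ops counts executions of the innermost line,
   and A is 0-based here while the pseudocode is 1-based */
static double Baz(const double A[], int n, long long *ops)
{
    double big = -INFINITY;
    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= n - i + 1; j++) {
            double sum = 0.0;
            for (int k = j; k <= j + i - 1; k++) {
                sum = sum + A[k - 1];   /* A(k) in the pseudocode */
                (*ops)++;
            }
            if (sum > big)
                big = sum;
        }
    }
    return big;
}

int main(void)
{
    double A[160] = {0};    /* the values don't affect the operation count */
    for (int n = 10; n <= 160; n *= 2) {
        long long ops = 0;
        Baz(A, n, &ops);
        printf("n = %3d  ops = %7lld  n(n+1)(n+2)/6 = %7lld\n",
               n, ops, (long long)n * (n + 1) * (n + 2) / 6);
    }
    return 0;
}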
Methodically, you can follow the steps using Sigma Notation:
Baz(A):
    big = −∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
For Big-O, you need to look at the worst-case scenario.
The easiest way to find the Big-O is to look at the most important parts of the algorithm, which here are the loops (in general, loops or recursion).
So we have this part of the algorithm, consisting of loops:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
        for k = j to j + i - 1
            sum = sum + A(k)
We have,
SUM { SUM { i } for j = 1 to n-i+1 } for i = 1 to n
= 1/6 n (n+1) (n+2)
= (1/6 n^2 + 1/6 n) (n + 2)
= 1/6 n^3 + 2/6 n^2 + 1/6 n^2 + 2/6 n
= 1/6 n^3 + 3/6 n^2 + 2/6 n
= 1/6 n^3 + 1/2 n^2 + 1/3 n
T(n) ~ O(n^3)
