Can you give the asymptotic analysis of this?
i = 1;
k = 1;
while (k < n) {
    k = k + i;
    i = i + 1;
}
I have tried analysing it, but I got stuck on the loop.
Let m be the number of the iteration; then the values of k and i evolve as follows:
m   k           i
0   1           1
1   1+1         2
2   1+1+2       3
3   1+1+2+3     4
4   1+1+2+3+4   5
So we see that k is 1 + (1 + 2 + ... + m).
This sum is a triangular number, and so:
   k = 1 + m(m+1)/2
And if m is the total number of iterations, then:
   1 + m(m-1)/2 < n ≤ 1 + m(m+1)/2.
So n is O(m^2), and thus m is O(√n).
The number of iterations m is the measure of the running time, so the complexity is O(√n).
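If you want to sanity-check this, here is a small Python sketch (my addition, not part of the original code) that counts the iterations; since 1 + m(m+1)/2 ≈ n at exit, the ratio m/√n should settle near √2 ≈ 1.414:

import math

def count_iterations(n):
    # Simulate the loop from the question, counting its iterations.
    i, k, m = 1, 1, 0
    while k < n:
        k = k + i
        i = i + 1
        m += 1
    return m

for n in [10**2, 10**4, 10**6, 10**8]:
    print(n, count_iterations(n), count_iterations(n) / math.sqrt(n))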
I am struggling with this question and would like some help, thank you.
Determine the big O running time of the method myMethod() by counting the
approximate number of operations it performs. Show all details of your answer.
Note: the symbol % represents the modulus operator, that is, the remainder of one
number divided by another.
static int myMethod(A[], n) {
    count ← 0
    for i ← 0 to n - 1 {
        for j ← i to n - 1 {
            count ← count + A[j]
            k ← 1
            while (k < n + 2) {
                if (k % 2 == 0) {
                    count = count + 1
                }
                k++
            }
        }
    }
    return count
}
The outer for loop with the variable i runs n times.
The inner for loop with the variable j runs n - i times for each iteration of the outer loop. This makes the inner loop run n + (n-1) + (n-2) + ... + 1 times in aggregate, which equals n(n+1)/2.
The while loop inside the inner for loop runs n + 1 times for each iteration of the inner for loop.
This makes the while loop body run (n(n+1)/2) · (n+1) times in total. This produces (n^2 + n)/2 · (n + 1) = (n^3 + 2n^2 + n)/2.
Dropping lower-degree terms and constant factors, we get O(n^3).
You could also have argued that n + (n-1) + ... + 1 is O(n^2), and that multiplying by a linear amount of work per term gives O(n^3), which is more intuitive and faster.
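As a sanity check, here is a rough Python transcription of myMethod (my sketch; it only counts how many times the innermost statement runs, ignoring the array accesses) that matches the closed form above:

def my_method_ops(n):
    # Count executions of the innermost statement of myMethod.
    ops = 0
    for i in range(n):           # outer for: n iterations
        for j in range(i, n):    # inner for: n - i iterations
            k = 1
            while k < n + 2:     # while: n + 1 iterations
                ops += 1
                k = k + 1
    return ops

for n in [5, 10, 50]:
    assert my_method_ops(n) == n * (n + 1) // 2 * (n + 1)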
I have:
   f(n) = 2^(n+5) + n^2
   g(n) = 2^(n+1) - 1
I must show whether:
   f(n) = Ω(g(n)) and/or
   f(n) = O(g(n))
I know that you don't need to worry about the n^2 in f(n) or the -1 in g(n), because 2^(n+5) and 2^(n+1) have the higher growth. But I'm not really sure how to find the lower and the upper bound.
My approach would be to say that the +5 in f(n) and the +1 in g(n) don't change anything about the complexity, which means that both of the statements above are true and f(n) = Θ(g(n)). But I have no way to prove this.
We have
   f(n) = 2^(n+5) + n^2
   g(n) = 2^(n+1) - 1
f(n) = Ω(g(n)) is true when we can find a constant c such that f(n) ≥ c·g(n) for all n greater than some chosen n0.
We see that even with c = 1 and n0 = 0 this is true, since 2^(n+5) = 16·2^(n+1) is already larger than 2^(n+1) - 1.
f(n) = O(g(n)) is true when we can find a constant c such that f(n) ≤ c·g(n) for all n greater than some chosen n0:
   2^(n+5) + n^2 ≤ c(2^(n+1) - 1)
Let's choose c = 2^5 = 32; then we must show that for large enough n:
   2^(n+5) + n^2 ≤ 32(2^(n+1) - 1)
   2^(n+5) + n^2 ≤ 2·2^(n+5) - 32
   n^2 ≤ 2^(n+5) - 32
We can see that this is true for all n ≥ 1.
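You can also check both bounds numerically for small n; this little Python sketch (my addition, a spot check rather than a proof) exercises the constants chosen above:

f = lambda n: 2 ** (n + 5) + n ** 2
g = lambda n: 2 ** (n + 1) - 1

for n in range(64):
    assert f(n) >= 1 * g(n)     # the Omega bound, with c = 1 and n0 = 0
    assert f(n) <= 32 * g(n)    # the O bound, with c = 2^5 = 32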
I am studying algorithm complexity, and I have a question about the difference between the following two algorithms:
Algorithm #1:
sum = 0
i = 1
while (i < n) {
    for j = 1 to i {
        sum = sum + 1
    }
    i = i*2;
}
return sum
Algorithm #2:
sum = 0
i = 1
while (i < n) {
    for j = 1 to n {
        sum = sum + 1
    }
    i = i*2;
}
return sum
The only difference is the 'for' loop, but what is the difference between the time complexities of these algorithms? When do I have to multiply and when do I add the complexities of nested loops?
Let's say for simplicity that n is a power of 2, i.e. n = 2^k. Then it is clear that the outer loop runs k times (i takes the values 1, 2, 4, ..., 2^(k-1)), and on each step the inner loop runs:
1 to 1, i.e. 2^0 times
1 to 2, i.e. 2^1 times
1 to 4, i.e. 2^2 times
...
1 to 2^(k-1), i.e. 2^(k-1) times
So we just need to find the sum 2^0 + 2^1 + ... + 2^(k-1).
This is known to be 2^k - 1 = n - 1.
So, omitting constants, we get O(n).
The second one is simpler: the outer loop runs log(n) times and the inner loop runs n times, so you get O(n·log(n)).
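To make the difference concrete, here is a small Python sketch (function and parameter names are mine) that counts how many times sum = sum + 1 executes in each algorithm:

def count_ops(n, inner_runs_to_i):
    # inner_runs_to_i=True models Algorithm #1 (for j = 1 to i),
    # False models Algorithm #2 (for j = 1 to n).
    ops, i = 0, 1
    while i < n:
        ops += i if inner_runs_to_i else n
        i = i * 2
    return ops

for n in [2**10, 2**16, 2**20]:
    print(n, count_ops(n, True), count_ops(n, False))
    # Algorithm #1 gives n - 1 (O(n)); Algorithm #2 gives n * log2(n) (O(n log n))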
// n > 0
i ← 0
while (i < n)
    j ← 0
    while (j < power(2,i))
        j ← j + 1
    done
    i ← i + 1
done
Is the overall complexity O(n·log(n))? The inner while loop's bound is 2^i, so the counts are 2^0, 2^1, 2^2, ... = 1, 2, 4, 8, 16, 32, 64, 128, etc., and for 2^i < n we get i < log(n).
And the outer loop looks to be simply O(n).
Multiplying both loop complexities gives O(n·log(n)). Can someone confirm? Thanks in advance.
It's O(2^n)
For the outer loop, the number of iterations is n, so the inner loop executes for every value of i from 0 to n-1.
The number of iterations of the inner loop each time is 2^i, so the total number of iterations for the entire program is:
2^0 + 2^1 + 2^2 + 2^3 + 2^4 + 2^5 + ... +2^(n-1)
This sum is equal to 2^n - 1. Because 2^n is so large compared to 1, we can drop the 1 in the big-O notation, giving us O(2^n).
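A direct Python transcription (my addition) confirms the count:

def total_inner_steps(n):
    # Count iterations of the inner while loop in the pseudocode above.
    ops, i = 0, 0
    while i < n:
        j = 0
        while j < 2 ** i:
            j = j + 1
            ops += 1
        i = i + 1
    return ops

for n in range(1, 15):
    assert total_inner_steps(n) == 2 ** n - 1   # geometric series 2^0 + ... + 2^(n-1)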
Using a formal methodology through Sigma notation:
It looks like O(2^n): the inner loop contributes a sum of 2^i for i = 0 to n-1, which comes to 2^n - 1 operations.
i ← 0
while (i < n)
    j ← 0
    while (j < power(2,i))
        j ← j + 1
    done
    i ← i + 1
done
Time Complexity
The inner while loop runs 2^i times on the i-th pass of the outer loop, so:
Time = 2^0 + 2^1 + ... + 2^(n-1)
Time = SUM { 2^k | k = 0..(n-1) }
Time = 2^n - 1
Time ~ O(2^n)
Here's the pseudocode:
Baz(A) {
  big = -∞
  for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
      sum = 0
      for k = j to j + i - 1
        sum = sum + A(k)
      if sum > big
        big = sum
  return big
}
So line 3 will be O(n) (n being the length of the array, A)
I'm not sure what line 4 would be... I know its count decreases by 1 each time it runs, because i increases.
and I can't get line 6 without getting line 4...
All help is appreciated, thanks in advance.
Let us first understand how the first two for loops work:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
The first for loop runs from 1 to n (the length of array A), and the number of iterations of the second for loop depends on the value of i. So when i = 1 the second for loop runs n times; when i increments to 2 it runs (n - 1) times; and so on down to 1.
So your second for loop runs as follows:
n + (n - 1) + (n - 2) + (n - 3) + ... + 1 times.
You can use the formula sum(1 to N) = N(N + 1)/2, which gives (N^2 + N)/2. So the big O for these two loops is
O(n^2) (big O of n squared).
Now let us consider the third loop as well...
Your third for loop looks like this:
for k = j to j + i - 1
But this actually means:
for k = 0 to i - 1 (you are just shifting the range of values by adding/subtracting j, but the number of iterations does not change, since the difference remains the same)
So the third loop runs i times per iteration of the second loop: 1 time for each of the n iterations with i = 1, then 2 times for each of the (n - 1) iterations with i = 2, and so on.
So you get:
n + 2(n-1) + 3(n-2) + 4(n-3) + ...
= n + 2n - 2 + 3n - 6 + 4n - 12 + ...
= n(1 + 2 + 3 + 4 + ...) - (a sum of positive terms, which only makes the total smaller)
≤ n · n(n+1)/2
= O(n^3)
So your time complexity is O(n^3) (big O of n cubed).
Hope this helps!
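If you want to double-check this empirically, here is a rough Python sketch (my addition) that counts executions of the innermost statement; it agrees with the exact closed form n(n+1)(n+2)/6 derived in the Sigma-notation answer below:

def baz_ops(n):
    # Count executions of "sum = sum + A(k)" for an array of length n.
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, n - i + 2):    # j = 1 to n - i + 1
            for k in range(j, j + i):    # k = j to j + i - 1: i steps
                ops += 1
    return ops

for n in [5, 10, 40]:
    assert baz_ops(n) == n * (n + 1) * (n + 2) // 6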
Methodically, you can follow the steps using Sigma Notation:
Baz(A):
    big = -∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
For Big-O, you need to look at the worst-case scenario.
Also, the easiest way to find the Big-O is to look at the most important parts of the algorithm; these are usually loops or recursion.
So we have this part of the algorithm, consisting of the loops:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
        for k = j to j + i - 1
            sum = sum + A(k)
We have,
SUM { SUM { i } for j = 1 to n-i+1 } for i = 1 to n
= 1/6 n (n+1) (n+2)
= (1/6 n^2 + 1/6 n)(n + 2)
= 1/6 n^3 + 2/6 n^2 + 1/6 n^2 + 2/6 n
= 1/6 n^3 + 3/6 n^2 + 2/6 n
= 1/6 n^3 + 1/2 n^2 + 1/3 n
T(n) ~ O(n^3)
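If you'd like to double-check the algebra, a short sketch with the sympy library (assuming you have it installed; this snippet is my addition, not part of the original answer) reproduces the closed form:

from sympy import symbols, summation, factor

i, j, n = symbols('i j n', integer=True, positive=True)

# The innermost statement costs i for each (i, j) pair; sum over j, then over i:
total = summation(summation(i, (j, 1, n - i + 1)), (i, 1, n))
print(factor(total))   # n*(n + 1)*(n + 2)/6, i.e. O(n^3)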