Asymptotic analysis of a program

Can you give the asymptotic analysis of this?
i = 1;
k = 1;
while (k < n) {
    k = k + i;
    i = i + 1;
}
I have tried analysing it, but I got stuck on the loop.

Let π‘š be the number of the iteration, then the values of π‘˜ and 𝑖 evolve as follows:
π‘š
π‘˜
𝑖
0
1
1
1
1+1
2
2
1+1+2
3
3
1+1+2+3
4
4
1+1+2+3+4
5
So we see that π‘˜ is 1 + βˆ‘π‘šπ‘–=1 𝑖
This sum is a triangular number, and so:
Β  Β  Β  π‘˜ = 1 + π‘š(π‘š+1)/2
And if we fix π‘š to the total number of iterations, then:
Β  Β  Β  1 + π‘š(π‘š-1)/2 < 𝑛 ≀ 1 + π‘š(π‘š+1)/2.
So 𝑛 is O(π‘šΒ²), and thus π‘š is O(βˆšπ‘›).
The number of iterations π‘š is a measure of the complexity, so it is O(βˆšπ‘›).
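Not part of the original answer, but the bound is easy to sanity-check empirically. A small Python sketch that counts the iterations directly; since π‘˜ β‰ˆ π‘šΒ²/2 after π‘š iterations, the ratio π‘š/βˆšπ‘› should settle near √2 β‰ˆ 1.414:

import math

def loop_iterations(n):
    """Count iterations of: i = 1; k = 1; while (k < n) { k += i; i += 1; }"""
    i = k = 1
    m = 0
    while k < n:
        k += i
        i += 1
        m += 1
    return m

# The ratio m / sqrt(n) should approach sqrt(2), consistent with m = O(sqrt(n)).
for n in [100, 10_000, 1_000_000, 100_000_000]:
    m = loop_iterations(n)
    print(n, m, round(m / math.sqrt(n), 3))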

Determine the big O running time

I am struggling with this question and would like some help, thank you.
Determine the big O running time of the method myMethod() by counting the
approximate number of operations it performs. Show all details of your answer.
Note: the symbol % represents the modulus operator, that is, the remainder of one
number divided by another.
π‘ π‘‘π‘Žπ‘‘π‘–π‘ 𝑖𝑛𝑑 π‘šπ‘¦π‘€π‘’π‘‘β„Žπ‘œπ‘‘(𝐴[], 𝑛) {
π‘π‘œπ‘’π‘›π‘‘ ← 0
π‘“π‘œπ‘Ÿ 𝑖 ← 0 π‘‘π‘œ 𝑛 βˆ’ 1 {
π‘“π‘œπ‘Ÿ 𝑗 ← 𝑖 π‘‘π‘œ 𝑛 βˆ’ 1 {
π‘π‘œπ‘’π‘›π‘‘ ← π‘π‘œπ‘’π‘›π‘‘+ 𝐴[𝑗]
π‘˜ ← 1
π‘€β„Žπ‘–π‘™π‘’ (π‘˜ < 𝑛 + 2) {
𝑖𝑓 (𝑗%2 == 0) {
π‘π‘œπ‘’π‘›π‘‘ = π‘π‘œπ‘’π‘›π‘‘ + 1
}
π‘˜ + +
}
}
}
π‘Ÿπ‘’π‘‘π‘’π‘Ÿπ‘› π‘π‘œπ‘’π‘›π‘‘
}
The outer for loop with the variable i runs n times.
The inner for loop with the variable j runs n - i times for each iteration of the outer loop. In aggregate, the inner loop therefore runs n + (n-1) + (n-2) + ... + 1 times, which equals n * (n+1) / 2.
The while loop inside the inner for loop runs n + 1 times for each iteration of the inner for loop (k takes the values 1 through n + 1).
This makes the while loop run (n * (n+1) / 2) * (n + 1) times in total, which is (n^2 + n) / 2 * (n + 1) = (n^3 + 2n^2 + n) / 2.
Dropping the lower-degree terms and the constants, we get O(n^3).
You could also have argued that n + (n-1) + ... + 1 is O(n^2), and each of those iterations does a linear amount of work, giving O(n^3). That would have been more intuitive and faster.
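As a cross-check (my addition, not part of the original answer), a short Python sketch can count the innermost while-loop executions directly and compare them with the closed form n(n+1)^2 / 2 = (n^3 + 2n^2 + n) / 2:

def while_body_executions(n):
    """Count how many times the innermost while-loop body of myMethod runs."""
    total = 0
    for i in range(n):              # for i ← 0 to n - 1
        for j in range(i, n):       # for j ← i to n - 1
            k = 1
            while k < n + 2:        # runs n + 1 times per inner iteration
                total += 1
                k += 1
    return total

for n in [1, 2, 5, 10, 20]:
    exact = n * (n + 1) ** 2 // 2   # (n^3 + 2n^2 + n) / 2
    print(n, while_body_executions(n), exact)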

Big O Notation of the terms f(n) = 2^(n+5) + n^2 and g(n) = 2^(n+1) - 1

I have:
𝑓(𝑛) = 2^(𝑛+5) + 𝑛^2
𝑔(𝑛) = 2^(𝑛+1) βˆ’ 1
I must show whether:
𝑓(𝑛) = Ξ©(𝑔(𝑛)) and/or
𝑓(𝑛) = O(𝑔(𝑛))
I know that you don't need to acknowledge the 𝑛^2 in 𝑓(𝑛) or the βˆ’1 in 𝑔(𝑛), because 2^(𝑛+5) and 2^(𝑛+1) are the dominant terms. But I'm not really sure how to find the lower and the upper bound.
My approach would be to say that the +5 in 𝑓(𝑛) and the +1 in 𝑔(𝑛) don't change anything about the complexity, which means that both of the above statements are true and 𝑓(𝑛) = Θ(𝑔(𝑛)). But I have no way to prove this.
We have
𝑓(𝑛) = 2^(𝑛+5) + 𝑛^2
𝑔(𝑛) = 2^(𝑛+1) βˆ’ 1
𝑓(𝑛) = Ξ©(𝑔(𝑛)) is true when we can find a π‘˜ such that 𝑓(𝑛) β‰₯ π‘˜β‹…π‘”(𝑛) for all 𝑛 greater than a chosen 𝑛0.
We see that even with π‘˜ = 1 and 𝑛0 = 0 this is true, since 2^(𝑛+5) = 16Β·2^(𝑛+1) > 2^(𝑛+1) βˆ’ 1.
𝑓(𝑛) = O(𝑔(𝑛)) is true when we can find a π‘˜ such that 𝑓(𝑛) ≀ π‘˜β‹…π‘”(𝑛) for all 𝑛 greater than a chosen 𝑛0:
      2^(𝑛+5) + 𝑛^2 ≀ π‘˜(2^(𝑛+1) βˆ’ 1)
Let's choose π‘˜ = 2^5 = 32; then we must show that for large enough 𝑛:
      2^(𝑛+5) + 𝑛^2 ≀ 2^5 (2^(𝑛+1) βˆ’ 1)
      2^(𝑛+5) + 𝑛^2 ≀ 2β‹…2^(𝑛+5) βˆ’ 32
      𝑛^2 ≀ 2^(𝑛+5) βˆ’ 32
We can see that this is true for all 𝑛 β‰₯ 0, since 𝑛^2 grows far more slowly than 2^(𝑛+5).
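A quick numerical spot-check in Python (mine, not part of the original answer; the algebra above is what actually covers all 𝑛):

# Spot-check f(n) <= k*g(n) with k = 2**5, and f(n) >= g(n), for n = 0..63.
k = 2 ** 5
for n in range(64):
    f = 2 ** (n + 5) + n ** 2
    g = 2 ** (n + 1) - 1
    assert g <= f <= k * g, f"fails at n = {n}"
print("f(n) = Omega(g(n)) and f(n) = O(g(n)) hold on the checked range")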

Complexity of two algorithms

I am studying algorithms complexity, and I have a question about the difference between the following two algorithms:
Algorithm #1:
sum = 0
i = 1
while (i < n) {
    for j = 1 to i {
        sum = sum + 1
    }
    i = i * 2;
}
return sum
Algorithm #2:
sum = 0
i = 1
while (i < n) {
    for j = 1 to n {
        sum = sum + 1
    }
    i = i * 2;
}
return sum
The only difference is the bound on the 'for' loop, but what is the difference between the time complexities of these algorithms? When do I have to multiply the complexities of nested loops, and when do I have to add them?
Let's say for simplicity that n is a power of 2, i.e. n = 2^k. Then the outer loop clearly runs k times, and on each step the inner loop runs:
1 to 1, i.e. 2^0 times
1 to 2, i.e. 2^1 times
1 to 4, i.e. 2^2 times
...
1 to 2^(k-1), i.e. 2^(k-1) times
So we just need to find the sum 2^0 + 2^1 + ... + 2^(k-1).
This geometric sum is 2^k - 1 = n - 1.
So, omitting constants, we get O(n).
The second one is simple: the outer loop runs log(n) times and the inner loop n times each, so you get O(n*log(n)).
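To make the difference concrete (my addition), both algorithms can be run as-is in Python; the returned sum is exactly the number of inner-loop iterations, so for n a power of two, Algorithm #1 reports n - 1 while Algorithm #2 reports n*log2(n):

import math

def algo1(n):
    """Algorithm #1: inner loop runs i times; i doubles each pass."""
    total, i = 0, 1
    while i < n:
        for _ in range(i):      # for j = 1 to i
            total += 1
        i *= 2
    return total

def algo2(n):
    """Algorithm #2: inner loop always runs n times."""
    total, i = 0, 1
    while i < n:
        for _ in range(n):      # for j = 1 to n
            total += 1
        i *= 2
    return total

# Compare the measured counts with n - 1 and n * log2(n).
for n in [2 ** 5, 2 ** 10, 2 ** 15]:
    print(n, algo1(n), algo2(n), n - 1, n * int(math.log2(n)))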

Determining Big-O of a While Loop

// n > 0
i ← 0
while (i < n)
    j ← 0
    while (j < power(2, i))
        j ← j + 1
    done
    i ← i + 1
done
Is the overall complexity O(n·log(n))? The inner while loop has the condition j < 2^i, so its bound goes 2^0, 2^1, 2^2, ... = 1, 2, 4, 8, 16, 32, 64, 128, ... and so on. So for 2^i < n we get i < log(n)?
And the outer loop looks to be simply O(n).
Multiply both loop complexities for O(n·log(n)). Can you confirm? Thanks in advance.
It's O(2^n)
For the outer loop, the number of iterations is n, so the inner loop executes for every value of i from 0 to n-1.
The number of iterations of the inner loop each time is 2^i, so the total number of iterations for the entire program is:
2^0 + 2^1 + 2^2 + 2^3 + 2^4 + 2^5 + ... +2^(n-1)
This sum is equal to 2^n - 1. Because 2^n is so large compared to 1, we can drop the 1 in the big-O notation, giving us O(2^n).
Using a formal methodology through Sigma notation: the inner loop contributes 2^i iterations on outer step i, so the total is the sum of 2^i from i = 0 to n - 1, which is 2^n - 1 operations. So it looks like O(2^n).
Time Complexity
Time = 2^0 + 2^1 + ... + 2^(n-1)
Time = SUM { 2^k | k = 0..(n-1) }
Time = 2^n - 1
Time ~ O(2^n)
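Here is a direct Python transcription (my addition, with power(2, i) written as 2**i) that tallies the inner-loop iterations and confirms the 2^n - 1 count:

def total_inner_iterations(n):
    """Transcription of the pseudocode; counts every j ← j + 1 step."""
    total = 0
    i = 0
    while i < n:
        j = 0
        while j < 2 ** i:       # power(2, i)
            j += 1
            total += 1
        i += 1
    return total

for n in range(1, 11):
    print(n, total_inner_iterations(n), 2 ** n - 1)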

Big O runtime for this algorithm?

Here's the pseudocode:
Baz(A) {
    big = βˆ’βˆž
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
}
So line 3 will be O(n) (n being the length of the array, A)
I'm not sure what line 4 would be... I know its bound decreases by 1 each time it runs, because i increases.
And I can't get line 6 without getting line 4...
All help is appreciated, thanks in advance.
Let us first understand how the first two for loops work:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
The first for loop runs from 1 to n (the length of array A), and the second for loop depends on the value of i. So when i = 1 the second for loop runs n times; when i increments to 2, the second for loop runs (n - 1) times; and so on, down to 1 time.
So your second for loop runs as follows:
n + (n - 1) + (n - 2) + (n - 3) + ... + 1 times.
You can use the formula sum(1 to N) = N * (N + 1) / 2, which gives (N^2 + N) / 2. So the Big Oh for these two loops is
O(n^2) (Big Oh of n squared)
Now let us consider the third loop as well.
Your third for loop looks like this:
for k = j to j + i - 1
But this actually means:
for k = 0 to i - 1 (you are just shifting the range of values by adding/subtracting j, but the number of times the loop runs does not change, as the difference remains the same)
So the third loop runs once for each of the n iterations of the second loop (when i = 1), then twice for each of its (n - 1) iterations (when i = 2), and so on.
So you get:
n + 2(n - 1) + 3(n - 2) + 4(n - 3) + ... + n * 1
= n + 2n - 2 + 3n - 6 + 4n - 12 + ...
= n(1 + 2 + 3 + ... + n) - (a sum of positive numbers)
≀ n * (n(n + 1) / 2)
= O(n^3)
So your time complexity will be n^3 (Big Oh of n cubed).
Hope this helps!
Methodically, you can follow the steps using Sigma notation.
For Big-O, you need to look at the worst-case scenario.
The easiest way to find the Big-O is to look at the most important parts of the algorithm; these are usually its loops or recursion.
So we have this part of the algorithm, consisting of loops:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
        for k = j to j + i - 1
            sum = sum + A(k)
We have (the innermost loop contributes i operations for each j):
SUM { SUM { i } for j = 1 to n - i + 1 } for i = 1 to n
= 1/6 n (n+1) (n+2)
= 1/6 (n^2 + n)(n + 2)
= 1/6 (n^3 + 2n^2 + n^2 + 2n)
= 1/6 n^3 + 1/2 n^2 + 1/3 n
T(n) ~ O(n^3)
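As an empirical check (my own transcription, not part of either answer), the pseudocode can be run in Python with a counter on the innermost statement; since the pseudocode indexes A from 1, A(k) becomes A[k - 1] on a 0-based list. The count matches n(n+1)(n+2)/6 exactly:

def baz_with_count(A):
    """Python transcription of Baz that also counts innermost operations."""
    n = len(A)
    big = float("-inf")
    ops = 0
    for i in range(1, n + 1):            # for i = 1 to length(A)
        for j in range(1, n - i + 2):    # for j = 1 to length(A) - i + 1
            s = 0
            for k in range(j, j + i):    # for k = j to j + i - 1
                s += A[k - 1]            # pseudocode indexes A from 1
                ops += 1
            if s > big:
                big = s
    return big, ops

for n in [1, 2, 5, 10, 15]:
    _, ops = baz_with_count(list(range(n)))
    print(n, ops, n * (n + 1) * (n + 2) // 6)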
