Determine the computational complexity of the following algorithm

s = 0
for i = 1 to n^3 do
    for j = 1 to i do
        s = s + 1
What is meant by computational complexity?

Let's assume each operation in this code takes constant time. The assignment s = 0 contributes c1, the outer loop header is evaluated about n^3 times and contributes c2 * n^3, and for each i the inner loop header and the statement s = s + 1 each run i times, contributing (c3 + c4) * (1 + 2 + ... + n^3) = (c3 + c4) * n^3 * (n^3 + 1) / 2. Constant coefficients are insignificant because they only scale these counts, so this gives Θ(1) + Θ(n^3) + Θ(n^6) = Θ(n^6). The time complexity is therefore Θ(n^6), i.e. the algorithm runs in polynomial (sixth-power) time.

Just calculate some values (how many times s = s + 1 is executed for each iteration of i and sum up) and see what you get:
1 + 2 + 3 + ... + m = m * (m + 1) / 2 = m^2 / 2 + m / 2 for an outer-loop bound of m; here m = n^3, so the complexity is O(m^2) = O(n^6).
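To sanity-check that counting argument, here is a minimal Python sketch (mine, not from the original post; it assumes the outer bound really is n^3 as written) that brute-forces the number of increments and compares it with the closed form:

# Count how many times s = s + 1 runs and compare with m*(m+1)/2 for m = n^3.
def count_increments(n):
    s = 0
    m = n ** 3                      # outer loop bound as written in the question
    for i in range(1, m + 1):
        for j in range(1, i + 1):
            s += 1
    return s

for n in (2, 3, 4):
    m = n ** 3
    assert count_increments(n) == m * (m + 1) // 2   # Theta(n^6) growth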

Related

Determine the big O running time

I am struggling with this question and would like some help, thank you.
Determine the big O running time of the method myMethod() by counting the
approximate number of operations it performs. Show all details of your answer.
Note: the symbol % represents the modulus operator, that is, the remainder of a
number divided by another number.
static int myMethod(A[], n) {
    count ← 0
    for i ← 0 to n − 1 {
        for j ← i to n − 1 {
            count ← count + A[j]
            k ← 1
            while (k < n + 2) {
                if (j % 2 == 0) {
                    count = count + 1
                }
                k++
            }
        }
    }
    return count
}
The outer for loop with the variable i runs n times.
The inner for loop with the variable j runs n - i times for each iteration of the outer loop. This makes the inner loop run n + (n-1) + (n-2) + ... + 1 times in aggregate, which is equal to n * (n+1) / 2.
The while loop inside the inner for loop runs n + 1 times for each iteration of the inner for loop.
This makes the while loop run (n * (n+1) / 2) * (n + 1) times in total. This produces ((n^2 + n) / 2) * (n + 1) = (n^3 + n^2 + n^2 + n) / 2 = (n^3 + 2n^2 + n) / 2.
Dropping lower degrees of n and constants we get O(n^3).
You could also have argued that n + (n-1) + ... + 1 is O(n^2) iterations, each of which does a linear amount of work in the while loop, giving O(n^3). That argument would have been more intuitive and faster.
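For anyone who wants to check the arithmetic, here is a rough Python transcription of myMethod (my own sketch, not the asker's code) with a counter on the innermost while loop:

# Transcription of myMethod; ops counts iterations of the while loop.
def my_method_ops(A, n):
    ops = 0
    count = 0
    for i in range(0, n):
        for j in range(i, n):
            count += A[j]
            k = 1
            while k < n + 2:
                if j % 2 == 0:
                    count += 1
                k += 1
                ops += 1
    return ops

n = 10
print(my_method_ops([1] * n, n))      # 605
print(n * (n + 1) // 2 * (n + 1))     # 605 = (n(n+1)/2) * (n+1), cubic in n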

Solving a recurrence relation using Smoothness Rule

Consider this recurrence relation: x(n) = x(n/2) + n, for n > 1 and x(1) = 1.
Here the method of backward substitution struggles for values of n that are not powers of 2, so the standard advice is to use the smoothness rule for this type of question: solve for n = 2^k (i.e., for n a power of 2), which gives the solution x(n) = 2n - 1.
However, if we use the method of backward substitution anyway, the recurrence still seems to yield a solution!
x(n) = x(n/2) + n = x(n/4) + n/2 + n = x(n/8) + n/4 + n/2 + n = x(n/16) + n/8 + n/4 + n/2 + n = ....
where the pattern is
x(n) = x(n/i) + n/(i/2) + n/(i/4) + n/(i/8) + n/(i/16) + ...
which stops when n/i = 1 (i.e. when i = n), and in this case
x(n) = x(n/n) + n/(n/2) + n/(n/4) + n/(n/8) + n/(n/16) + ... = 1 + 2 + 4 + 8 + 16 + ... = 2^(n+1) - 1
which gives two different answers!
So I am really confused here, because the textbook (Introduction to the Design and Analysis of Algorithms by Anany Levitin) says we should use the smoothness rule, yet as you can see I have solved it exactly by the method of backward substitution, where the method was expected to struggle, and nothing went wrong!
The transition 1 + 2 + 4 + 8 + 16 + ... = 2^(n+1) - 1 is false.
That is because the series on the left has only about log n terms, not n: the terms are 1, 2, 4, ..., n = 2^(log n), so the sum is 2^(log n + 1) - 1, which is exactly 2n - 1.
The reason there are only log n + 1 terms is that n / 2^i = 1 (the smallest term of the series) when i = log n.
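A quick numeric check (a sketch of mine, assuming x(1) = 1 as in Levitin's exercise) confirms that backward substitution and the smoothness rule agree:

# x(n) = x(n/2) + n with x(1) = 1 should give x(n) = 2n - 1 for n = 2^k.
def x(n):
    return 1 if n == 1 else x(n // 2) + n

for k in range(0, 11):
    n = 2 ** k
    assert x(n) == 2 * n - 1          # 2^(log n + 1) - 1, not 2^(n + 1) - 1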

Formula for the sum of n + n/2 + n/3 + ... + n/n

So I have this algorithm and I need to calculate its time complexity.
It goes like this:
for i = 1 to n do
    k = i
    while (k <= n) do
        FLIP(A[k])
        k = k + i
where A is an array of booleans and FLIP just flips the current value, so it is O(1).
Now I understand that the inner while loop should be called
n/1+n/2+n/3+...+n/n
times in total, if I'm correct. But is there a formula for such a sum?
I'm pretty confused here.
A more careful count is T(n) = sum for i = 1 to n of (n - i)/i, because k starts from i. Hence the final sum is n + n/2 + ... + n/n - n = n(1 + 1/2 + ... + 1/n) - n, approximately. We know that 1 + 1/2 + ... + 1/n = H(n), the n-th harmonic number, and that H(n) = Θ(log(n)). Hence T(n) = Θ(n log(n)); the -n has no effect on the asymptotic cost, since n = o(n log(n)).
Let's say we want to calculate the sum
n + n / 2 + n / 3 + ... + n / n
=> n ( 1 + 1 / 2 + 1 / 3 + ..... + 1 / n )
The bracketed part (1 + 1/2 + 1/3 + ... + 1/n) is the well-known harmonic series, and I'm afraid there is no simple closed-form formula for it.
So the given problem boils down to evaluating this sum of the harmonic series.
Although the sum cannot be computed exactly, you can still find an asymptotic bound for it, which is O(log(n)).
Hence the answer to the above problem is O(n log(n)).
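If you want to convince yourself numerically, here is a small Python sketch (mine, not from the answers) that counts the FLIP calls directly and compares them with n * H(n):

from math import log

# Count the FLIP calls of the algorithm above for a given n.
def flip_count(n):
    calls = 0
    for i in range(1, n + 1):
        k = i
        while k <= n:
            calls += 1                # one FLIP(A[k]) per iteration
            k += i
    return calls

n = 10000
h = sum(1.0 / i for i in range(1, n + 1))               # harmonic number H(n)
print(flip_count(n), round(n * h), round(n * log(n)))   # all of order n log n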

Basic randomized algorithm recurrence

I'm having trouble fully understanding how to write the recurrence for the expected running time of a randomized algorithm.
I believe I'm doing it correctly, but if someone could look over it, that'd be a huge help.
Here's the pseudocode for the algorithm:
printIntegers(A, n)    // an array A of integers is the input, with n integers
    if A.length > 0
        for i = 1 to n
            print A[i]
        randInt = rand(1, 10)
        if randInt != 10
            return
        else
            printIntegers(A, n-1)
The only random part is the random generator between 1 and 10. I'm trying to understand how that would translate in the recurrence.
I'm thinking:
T(n) = O(n)              if randInt != 10   (probability 9/10)
T(n) = T(n-1) + O(n)     if randInt == 10   (probability 1/10)
T(n-2) + O(n)
....
T(0) + O(n)
This makes sense in my head, and then the expected running time would be O(n). Am I approaching this correctly?
Note that the stopping condition should check n, not A.length, since the latter does not change across the recursive calls.
At each level the recursive call is made with probability 0.1: if the random number generator is truly uniform, the number 10 comes up 1/10 of the time, and for such an indicator event the expected number of calls equals that probability. Likewise, the probability that no recursive call is made is 0.9. The O(n) work is done in both cases, so, taking expectations, the equation is:
T(n) = (0.9 + 0.1) * O(n) + 0.1 * T(n-1)
= O(n) + 0.1 * T(n-1)
= O(n) + 0.1 * (O(n-1) + 0.1 * T(n-2))
= O(n) + 0.1 * O(n-1) + 0.1^2 * O(n-2) +...
= O(n) * (1 + 0.1 + 0.1^2 + ... + 0.1^(n-2)) + 0.1^(n-1) * T(1)
= O(n) * (1 - 0.1^(n-1)) / 0.9 + K
The above is O(n * (1 - 0.1^(n-1)) / 0.9), which is essentially the same as O(n).
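Here is a small Monte Carlo sketch (my own check, not part of the original answer) that simulates the algorithm and averages the amount of printing work; the average stays close to 1.1 * n, consistent with an expected running time of O(n):

import random

# Iterative simulation of printIntegers: each level prints n items, then
# continues with n - 1 with probability 1/10.
def run(n):
    ops = 0
    while n > 0:
        ops += n
        if random.randint(1, 10) != 10:
            break                     # probability 9/10: stop
        n -= 1                        # probability 1/10: recurse
    return ops

for n in (10, 100, 1000):
    avg = sum(run(n) for _ in range(20000)) / 20000
    print(n, avg / n)                 # ratio is roughly 1.1, independent of n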
First note that:
T(n) = n + (n-1)/10 + (n-2)/10^2 + ... + 1/10^{n-1}
Then, bounding T(n) above:
T(n) = n + (n-1)/10 + (n-2)/10^2 + ... + 1/10^{n-1}
< n + n/10 + n/10^2 + ... + n/10^{n-1}
= n(1 + 1/10 + ... + 1/10^{n-1})
< n(1 + 1/10 + 1/10^2 + ...)
= n/(1 - 1/10) = 10n/9
and bounding it below:
T(n) = n + (n-1)/10 + (n-2)/10^2 + ... + 1/10^{n-1}
> n
So n < T(n) < 10n/9 and T(n) is Theta(n).
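A short check of these bounds (a sketch of mine, using the recurrence T(n) = n + T(n-1)/10 with T(0) = 0, which generates exactly the sum above):

# Evaluate T(n) = n + (n-1)/10 + (n-2)/10^2 + ... + 1/10^(n-1) numerically.
def T(n):
    t = 0.0
    for m in range(1, n + 1):
        t = m + t / 10.0
    return t

for n in (2, 5, 50, 500):
    assert n < T(n) < 10 * n / 9      # the bounds derived above
    print(n, T(n) / n)                # the ratio tends to 10/9, about 1.11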

About the time complexity algorithm and asymptotic growth

I've got a question about the time complexity of an algorithm and its asymptotic growth.
The pseudo code of question is
1: function NAIVE(x, A)
2:     answer = 0
3:     n = length of A
4:     for I from - to n do
5:         aux = 1
6:         for j from 1 to I do
7:             aux = aux * x
8:         answer = answer + aux * A[I]
9:     return answer
I have to find an upper bound with O-notation and a lower bound with Ω-notation.
I got the time complexity f(n) = 5n^2 + 4n + 8 and g(n) = n^2.
My question is that I'm not sure about the running time of lines 6 to 8.
I got constant 2 and time n + 1 for line 4, and constant 1 and time 1 for line 5.
I'm stuck after that. I tried it and got constant 2 and time n^2 + 1 for line 6, because it runs inside the outer for loop (a for loop within a for loop), so I thought it's n^2 + 1. Is that correct?
And for line 8, it has 3 constants and its running time is n^2. Is this correct? I'm not too sure about line 8. That is how I got f(n) = 5n^2 + 4n + 8!
Please help me complete this question!
I would like to know whether my work is right or not.
Thank you
Let's go through this step by step.
The complexity of line 7 T7 = 1.
The complexity of the surrounding for will be T6(I) = I * T7 = I.
The complexity of line 5 T5 = 1.
The complexity of line 8 T8 = 1.
The complexity of the surrounding for (assuming that - stands for 0) is
T4(n) = Sum{I from 0 to n} (T5 + T6(I) + T8)
= Sum{I from 0 to n}(2 + I)
= Sum{I from 0 to n}(2) + Sum{I from 0 to n}(I)
= (n + 1) * 2 + (n+1)/2 * (0 + n)
= 2n + 2 + n^2/2 + n/2
= 1/2 n^2 + 5/2 n + 2
The complexity of the remaining lines is T2 = T3 = T9 = 1.
The complexity of the entire algorithm is
T(n) = T2 + T3 + T4(n) + T9
= 1 + 1 + 1/2 n^2 + 5/2 n + 2 + 1
= 1/2 n^2 + 5/2 n + 5
This runtime is in the complexity classes O(n^2) and Ω(n^2).
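Finally, a compact Python sketch of NAIVE (mine; it reads the '-' in line 4 as 0, indexes A from 0, and stops at n - 1 to stay inside the array, in line with the analysis above) with a counter on the multiplication in line 7, the dominant operation:

# NAIVE evaluates A[0] + A[1]*x + ... + A[n-1]*x^(n-1); ops counts line 7.
def naive(x, A):
    answer = 0
    ops = 0
    for I in range(len(A)):
        aux = 1
        for j in range(1, I + 1):
            aux = aux * x             # line 7: executed I times
            ops += 1
        answer = answer + aux * A[I]
    return answer, ops

value, ops = naive(2, [1, 1, 1, 1, 1])
print(value, ops)    # 31 = 1 + 2 + 4 + 8 + 16, and 10 = n(n-1)/2 multiplications for n = 5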

Resources