What is the running time for this function? - algorithm

I have 3 questions about this function.
Sum = 0
MyFunction (N)
    M = 1,000,000
    If (N > 1)
        For I = 1 to M do
            Sum = 0
            J = 1
            Do
                Sum = Sum + J
                J = J + 2
            While J < N
        End For
        If (MyFunction(N / 2) % 3 == 0)
            Return (2 * MyFunction(N / 2))
        Else
            Return (4 * MyFunction(N / 2))
        End If
    Else
        Return 1
    End If
End MyFunction
First question is : What's the Complexity of the non-recursive part of code?
I think the non recursive part is that loop
For I = 1 to M do
    Sum = 0
    J = 1
    Do
        Sum = Sum + J
        J = J + 2
    While J < N
End For
and my answer is M * log(n), but my slides say it's not M * log(n)!
I need an explanation for this.
The second question is: What's the correct recurrence for the previous code of MyFunction?
when I saw these lines of code
If (MyFunction(N / 2) % 3 == 0)
    Return (2 * MyFunction(N / 2))
Else
    Return (4 * MyFunction(N / 2))
End If
I think that it's T(n) = T(n/2) + Theta(non-recursive part),
because the if statement will execute only one of the 2 calls.
Again this answer is wrong.
The third one is: What's the complexity of MyFunction?
My answer, based on the first 2 questions, is T(n) = T(n/2) + M * lg n,
and the total running time is M * lg n.

Let's look at this one piece at a time.
First, here's the non-recursive part of the code:
For I = 1 to M do
    Sum = 0
    J = 1
    Do
        Sum = Sum + J
        J = J + 2
    While J < N
End For
The outer loop will run Θ(M) times. Since M is a fixed constant (one million), the loop will run Θ(1) times.
Inside the loop, the inner while loop will run Θ(N) times, since on each iteration J increases by two and stops as soon as J meets or exceeds N. Therefore, the total work done by this loop nest is Θ(N): Θ(N) work Θ(1) times.
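If it helps to see this concretely, here is a small Python sketch (an illustration, not part of the original pseudocode) that counts how many times the Do/While body runs for a single pass of the For loop and then scales by M; the helper name inner_iterations is made up for this sketch.

# Count how many times the Do/While body runs for one pass of the For loop.
# The body always runs at least once, J starts at 1 and increases by 2,
# so for N > 1 it runs roughly N/2 times, i.e. Theta(N).
def inner_iterations(n):
    count, j = 0, 1
    while True:
        count += 1          # Sum = Sum + J; J = J + 2
        j += 2
        if not (j < n):     # While J < N
            break
    return count

M = 1_000_000               # fixed constant, so the For loop runs Theta(1) times
for n in [10, 100, 1000, 10000]:
    per_pass = inner_iterations(n)      # Theta(n) work per pass
    print(n, per_pass, M * per_pass)    # total: Theta(1) passes of Theta(n) work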
Now, let's look at this part:
If (MyFunction(N / 2) % 3 == 0)
    Return (2 * MyFunction(N / 2))
Else
    Return (4 * MyFunction(N / 2))
End If
The if statement makes one recursive call on an input of size N / 2 to evaluate its condition, and then, whichever branch is taken, it makes a second recursive call on an input of size N / 2 (since the result of the first call is not cached).
This gives the following recurrence relation for the runtime:
T(n) = 2T(n / 2) + Θ(n)
Using the Master Theorem, this solves to Θ(n log n).
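To see why, a minimal Python sketch (assuming n is a power of two and taking the non-recursive cost to be exactly n) can expand the recurrence numerically and compare it to n * log2(n):

# Expand T(n) = 2*T(n/2) + n with T(1) = 1 and compare against n*log2(n).
from math import log2

def T(n):
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

for n in [2**k for k in (4, 8, 12, 16)]:
    print(n, T(n), round(T(n) / (n * log2(n)), 3))   # ratio settles toward 1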
Hope this helps!

Related

Calculate the code complexity of below code

I feel that even in the worst case the condition is true only two times, when j = i or j = i^2, and the loop then runs for an extra i + i^2 times.
In the worst case, if we take the sum of the inner 2 loops, it will be theta(i^2) + i + i^2, which is equal to theta(i^2) itself;
summing theta(i^2) over the outer loop gives theta(n^3).
So, is the answer theta(n^3)?
I would say that the overall performance is theta(n^4). Here is your pseudo-code, given in text format:
for (i = 1 to n) do
    for (j = 1 to i^2) do
        if (j % i == 0) then
            for (k = 1 to j) do
                sum = sum + 1
Appreciate first that the j % i == 0 condition will only be true when j is a multiple of i. For a given i this happens only i times (j = i, 2i, ..., i^2), so the final inner for loop is only reached i times from the for loop in j; in the largest outer iteration (i = n) that is n times. The final for loop requires about n^2 steps when j is near the end of the range, but only roughly n steps near the start of the range. So the overall performance here is somewhere between O(n^3) and O(n^4), and theta(n^4) is in fact valid.
For fixed i, the i integers 1 ≤ j ≤ i^2 such that j % i = 0 are {i, 2i, ..., i^2}. It follows that the inner loop is executed i times, with arguments i * m for 1 ≤ m ≤ i, and the guard is executed i^2 times. Thus, the complexity function T(n) ∈ Θ(n^4) is given by:
T(n) = ∑[i=1,n] (∑[j=1,i^2] 1 + ∑[m=1,i] ∑[k=1,i*m] 1)
= ∑[i=1,n] ∑[j=1,i^2] 1 + ∑[i=1,n] ∑[m=1,i] ∑[k=1,i*m] 1
= n^3/3 + n^2/2 + n/6 + ∑[i=1,n] ∑[m=1,i] ∑[k=1,i*m] 1
= n^3/3 + n^2/2 + n/6 + n^4/8 + 5n^3/12 + 3n^2/8 + n/12
= n^4/8 + 3n^3/4 + 7n^2/8 + n/4
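As an illustrative cross-check (not part of the original answer), a small brute-force Python count of the guard evaluations plus innermost-loop iterations agrees with the closed form above; the helper names count_ops and closed_form are made up for this sketch.

# Brute-force count: one unit per j % i == 0 test, plus j units when the
# innermost loop runs (k = 1 to j). Compare with n^4/8 + 3n^3/4 + 7n^2/8 + n/4.
def count_ops(n):
    total = 0
    for i in range(1, n + 1):
        for j in range(1, i * i + 1):
            total += 1                 # guard evaluation
            if j % i == 0:
                total += j             # innermost loop body runs j times
    return total

def closed_form(n):
    return n**4 / 8 + 3 * n**3 / 4 + 7 * n**2 / 8 + n / 4

for n in [3, 5, 10, 20]:
    print(n, count_ops(n), closed_form(n))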

What is the time complexity for given snippet?

for i = 1 to n do
    for j = 1 to i do
        for k = 1 to j do
What is its time complexity in terms of 'n'?
The inner-most loop will obviously run j times. Assuming that it contains operations worth 1 time unit, this will be:
T_inner(j) = j
The middle loop will run i times, i.e.
T_middle(i) = Sum {j from 1 to i} T_inner(j)
= Sum {j from 1 to i} j
= i/2 * (1 + i)
Finally:
T_outer(n) = Sum {i from 1 to n} T_middle(i)
= Sum {i from 1 to n} (i/2 * (1 + i))
= 1/6 * n * (1 + n) * (2 + n)
= 1/6 n^3 + 1/2 n^2 + 1/3 n
And this is obviously O(n^3).
Note: This only counts the operations in the inner most block. It neglects the operations necessary to perform the loop. But if you include those, you will see that the time complexity is the same.
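A brief Python check (just an illustration, with the counting function made up for this sketch) confirms the closed form 1/6 * n * (1 + n) * (2 + n):

# Count innermost operations of the i/j/k loop nest and compare with
# the closed form n*(n+1)*(n+2)/6 derived above.
def count(n):
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            for k in range(1, j + 1):
                ops += 1
    return ops

for n in [5, 10, 50]:
    print(n, count(n), n * (n + 1) * (n + 2) // 6)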

Big O runtime for this algorithm?

Here's the pseudocode:
Baz(A) {
    big = −∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
}
So line 3 will be O(n) (n being the length of the array, A)
I'm not sure what line 4 would be...I know it decreases by 1 each time it is run, because i will increase.
and I can't get line 6 without getting line 4...
All help is appreciated, thanks in advance.
Let us first understand how the first two for loops work.
for i = 1 to length(A)
for j = 1 to length(A) - i + 1
The first for loop will run from 1 to n (the length of array A), and the second for loop depends on the value of i. So when i = 1 the second for loop will run n times; when i increments to 2 the second for loop will run (n - 1) times; and so on, down to 1.
So your second for loop will run as follows:
n + (n - 1) + (n - 2) + (n - 3) + .... + 1 times...
You can use the following formula: sum(1 to n) = N * (N + 1) / 2, which gives (N^2 + N)/2. So we have Big Oh for these two loops as
O(n^2) (Big Oh of n square)
Now let us consider the third loop as well.
Your third for loop looks like this:
for k = j to j + i - 1
But this actually means,
for k = 0 to i - 1 (you are just shifting the range of values by adding/subtracting j, but the number of times the loop runs does not change, since the difference between the endpoints stays the same)
So when i = 1, the third loop runs 1 time for each of the n iterations of the second loop; when i = 2, it runs 2 times for each of the (n - 1) iterations of the second loop, and so on.
So you get:
1 * n + 2 * (n - 1) + 3 * (n - 2) + ... + n * 1
= Sum {i from 1 to n} i * (n - i + 1)
= (n + 1) * Sum {i from 1 to n} i - Sum {i from 1 to n} i^2
= (n + 1) * n(n + 1)/2 - n(n + 1)(2n + 1)/6
= n(n + 1)(n + 2)/6
= O(N^3)
So your time complexity will be N^3 (Big Oh of n cube)
Hope this helps!
Methodically, you can follow the steps using Sigma Notation:
Baz(A):
    big = −∞
    for i = 1 to length(A)
        for j = 1 to length(A) - i + 1
            sum = 0
            for k = j to j + i - 1
                sum = sum + A(k)
            if sum > big
                big = sum
    return big
For Big-O, you need to look at the worst-case scenario.
The easiest way to find the Big-O is to look at the most expensive parts of the algorithm; these are usually loops or recursion.
So we have this part of the algorithm, consisting of loops:
for i = 1 to length(A)
    for j = 1 to length(A) - i + 1
        for k = j to j + i - 1
            sum = sum + A(k)
We have,
SUM { SUM { i } for j = 1 to n-i+1 } for i = 1 to n
= 1/6 n (n+1) (n+2)
= (1/6 n^2 + 1/6 n) (n + 2)
= 1/6 n^3 + 2/6 n^2 + 1/6 n^2 + 2/6 n
= 1/6 n^3 + 3/6 n^2 + 2/6 n
= 1/6 n^3 + 1/2 n^2 + 1/3 n
T(n) ~ O(n^3)
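As a quick illustrative cross-check of both answers (not from the original posts), a short Python sketch can count how many times sum = sum + A(k) executes in Baz and compare it with n(n+1)(n+2)/6; the helper name baz_ops is made up for this sketch.

# Count the executions of "sum = sum + A(k)" in Baz and compare with
# the closed form n*(n+1)*(n+2)/6 obtained above.
def baz_ops(n):
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, n - i + 2):     # j = 1 to n - i + 1
            for k in range(j, j + i):     # i iterations
                ops += 1
    return ops

for n in [5, 10, 50]:
    print(n, baz_ops(n), n * (n + 1) * (n + 2) // 6)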

time complexity for the following

int i = 1, s = 1;
while (s <= n)
{
    i++;
    s = s + i;
}
The time complexity for this is O(root(n)).
I do not understand how,
since the series goes like 1 + 2 + ... + k.
Please help.
s(k) = 1 + 2 + 3 + ... + k = k * (k + 1) / 2
The loop stops once s(k) exceeds n, i.e. after about k steps where n = k * (k + 1) / 2; solving gives k = -1/2 + sqrt(1 + 8n)/2.
You ignore constants and coefficients, and O(-1/2 + sqrt(1 + 8n)/2) = O(sqrt(n)).
Let the loop execute x times. Now, the loop will execute as long as s is less than or equal to n.
We have:
After the 1st iteration:
s = 1 + 2
After the 2nd iteration:
s = 1 + 2 + 3
As this goes on for x iterations, we finally have (up to constant terms)
1 + 2 + ... + x <= n
=> (x * (x + 1)) / 2 <= n
=> x^2 = O(n)
=> x = O(root(n))
This computes the sum s(k)=1+2+...+k and stops when s(k) > n.
Since s(k)=k*(k+1)/2, the number of iterations required for s(k) to exceed n is O(sqrt(n)).
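A tiny Python sketch (illustrative only, mirroring the loop above with a made-up helper name) counts the iterations directly and compares them with sqrt(2n):

# Count the while-loop iterations directly and compare with sqrt(2n).
from math import sqrt

def iterations(n):
    i, s, count = 1, 1, 0
    while s <= n:
        i += 1
        s += i
        count += 1
    return count

for n in [100, 10_000, 1_000_000]:
    print(n, iterations(n), round(sqrt(2 * n)))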

What is the worst case time complexity for this algorithm?

procedure matrixvector(n : integer);
var i, j : integer;
begin
    for i <- 1 to n do begin
        B[i] = 0;
        C[i] = 0;
        for j <- 1 to i do
            B[i] <- B[i] + A[i, j];
        for j <- n down to i + 1 do
            C[i] <- C[i] + A[i, j]
    end
end;
O(n^2), if I read it right.
Why you need two inner loops is beyond me. Why not sum B and C in the same loop?
Let us trace the number of times each loop executes in each iteration.
procedure matrixvector(n : integer);
var i, j : integer;
begin
    for i <- 1 to n do begin // OuterLoop
        B[i] = 0;
        C[i] = 0;
        for j <- 1 to i do // InnerLoop_1
            B[i] <- B[i] + A[i, j];
        for j <- n down to (i + 1) do // InnerLoop_2
            C[i] <- C[i] + A[i, j]
    end
end;
InnerLoop_1
In the first iteration of OuterLoop (i = 1), InnerLoop_1 executes once.
In the second iteration of OuterLoop (i = 2), InnerLoop_1 executes twice.
In the third iteration of OuterLoop (i = 3), InnerLoop_1 executes thrice.
.
.
.
In the last iteration of OuterLoop (i = n), InnerLoop_1 executes n times.
Therefore, the total number of times this code executes is
1 + 2 + 3 + … + n
= (n(n + 1) / 2) (Sum of Natural Numbers Formula)
= (((n^2) + n) / 2)
= O(n^2)
InnerLoop_2
In the first iteration of OuterLoop (i = 1), InnerLoop_2 executes n - 1 times.
In the second iteration of OuterLoop (i = 2), InnerLoop_2 executes n - 2 times.
In the third iteration of OuterLoop (i = 3), InnerLoop_2 executes n - 3 times.
.
.
.
In the (n - 2)th iteration of OuterLoop (i = n - 2), InnerLoop_2 executes 2 times.
In the (n - 1)th iteration of OuterLoop (i = n - 1), InnerLoop_2 executes 1 time.
In the last (nth) iteration of OuterLoop (i = n), InnerLoop_2 executes 0 times.
Therefore, the total number of times this code executes is
n - 1 + n - 2 + n - 3 + … + 2 + 1 + 0
= 0 + 1 + 2 + … + n - 3 + n - 2 + n - 1
= (n - 1)((n - 1) + 1) / 2 (Sum of Natural Numbers Formula)
= (n - 1)(n) / 2
= (((n^2) - n) / 2)
= O(n^2)
Time Complexity
Number of times InnerLoop_1 executes : (((n^2) + n) / 2)
Number of times InnerLoop_2 executes : (((n^2) - n) / 2)
Adding, we get
(((n^2) + n) / 2) + (((n^2) - n) / 2)
= ((((n^2) + n) + ((n^2) - n)) / 2)
= (((n^2) + n + (n^2) - n) / 2)
= (((n^2) + (n^2)) / 2)
= ((2(n^2)) / 2)
= (n^2)
= O(n^2)
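To make the two totals above concrete, here is a small Python sketch (an illustration, not from the original answer, with a made-up helper name) that counts the body executions of InnerLoop_1 and InnerLoop_2 and compares them with (n^2 + n)/2, (n^2 - n)/2, and n^2:

# Count body executions of InnerLoop_1 (j = 1 to i) and InnerLoop_2
# (j = n down to i + 1) and compare with the closed forms above.
def count_loops(n):
    inner1 = inner2 = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):        # InnerLoop_1: i iterations
            inner1 += 1
        for j in range(n, i, -1):        # InnerLoop_2: n - i iterations
            inner2 += 1
    return inner1, inner2

for n in [5, 10, 100]:
    a, b = count_loops(n)
    print(n, a, (n * n + n) // 2, b, (n * n - n) // 2, a + b, n * n)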
——————
Also, do take a look at these
https://stackoverflow.com/a/71537431/17112163
https://stackoverflow.com/a/71146522/17112163
https://stackoverflow.com/a/69821878/17112163
https://stackoverflow.com/a/72046825/17112163
https://stackoverflow.com/a/72046933/17112163
Just explaining in detail for beginners:
The outermost for loop will run n times (i = 1 to n).
Then there are two for loops within the outermost for loop.
The first inner for loop goes from 1 to i, so across all outer iterations it contributes 1 + 2 + 3 + 4 + ... + n steps.
The second inner for loop goes from n down to i + 1, so across all outer iterations it contributes (n - 1) + (n - 2) + ... + 1 + 0 steps.
The summation formula for 1 + 2 + 3 + 4 + 5 + ... + n is n(n + 1)/2,
so the total running time can be computed as roughly n + n(n + 1)/2 + n(n - 1)/2.
Observe the highest-order term in this expression: it is n^2.
We can simplify further by dropping the constants and ignoring the linear part, which gives us a running time of n^2.
The worst case is O(n²).
There are indeed three loops, but they are not all nested inside each other, which is why this gives O(n²) rather than O(n³).
Also, you can clearly see that the inner loops won't go from 1 to n (like the outer loop does). But since together the two inner loops still perform roughly n iterations per pass of the outer loop, we can ignore this and say that it is just O(n^2).
This shows that time complexity is a measure saying: your algorithm will scale with this order, and it won't ever take any longer (being faster is, however, always possible).
For more information about calculating the worst-case complexity of any algorithm, I can point you to a related question I asked earlier.
