How do I find the time complexity of these 3 nested loops?

The task is to analyze the following algorithm and calculate its time complexity.
I solved it by reasoning that there are 3 nested loops, so the complexity is O(n^3).
How do I solve this problem?
MSS(A[], N)   // N is the size of array A[]
{
    int temp = 0, MS = 0;
    for (int i = 0; i < N; i++)
    {
        for (int j = i; j < N; j++)
        {
            temp = 0;
            for (int k = i; k <= j; k++)
                temp = temp + A[k];
            if (temp > MS)
                MS = temp;
        }
    }
    return MS;
}

Well, you can proceed formally as such:
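One way to make that formal is to count how many times the innermost statement temp = temp + A[k] executes (a sketch; T(N) is just a name for that count):

\[
T(N) = \sum_{i=0}^{N-1} \sum_{j=i}^{N-1} (j - i + 1)
     = \sum_{i=0}^{N-1} \frac{(N-i)(N-i+1)}{2}
     = \frac{N(N+1)(N+2)}{6}
\]

This is a degree-3 polynomial in N, so the running time is Θ(N^3). The O(n^3) guess is correct here because, although the inner loops do not always run N times, their lengths are on average still proportional to N.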

Related

Time complexity single loop with two variables

What will be the time complexity of the code below, and why?
public static int[] Shuffle(int[] nums, int n)
{
    int len = nums.Length;
    int[] final = new int[2 * n];
    int counter = 0;
    for (int i = 0, j = n; i < n; i++, j++)
    {
        final[counter++] = nums[i];
        final[counter++] = nums[j];
    }
    return final;
}
If we had two nested loops, as below, then the time complexity would be O(n^2):
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
}
Here the complexity is O(n), because the loop runs from i = 0 until i = n-1. The number of loop variables doesn't matter when it comes to time complexity (there is space complexity to consider as well). However, be careful:
for (int i = 0, j = n; i < n; i++, j++)
is completely different from
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
}
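To see the difference concretely, here is a quick sketch in C (the counter names single and nested are just for illustration; the idea is the same in the C# above) that counts how many times each loop body runs:

#include <stdio.h>

int main(void)
{
    int n = 8;

    /* single loop with two index variables: the body runs n times */
    long single = 0;
    for (int i = 0, j = n; i < n; i++, j++)
        single++;

    /* two nested loops: the body runs n * n times */
    long nested = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            nested++;

    printf("single = %ld, nested = %ld\n", single, nested);  /* prints 8 and 64 */
    return 0;
}

For n = 8 this prints single = 8 and nested = 64: O(n) versus O(n^2).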
What is the Big-O notation of a loop in a loop?

I have this code and cannot understand the Big-O of this... Thanks
for (i = 0; i < n; i++) {
    for (j = i; j < n; j++) {
        if (arr[j] % 2 != 0) {
            if (minodd > arr[j]) {
            }
        }
    }
}
One of the best ways to approach this problem is to break it down into smaller parts.
First, let's look at your inner loop:
for (j = i; j < n; j++) {
    if (arr[j] % 2 != 0) {      // O(1)
        if (minodd > arr[j]) {  // O(1)
        }
    }
}
The if-statements are O(1), i.e. constant time, so we can ignore them, and we are left with just the inner for loop:
for (j = i; j < n; j++) {
    ... // O(1) + O(1)
}
Since, in the worst case, it loops n times and each iteration does O(1) work, the inner loop is O(n), which is called linear time.
Next, let's zoom out and replace the inner loop with our new info:
for (i = 0; i < n; i++) {
    for (j = i; j < n; j++) {
        if (arr[j] % 2 != 0) {
            if (minodd > arr[j]) {
            }
        }
    }
}
becomes:
for (i = 0; i < n; i++) {
    O(n)
}
Since we know the outer for loop will cycle n times in the worst case, and the inner for loop will cycle n times in the worst case, we get O(n × n) = O(n²), which is also known as quadratic (polynomial) time.
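A slightly tighter count, not needed for the Big-O but it explains the exact number printed by the C program further down: for a given i the inner loop runs n - i times, so the total number of inner iterations is

n + (n-1) + ... + 1 = n(n+1)/2,

which is still Θ(n²).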
Doesn't this just go on forever? You had i < n as the condition of your inner loop, so that version never terminated.
Now that the loop has been updated, I think #e2-e4 is right:
#include <stdio.h>

// eqn(n) = n + (n-1) + ... + 1 = n(n+1)/2
int eqn(int n)
{
    return n > 0 ? n + eqn(n - 1) : 0;
}

int main(int argc, char **argv)
{
    int i, j, n, v;

    v = 0;
    n = 5;
    for (i = 0; i < n; i++) {
        for (j = i; j < n; j++) {
            v++;                // count how many times the inner body runs
        }
    }
    // v = 15 ? 15
    printf("v = %d ? %d\n", v, eqn(n));
    return 0;
}
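For n = 5 both numbers come out as 15 = 5·6/2, matching the n(n+1)/2 count worked out above and therefore the O(n²) bound.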

Algorithmic big o order of growth code

I'm doing an online course and I'm stuck on this question. I know there are similar questions, but they don't help me.
What is the order of growth of the worst case running time of the
following code fragment as a function of N?
int sum = 0;
for (int i = 0; i*i*i < N; i++)
    for (int j = 0; j*j*j < N; j++)
        for (int k = 0; k*k*k < N; k++)
            sum++;
I thought that the order would be n^3, but I don't think this is correct because the loops only go through a third of n each time. So would that make it n log n?
Also
int sum = 0;
for (int i = 1; i <= N; i++)
    for (int j = 1; j <= N; j++)
        for (int k = 1; k <= N; k = k*2)
            for (int h = 1; h <= k; h++)
                sum++;
I think this one would be n^4, because you have n * n * 0.5n * 0.5n.
The loops in fact only go up to the cube root of N: the condition is i*i*i < N, i.e. i < N^(1/3), and likewise for j and k.
Three nested loops of this length give O((N^(1/3))^3), which is O(N).
Of note, if you were right and each loop went up to one third of N, then cubing that would give O((N/3)^3) = O(N^3 / 27); 1/27 is a constant, so that would still be O(N^3).
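For the second fragment in the question (the one with k = k*2 and the innermost h loop), a sketch of the same kind of count: for fixed i and j, the variable k takes the values 1, 2, 4, ..., up to N, and for each such k the h loop runs k times, so the two innermost loops together do about 1 + 2 + 4 + ... + N, which is less than 2N, i.e. Θ(N) work. The i and j loops each run N times, so that fragment is Θ(N · N · N) = Θ(N^3), not N^4.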
If you examine the value of sum for various values of N, then it becomes pretty clear what the time complexity of the algorithm is:
#include <iostream>

int main()
{
    for (int N = 1; N <= 100; ++N) {
        int sum = 0;
        for (int i = 0; i*i*i < N; i++)
            for (int j = 0; j*j*j < N; j++)
                for (int k = 0; k*k*k < N; k++)
                    sum++;
        std::cout << "For N=" << N << ", sum=" << sum << '\n';
    }
    return 0;
}
You can then draw your own conclusions with greater insight.
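If you do run it, you will see that sum stays close to N (it is the cube of the number of integers i with i*i*i < N), which is consistent with the O(N) answer above.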

Order of growth as a function of N

I'm practicing with algorithm complexities. I thought all the code fragments below were quadratic in order of growth, but since I need the order of growth as a function of N, I think that changes things, and I don't know exactly how to work it out.
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;

int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;

int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < N; j++)
        sum++;
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;

This is O(N): the inner loop runs a total of N + N/2 + N/4 + ... + 1 times, and this geometric sum is bounded by 2N, so the whole fragment is O(N).
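If you want an empirical check, in the spirit of the earlier answers, here is a quick C sketch that prints sum for a few values of N; the ratio sum/N stays below 2:

#include <stdio.h>

int main(void)
{
    for (int N = 1; N <= 1000000; N *= 10) {
        long sum = 0;
        for (int n = N; n > 0; n /= 2)      /* the fragment from case 1 */
            for (int i = 0; i < n; i++)
                sum++;
        printf("N = %7d, sum = %7ld, sum/N = %.3f\n", N, sum, (double)sum / N);
    }
    return 0;
}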
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;

This is very similar to case 1, and I am going to leave it to you as practice. Follow the same approach I used there, and you will get the answer.
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < N; j++)
        sum++;

Here, the main difference is that the inner loop does not depend on the variable of the outer loop. This means that, regardless of the value of i, the inner loop repeats N times.
So you need to work out how many times the outer loop repeats, and multiply that by N.
I leave this one to you as practice as well, following the same guidelines.

What will be the complexity of a for loop if nothing is happening in the body of the loop?

Code:
int c = 0;
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        c = i * j;
    }
}

Time complexity: O(n^2)
Now, what will be the complexity of the following code?
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        // c = i * j;
        // nothing is happening inside the loop
    }
}

Will the complexity be the same as above (O(n^2)), or something else?
Theoretically, yes, because i and j still have to be incremented and compared against the end value in each iteration.
However, compilers might optimize it to run in constant time and simply set the final values of i and j.
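As a rough illustration (assuming a typical optimizing C compiler; the exact behavior depends on the compiler and its flags), the loop body needs some observable effect, such as a volatile counter, before you can reliably observe the O(n^2) work at run time:

#include <stdio.h>

int main(void)
{
    int n = 10000;
    volatile long c = 0;   /* volatile: the compiler must actually perform every increment */

    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            c++;           /* observable side effect, so the n * n iterations cannot be dropped */

    printf("c = %ld\n", (long)c);   /* 100000000 = n * n */
    return 0;
}

Without the volatile (or any other use of c), an optimizer is allowed to remove the empty loops entirely, which is the constant-time behavior described above.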
For both, the complexity is O(N^2).
