Time complexity single loop with two variables - data-structures

What will be the time complexity of the code below, and why?
public static int[] Shuffle(int[] nums, int n)
{
    int len = nums.Length;
    int[] final = new int[2 * n];
    int counter = 0;
    for (int i = 0, j = n; i < n; i++, j++)
    {
        final[counter++] = nums[i];
        final[counter++] = nums[j];
    }
    return final;
}
If we had two nested loops, as below, then it would be considered O(n^2):
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
}

The complexity is O(n), because the loop counter runs from i = 0 until i = n-1. The number of loop variables doesn't matter when it comes to time complexity (that is a space-complexity concern). However, be careful:
for (int i = 0, j = n; i < n; i++, j++)
is completely different from
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
}
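
To see the difference concretely, here is a small sketch (written in Java just for illustration; the class and variable names are made up) that counts how many times each loop body actually runs:
public class LoopCount {
    public static void main(String[] args) {
        int n = 1000;

        // Single loop with two index variables: the body still runs only n times.
        long single = 0;
        for (int i = 0, j = n; i < n; i++, j++) {
            single++;
        }

        // Two nested loops: the inner body runs n * n times.
        long nested = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                nested++;
            }
        }

        System.out.println("single = " + single);  // prints 1000     -> O(n)
        System.out.println("nested = " + nested);  // prints 1000000  -> O(n^2)
    }
}
For n = 1000 the first counter ends at 1,000 and the second at 1,000,000, which is exactly the O(n) versus O(n^2) difference.
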
Related

array accesses in Radix Sort

I am studying radix sort and I can't understand why the number of array accesses in radix sort is 11N + 4R + 1.
The following is radix sort code written in Java.
int n = a.length;
String[] aux = new String[n];   // N
int[] count = new int[R + 1];   // R+1
for (int i = 0; i < n; i++) {
    count[a[i] + 1]++;          // why 3N?
}
for (int r = 0; r < R; r++) {
    count[r + 1] += count[r];   // 3R
}
for (int i = 0; i < n; i++) {
    aux[count[a[i]]++] = a[i];  // 3N(?) + N + N = 5N
}
for (int i = 0; i < n; i++) {
    a[i] = aux[i];              // 2N
}
count[a[i]+1]++; is equivalent to count[a[i]+1] = count[a[i]+1] + 1;. I thought each count[a[i]+1] involves 2N array accesses, so the total should be 4N. If you look at the third for loop, a[i] also appears on both sides of aux[count[a[i]]++] = a[i], yet the one on the right-hand side is counted as just one array access. Why does count[a[i]+1]++ count as only 3N?
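
One way to see where the 3N comes from (a sketch, assuming the usual convention that every array read and every array write counts as one access): in Java the index expression of ++ is evaluated only once, so a[i] is read once per statement even though count[a[i]+1]++ behaves like count[a[i]+1] = count[a[i]+1] + 1. Broken into individual accesses (the class name and toy input below are made up):
public class AccessCount {
    public static void main(String[] args) {
        int R = 4;
        int[] a = {3, 1, 3, 0, 2};       // toy input with keys in 0..R-1
        int[] count = new int[R + 1];
        long accesses = 0;               // tally of array reads and writes
        for (int i = 0; i < a.length; i++) {
            // count[a[i] + 1]++ broken into its individual array accesses:
            int key = a[i];              // 1 access: read a[i]
            int old = count[key + 1];    // 1 access: read count[key + 1]
            count[key + 1] = old + 1;    // 1 access: write count[key + 1]
            accesses += 3;
        }
        System.out.println("accesses = " + accesses + "  (3N with N = " + a.length + ")");
    }
}
So each iteration does one read of a[i], one read of count, and one write of count, i.e. 3 accesses, giving 3N for the loop. In the third loop a[i] literally appears twice in the statement, which is why that statement contributes two reads of a[i] and ends up at 5 accesses per iteration.
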

Algorithmic big o order of growth code

I'm doing an online course and I'm stuck on this question. I know there are similar questions, but they don't help me.
What is the order of growth of the worst case running time of the
following code fragment as a function of N?
int sum = 0;
for (int i = 0; i*i*i < N; i++)
    for (int j = 0; j*j*j < N; j++)
        for (int k = 0; k*k*k < N; k++)
            sum++;
I thought that the order would be n^3 but I don't think this is correct because the loops only go through a third of n each time. So would that make it nlogn?
Also
int sum = 0;
for (int i = 1; i <= N; i++)
    for (int j = 1; j <= N; j++)
        for (int k = 1; k <= N; k = k*2)
            for (int h = 1; h <= k; h++)
                sum++;
I think this one would be n^4 because you have n * n * 0.5n * 0.5n
The loops in fact only run up to the cube root of N (the condition is i*i*i < N, i.e. i < N^(1/3)).
Three nested loops of that length give O((N^(1/3))^3), which is O(N).
Of note, even if you were right and each loop went up to one third of N, cubing that would give O((N/3)^3) = O(N^3/27); since 1/27 is a constant, that would still be O(N^3), not N log N.
If you examine the value of sum for various values of N, then it becomes pretty clear what the time complexity of the algorithm is:
#include <iostream>

int main()
{
    for (int N = 1; N <= 100; ++N) {
        int sum = 0;
        for (int i = 0; i*i*i < N; i++)
            for (int j = 0; j*j*j < N; j++)
                for (int k = 0; k*k*k < N; k++)
                    sum++;
        std::cout << "For N=" << N << ", sum=" << sum << '\n';
    }
    return 0;
}
You can then draw your own conclusions with greater insight.

Order of growth as a function of N

I'm practicing with algorithm complexities. I thought all of the code fragments below were quadratic in their order of growth, but since I need the order of growth as a function of N, I think that changes things, and I don't know exactly how to work it out.
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;

int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;

int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < N; j++)
        sum++;
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;
This is O(N): the inner loop runs a total of N + N/2 + N/4 + ... + 1 times, and this sum is bounded by 2N (it approaches 2N as N -> infinity), so the fragment is O(N).
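
If you want to convince yourself empirically, in the same spirit as the C++ snippet in the previous question, a quick check (a Java sketch; the class name is made up) could look like this:
public class Case1Check {
    public static void main(String[] args) {
        // Count the inner-loop iterations for growing N; sum/N should stay below 2,
        // matching the bound N + N/2 + N/4 + ... + 1 < 2N.
        for (int N = 1; N <= (1 << 20); N *= 4) {
            int sum = 0;
            for (int n = N; n > 0; n /= 2)
                for (int i = 0; i < n; i++)
                    sum++;
            System.out.println("N=" + N + "  sum=" + sum + "  sum/N=" + (double) sum / N);
        }
    }
}
For N a power of two the count is exactly 2N - 1, so sum/N stays just below 2, consistent with O(N).
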
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;
This is very similar to case 1, and I am going to leave it to you as practice. Follow the same approach I used there, and you will get the answer.
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < N; j++)
        sum++;
Here, the main difference is that the inner loop does not depend on the variable of the outer loop. This means that, regardless of the value of i, the inner loop is going to repeat N times.
So you need to work out how many times the outer loop will repeat, and multiply that by N.
I leave this one for you as practice as well; just follow the guidelines above.

How do I find the time complexity of these 3 nested loops?

The task is to analyze the following algorithm and calculate its time complexity.
I solved it by saying there are 3 nested loops, so it is O(n^3).
How do I solve this problem?
MSS(A[], N)   // where N is the size of array A[]
{
    int temp = 0, MS = 0;
    for (int i = 0; i < N; i++)
    {
        for (int j = i; j < N; j++)
        {
            temp = 0;
            for (int k = i; k <= j; k++)
                temp = temp + A[k];
            if (temp > MS)
                MS = temp;
        }
    }
    return (MS);
}
Well, you can proceed formally as such:
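One way to carry out that count, as a sketch: the innermost statement temp = temp + A[k] executes j - i + 1 times for each pair (i, j) with i <= j, and summing j - i + 1 over all such pairs gives N(N+1)(N+2)/6 executions in total, which is Theta(N^3). A small Java check (the class name is made up) that compares the raw count against that closed form:
public class MssCount {
    public static void main(String[] args) {
        for (int N : new int[] {10, 100, 1000}) {
            long innermost = 0;  // executions of "temp = temp + A[k]"
            for (int i = 0; i < N; i++)
                for (int j = i; j < N; j++)
                    for (int k = i; k <= j; k++)
                        innermost++;
            long closedForm = (long) N * (N + 1) * (N + 2) / 6;
            System.out.println("N=" + N + "  count=" + innermost + "  N(N+1)(N+2)/6=" + closedForm);
        }
    }
}
Since the exact count is a cubic polynomial in N, the time complexity is O(N^3), so the intuition "3 nested loops, therefore O(n^3)" turns out to give the right answer here.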

What will be the complexity of a for loop if nothing is happening in the body of the loop?

Code:
int c = 0;
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        c = i * j;
    }
}
Time complexity: O(n^2)
Now, what will be the complexity of the following code?
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        // c = i * j;
        // nothing is happening inside the loop
    }
}
Will the complexity be the same as above (O(n^2)), or something else?
Theoretically, yes, because i and j still have to be incremented and compared against the end value in each iteration.
However, compilers might optimize this away to constant time and simply set the final values of i and j.
For both, the complexity is O(N^2).
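
To make the point that the loop control itself still costs something, here is a rough sketch (Java; the class name and the way the work is tallied are made up, and it assumes the compiler does not optimize the loops away) that counts the comparisons and increments the two loops perform even though the body is empty:
public class EmptyLoopCost {
    public static void main(String[] args) {
        int n = 1000;
        long controlOps = 0;          // rough tally of loop-control work
        for (int i = 0; i < n; i++) {
            controlOps += 2;          // roughly one "i < n" test and one "i++" per outer iteration
            for (int j = 0; j < n; j++) {
                controlOps += 2;      // roughly one "j < n" test and one "j++" per inner iteration
                // body intentionally left empty
            }
        }
        System.out.println("n = " + n + ", approx. loop-control operations: " + controlOps);
        // Grows like 2n^2 + 2n, i.e. Theta(n^2), unless the compiler removes the loops entirely.
    }
}
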
