I am studying radix sort and I can't understand why the number of array accesses in radix sort is 11N + 4R + 1.
The following is radix sort code written in Java.
int n = a.length;
String[] aux = new String[n];      // N
int[] count = new int[R + 1];      // R+1

for (int i = 0; i < n; i++) {
    count[a[i] + 1]++;             // why 3N?
}
for (int r = 0; r < R; r++) {
    count[r + 1] += count[r];      // 3R
}
for (int i = 0; i < n; i++) {
    aux[count[a[i]]++] = a[i];     // 3N(?) + N + N = 5N
}
for (int i = 0; i < n; i++) {
    a[i] = aux[i];                 // 2N
}
count[a[i]+1]++; is equivalent to count[a[i]+1] = count[a[i]+1] + 1;. I think each count[a[i]+1] involves 2 array accesses per iteration, so the total should be 4N. If you look at the third for loop, a[i] also appears on both sides of aux[count[a[i]]++] = a[i], yet the one on the right is counted as only one array access. Why does count[a[i]+1]++ count as 3N?
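One way to see where the 3N might come from is to spell the statement out, charging one access per array read or write (a minimal sketch under that assumption; the helper name and the int[] key array are made up for illustration):

// Hypothetical helper mirroring the first loop of the question, with the
// per-iteration access count made explicit. Here a is an int[] key array
// purely for illustration; the accounting idea is the same for String keys.
static long countingPassAccesses(int[] a, int R) {
    int[] count = new int[R + 1];
    long accesses = 0;
    for (int i = 0; i < a.length; i++) {
        int key = a[i] + 1;      // 1 access: read a[i]
        int c = count[key];      // 1 access: read count[key]
        count[key] = c + 1;      // 1 access: write count[key]
        accesses += 3;           // 3 accesses per iteration => 3N in total
    }
    return accesses;
}

If this reading of the convention is right, the difference is that in count[a[i]+1]++ Java evaluates the index expression a[i]+1 only once, so a[i] is read once and the count slot is read once and written once: 3 accesses. In aux[count[a[i]]++] = a[i], the expression a[i] literally appears twice and is evaluated twice, which is how that line reaches 5 accesses rather than 4.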
What will be the time complexity of the code below, and why?
public static int[] Shuffle(int[] nums, int n)
{
    int len = nums.Length;
    int[] final = new int[2 * n];
    int counter = 0;
    for (int i = 0, j = n; i < n; i++, j++)
    {
        final[counter++] = nums[i];
        final[counter++] = nums[j];
    }
    return final;
}
If we have two nested loops, as below, then the time complexity is considered to be O(n^2):
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
}
The complexity is O(n) because the loop runs from i = 0 up to i = n-1. The number of loop variables doesn't matter for time complexity (space complexity is a separate question). However, take care:
for (int i = 0, j = n; i < n; i++, j++)
is completely different from
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
}
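If it helps, here is a minimal Java sketch (a hypothetical example, not the original C# code) that counts the iterations of each form directly:

// Hypothetical example: count iterations of a single loop with two indices
// versus two nested loops.
public class LoopCount {
    public static void main(String[] args) {
        int n = 1000;

        int single = 0;
        for (int i = 0, j = n; i < n; i++, j++) {
            single++;                       // runs n times => O(n)
        }

        int nested = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                nested++;                   // runs n * n times => O(n^2)
            }
        }

        System.out.println(single);         // 1000
        System.out.println(nested);         // 1000000
    }
}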
I would like to know what kind of sorting algorithm the one below is. I understand that it is an integer sorting algorithm, but other than that I haven't figured it out:
void mySorter(int arr[]) {
    int a = arr.length;
    for (int i = 0; i < a - 1; i++) {
        int min = i;
        for (int j = i + 1; j < a; j++) {
            if (arr[j] < arr[min])
                min = j;
            int temp = arr[min];
            arr[min] = arr[i];
            arr[i] = temp;
        }
    }
}
Could it be a selection sort?
It is Bubble Sort. Your code sorts the list in ascending order.
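For reference, a textbook selection sort defers the swap until the inner scan has finished, so the swap sits in the outer loop rather than the inner one; a minimal sketch for comparison (not your code):

// Textbook selection sort for comparison: one swap per outer iteration,
// performed only after the inner scan has found the minimum.
static void selectionSort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[min]) {
                min = j;
            }
        }
        int temp = arr[min];   // swap happens here, outside the inner loop
        arr[min] = arr[i];
        arr[i] = temp;
    }
}

The code in the question instead swaps on every inner iteration, which is what makes it look more like a bubble/exchange-style sort than a selection sort.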
I have the following code. I need to calculate this algorithm's complexity, but I have no idea where to start. This algorithm has 3 nested loops, so I guess its complexity is n^3, or am I wrong?
public static void RadixSort(DataArray data)
{
    IList<IList<int>> digits = new List<IList<int>>();
    for (int i = 0; i < 10; i++)
    {
        digits.Add(new List<int>());
    }
    for (int i = 0; i < data.Length; i++)
    {
        for (int j = 0; j < data.Length; j++)
        {
            int digit = (int)((data[j] % Math.Pow(10, i + 1)) / Math.Pow(10, i));
            digits[digit].Add((int)data[j]);
        }
        int index = 0;
        for (int k = 0; k < digits.Count; k++)
        {
            IList<int> selDigit = digits[k];
            for (int l = 0; l < selDigit.Count; l++)
            {
                data.Swap(index++, selDigit[l]);
                //data[index++] = selDigit[l];
            }
        }
        for (int k = 0; k < digits.Count; k++)
        {
            digits[k].Clear();
        }
    }
}
Calculating complexity is more involved than just looking at the number of nested loops. If you have a triple nested loop like this:
for(int i=0; i<n; i++)
    for(int j=0; j<n; j++)
        for(int k=0; k<n; k++)
it will be O(n³), assuming n is not changing in the loop. However, if you consider your case:
for(int i=0; i<n; i++)
    for(int j=0; j<m; j++)
        for(int k=0; k<m; k++)
the time complexity will instead be O(m²n).
Even the simplest sorting algorithms, like bubble sort, selection sort, and insertion sort, are O(n²), so if your implementation is worse than that, you're doing something wrong. The time complexity of radix sort is O(wn), where w is the number of digits (the word size) of the keys.
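To see where the O(wn) comes from, here is a minimal LSD radix sort sketch in Java for non-negative ints (an illustration under my own assumptions, not the code from the question): it makes w passes, one per decimal digit of the largest key, and each pass touches all n elements a constant number of times.

// Minimal LSD radix sort sketch for non-negative ints, base 10.
// w passes (one per digit), each pass doing O(n + 10) work => O(w * n).
static void lsdRadixSort(int[] a) {
    int n = a.length;
    if (n == 0) return;
    int max = a[0];
    for (int i = 1; i < n; i++)
        if (a[i] > max) max = a[i];

    int[] aux = new int[n];
    for (long exp = 1; max / exp > 0; exp *= 10) {        // w passes
        int[] count = new int[10 + 1];
        for (int i = 0; i < n; i++)                       // count digit frequencies
            count[(int) (a[i] / exp % 10) + 1]++;
        for (int d = 0; d < 10; d++)                      // frequencies -> start indices
            count[d + 1] += count[d];
        for (int i = 0; i < n; i++)                       // stable distribution pass
            aux[count[(int) (a[i] / exp % 10)]++] = a[i];
        System.arraycopy(aux, 0, a, 0, n);                // copy back for the next pass
    }
}

Each pass is a stable key-indexed counting distribution, which is why the ordering produced by earlier (less significant) digits is preserved.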
When uncertain about complexity, a reasonable approach is to add counters to the inner-loop code and at the end of the routine print out the counts. Next, vary the size of the input to see how the results change. The empirical results can immediately confirm or deny your analytic or intuited results.
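As a concrete example of that approach, here is a small Java sketch (hypothetical instrumentation, not the routine from the question): count the innermost operations for a few input sizes and watch how the count grows when n doubles.

// Hypothetical instrumentation sketch: count innermost operations for
// doubling input sizes.
public class CountOps {
    public static void main(String[] args) {
        for (int n = 1000; n <= 8000; n *= 2) {
            long ops = 0;
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < n; j++) {
                    ops++;                          // the "work" being counted
                }
            }
            // If ops roughly quadruples each time n doubles, growth is ~ n^2.
            System.out.println("n=" + n + "  ops=" + ops);
        }
    }
}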
I'm doing an online course and I'm stuck on this question. I know there are similar questions, but they don't help me.
What is the order of growth of the worst case running time of the
following code fragment as a function of N?
int sum = 0;
for (int i = 0; i*i*i < N; i++)
    for (int j = 0; j*j*j < N; j++)
        for (int k = 0; k*k*k < N; k++)
            sum++;
I thought the order would be N^3, but I don't think this is correct because each loop only goes through a third of N. So would that make it N log N?
Also
int sum = 0;
for (int i = 1; i <= N; i++)
    for (int j = 1; j <= N; j++)
        for (int k = 1; k <= N; k = k*2)
            for (int h = 1; h <= k; h++)
                sum++;
I think this one would be N^4 because you have N * N * 0.5N * 0.5N.
The loops in fact only go up to the cube root of N (i*i*i < N, etc.).
Three nested loops of this length give O((cube root of N)^3), which is O(N).
Of note, if you were correct and each loop went up to one third of N, multiplying them would still give O((N/3)^3) = O(N^3/27); 1/27 is just a constant factor, so that would be O(N^3).
If you examine the value of sum for various values of N, then it becomes pretty clear what the time complexity of the algorithm is:
#include <iostream>

int main()
{
    for( int N=1 ; N<=100 ; ++N ) {
        int sum = 0;
        for (int i = 0; i*i*i < N; i++)
            for (int j = 0; j*j*j < N; j++)
                for (int k = 0; k*k*k < N; k++)
                    sum++;
        std::cout << "For N=" << N << ", sum=" << sum << '\n';
    }
    return 0;
}
You can then draw your own conclusions with greater insight.
The task is to analyze the following algorithm and calculate its time complexity.
I attempted it by noting that there are 3 nested loops, so I concluded O(n^3).
How do I solve this problem?
MSS(A[], N)   // where N is the size of array A[]
{
    int temp = 0, MS = 0;
    for (int i = 0; i < N; i++)
    {
        for (int j = i; j < N; j++)
        {
            temp = 0;
            for (int k = i; k <= j; k++)
                temp = temp + A[k];
            if (temp > MS)
                MS = temp;
        }
    }
    return (MS);
}
Well, you can proceed formally as follows: