I wrote code that finds the median value of an unsorted array. What is this code's big O? Can you explain? Can we optimize the runtime complexity?
public static int medianElement(int[] array, int low, int high) {
    int[] tmpArray = new int[high - low + 1];
    // copy the inclusive range [low, high]
    for (int i = 0; i <= high - low; i++) {
        tmpArray[i] = array[low + i];
    }
    boolean changed = true;
    while (changed) {
        changed = false;
        for (int i = 0; i < high - low; i++) {
            if (tmpArray[i] > tmpArray[i + 1]) {
                changed = true;
                swap(tmpArray, i, i + 1);
            }
        }
    }
    return tmpArray[(high - low + 1) / 2];
}

public static void swap(int[] arr, int i, int j) {
    int temp = arr[i];
    arr[i] = arr[j];
    arr[j] = temp;
}
Your sorting algorithm is bubble sort (https://en.wikipedia.org/wiki/Bubble_sort). Its runtime is O(n^2) in the worst case and O(n) in the best case (an already-sorted array).
To improve the running time, you could use a sorting algorithm with better worst-case performance, such as merge sort (https://en.wikipedia.org/wiki/Merge_sort), which runs in O(n log n) in the worst case.
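As a sketch of that change, the median routine can sort a copy of the range with the library sort instead of bubble sorting it, giving O(n log n) overall. The class name `MedianDemo` is just for illustration; the method assumes the same inclusive [low, high] range as the question's code:

```java
import java.util.Arrays;

public class MedianDemo {
    // Same behaviour as the question's medianElement, but the O(n^2)
    // bubble sort is replaced by the library sort, which is O(n log n).
    public static int medianElement(int[] array, int low, int high) {
        int[] tmp = Arrays.copyOfRange(array, low, high + 1); // inclusive range
        Arrays.sort(tmp);
        return tmp[tmp.length / 2];
    }
}
```

Note that `Arrays.sort` on an `int[]` is a dual-pivot quicksort rather than a merge sort, but its typical running time is still O(n log n).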
Hello everyone, I have a question. My task is as follows:
Let A[] be an array of natural numbers of length N that is partially sorted, i.e. there exists an index i (0 < i < N-1) such that the subarray A[0],...,A[i] is sorted in increasing order and the subarray A[i+1],...,A[N-1] is also sorted in increasing order. Design an algorithm that sorts the whole array A[] and works in place (so has space complexity O(1)); the result must be stored in the same array A[]. Describe the algorithm, argue its correctness, and estimate its time complexity.
For this problem, which approach is better: bubble sort or insertion sort? Or is there a more efficient solution? I preferred bubble sort for this task, but I am open to other opinions.
static void bubbleSort(int arr[], int n)
{
    int i, j, temp;
    boolean swapped;
    for (i = 0; i < n - 1; i++)
    {
        swapped = false;
        for (j = 0; j < n - i - 1; j++)
        {
            if (arr[j] > arr[j + 1])
            {
                // swap arr[j] and arr[j+1]
                temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
                swapped = true;
            }
        }
        // stop early if no swaps happened in this pass
        if (swapped == false)
            break;
    }
}

static void printArray(int arr[], int size)
{
    for (int i = 0; i < size; i++)
        System.out.print(arr[i] + " ");
    System.out.println();
}

public static void main(String args[])
{
    int arr[] = { 1, 8, 45, 12, 22, 11, 90 };
    int n = arr.length;
    bubbleSort(arr, n);
    System.out.println("Sorted array: ");
    printArray(arr, n);
}
Bubble sort has O(n^2) complexity. Even the early exit `if (swapped == false) break;` will not reduce the worst-case complexity (try {2,3,4,5,1} and you will see).
Since there exists an index i (0 < i < N-1) such that the subarray A[0],...,A[i] is sorted in increasing order and the subarray A[i+1],...,A[N-1] is also sorted in increasing order, this problem can be solved in O(n) runtime. If we find the index i where A[0:i] and A[i+1:n] are sorted, we can treat the problem as merging two sorted arrays into one, which can be done in O(n) time. The algorithm is given below:
void sortPartialSortedArray(int arr[], int n)
{
    int pos = 0;
    // find the position for which arr[0:pos] and arr[pos+1:n] is sorted
    for (int i = 0; i + 1 < n; i++) {
        if (arr[i] > arr[i + 1]) {
            pos = i;
        }
    }
    int i = pos, j = n - 1;
    // merge from the last position
    while (i >= 0 && j >= 0) {
        if (arr[i] > arr[j]) {
            // swap arr[i] and arr[j]
            int tmp = arr[i];
            arr[i] = arr[j];
            arr[j] = tmp;
        }
        j--;
        if (i == j) {
            i--;
        }
    }
}
I came across the kth largest number problem on LeetCode:
Input: [3,2,1,5,6,4] and k = 2; Output: 5
Suggested solution:
public int findKthLargest(int[] nums, int k) {
    shuffle(nums);
    k = nums.length - k;
    int lo = 0;
    int hi = nums.length - 1;
    while (lo < hi) {
        final int j = partition(nums, lo, hi);
        if (j < k) {
            lo = j + 1;
        } else if (j > k) {
            hi = j - 1;
        } else {
            break;
        }
    }
    return nums[k];
}

// randomize the input so a bad pivot sequence is unlikely
private void shuffle(int[] nums) {
    java.util.Random rand = new java.util.Random();
    for (int i = nums.length - 1; i > 0; i--) {
        exch(nums, i, rand.nextInt(i + 1));
    }
}

private int partition(int[] a, int lo, int hi) {
    int i = lo;
    int j = hi + 1;
    while (true) {
        while (i < hi && less(a[++i], a[lo]));
        while (j > lo && less(a[lo], a[--j]));
        if (i >= j) {
            break;
        }
        exch(a, i, j);
    }
    exch(a, lo, j);
    return j;
}

private void exch(int[] a, int i, int j) {
    final int tmp = a[i];
    a[i] = a[j];
    a[j] = tmp;
}

private boolean less(int v, int w) {
    return v < w;
}
Doesn't partition take O(n), and the while loop in the main function O(log n), so shouldn't the total be O(n log n)? This looks like it uses quicksort, and quicksort's runtime is O(n log n). It would make sense if this took O(n), but it doesn't seem to. Please help me understand what is going on.
This is a randomized algorithm with average/expected O(n) runtime. After randomly shuffling the input, the pivots are typically good enough that each call to partition, if it doesn't yet find the target, roughly halves the portion of the list still to be searched. So even in the unlucky case where partition must be called repeatedly, the list keeps shrinking by about half each time, and the total expected work is O(n) + O(n/2) + O(n/4) + ... + O(1), which is still O(n).
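The geometric sum in that argument can be checked directly. This small helper (the class and method names are illustrative, not part of the solution above) sums the halving series n + n/2 + n/4 + ...:

```java
public class QuickselectCost {
    // Sums the halving series n + n/2 + n/4 + ... + 1 to show it stays
    // below 2n, which is why quickselect's expected runtime is O(n).
    public static long geometricWork(long n) {
        long total = 0;
        for (long size = n; size >= 1; size /= 2) {
            total += size;
        }
        return total;
    }
}
```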
http://en.wikipedia.org/wiki/H-index
This wiki page defines the h-index.
Basically, if I had the array [0 3 4 7 8 9 10], my h-index would be 4, since I have 4 numbers bigger than 4. It would have been 5 if I had 5 numbers bigger than 5, and so on. Given an array of integers greater than or equal to 0, what are efficient ways of calculating the h-index?
Edit: the array is not necessarily sorted.
Here is my O(N) implementation using a counting table; it is simple and blazing fast:
private static int GetHIndex(int[] m)
{
    // bucket each value, clamping values above m.Length
    int[] s = new int[m.Length + 1];
    for (int i = 0; i < m.Length; i++)
        s[Math.Min(m.Length, m[i])]++;

    // scan from the top until at least i values are >= i
    int sum = 0;
    for (int i = s.Length - 1; i >= 0; i--)
    {
        sum += s[i];
        if (sum >= i)
            return i;
    }
    return 0;
}
This could be done in O(n) time.
Find the median of the array.
If median > (n-1)/2, the answer comes before the median; search that side iteratively.
If median < (n-1)/2, the answer comes after the median; search that side iteratively.
If median == (n-1)/2, the median is the solution.
Here I am assuming that n is odd; adjust slightly for even n (use n/2 as the rank of the median). Also, finding the actual median in O(n) time is complicated, so use a good pivot instead (as in quicksort).
Complexity: n + n/2 + n/4 + ... = O(n)
Answer in C#, but easily convertible to Java as well:
public int HIndex(int[] citations) {
Array.Sort(citations);
var currentCount = 0;
var length = citations.Length;
for (var i = citations.Length - 1; i >= 0; i--)
{
currentCount = length - i;
// if the count of items to the right is larger than the current value, that's the max we can expect for the h-index
if (currentCount - 1 >= citations[i])
{
return currentCount - 1;
}
}
return currentCount;
}
This is one solution I could think of; I'm not sure it is the best.
Sort the array in ascending order: complexity O(n log n).
Iterate through the array from index 0 to n-1: complexity O(n).
For each iteration, with index i:
if (arr[i] == arr.length - (i + 1))
    return arr[i];
e.g.,
arr = [0 3 4 7 8 9 10]
arr[2] = 4
i = 2
arr.length = 7
4 == 7 - (2 + 1)
This runs in O(n log n) time, but it is short and concise.
public static int hindex(int[] array) {
    Arrays.sort(array);
    int pos = 0;
    // advance while fewer than (length - pos) papers have array[pos] citations
    while (pos < array.length && array[pos] < array.length - pos) {
        pos++;
    }
    // (length - pos) papers each have at least (length - pos) citations
    return array.length - pos;
}
Let n be the size of the array.
Sort the array; then h-index = max over i of min(f(i), i) for i = 1..n, where f(i) is the i-th largest value.
Since the h-index can never exceed n, replace every number in the array greater than n with n. Now use counting sort to sort the array.
Time complexity: O(n). Space complexity: O(n).
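That counting idea can be sketched in Java as follows (the class and method names are just for illustration):

```java
public class HIndexCounting {
    // Clamp citation counts to n, bucket them, then scan from the top
    // until at least i papers have >= i citations. O(n) time and space.
    public static int hIndex(int[] citations) {
        int n = citations.length;
        int[] buckets = new int[n + 1];
        for (int c : citations) {
            buckets[Math.min(c, n)]++;
        }
        int papers = 0;
        for (int i = n; i >= 0; i--) {
            papers += buckets[i];
            if (papers >= i) {
                return i;
            }
        }
        return 0;
    }
}
```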
I was not happy with my previous implementation, so I replaced it with a faster solution written in Java.
public int hIndex(int[] citations) {
if(citations == null || citations.length == 0)
{
return 0;
}
Arrays.sort(citations);
int hIndex = 0;
for(int i=0;i<citations.length;i++)
{
int hNew;
if(citations[i]<citations.length-i)
{
hNew = citations[i];
if(hNew>hIndex)
{
hIndex = hNew;
}
}
else if(citations[i]>=citations.length-i)
{
hNew = citations.length-i;
if(hNew>hIndex)
{
hIndex = hNew;
}
break;
}
}
return hIndex;
}
I have to perform a runtime analysis of quicksort, incrementing the size of the array by 100 each time. However, when I measure the runtime using System.nanoTime, the results aren't what I expect (my graph looks more like O(2^n)): the time shoots up once the array reaches around 800 elements. Could someone please tell me where I'm going wrong in my code?
Also, the count part is irrelevant at the moment, since I only want to run the quicksort once at each array size.
import java.util.Random;

public class Quickworking {

    public static void main(String[] args) {
        Random rand = new Random();
        long total = 0;
        int count2 = 1;
        int[] myArray = new int[1400];

        // generates random array
        for (int i = 0; i < myArray.length; i++) {
            myArray[i] = rand.nextInt(100) + 1;
            //System.out.print(myArray[i] + ", ");
        }

        // loop sort n amount of times
        for (int count = 0; count < count2; count++) {
            // calls the sort method on myArray giving the arguments
            long start = System.nanoTime();
            sort(myArray, 0, myArray.length - 1);
            long end = System.nanoTime();
            System.out.println(end - start);
            total += (end - start);
        }

        //long average = (long) total / (long) count2;
        //System.out.println(average);

        // prints the sorted array
        //for (int i = 0; i < myArray.length; i++) {
        //    System.out.print(myArray[i] + ", ");
        //}
    }

    public static int sort(int myArray[], int left, int right) {
        int i = left, j = right;
        int temp;
        int pivot = myArray[(left + right) / 2];
        //System.out.println("here are the pivot numbers " + pivot + ", ");
        if (i <= j) {
            while (myArray[i] < pivot) //&& (i < right))
                i++;
            while (myArray[j] > pivot) //&& (j > left))
                j--;
            if (i <= j) {
                temp = myArray[i];
                myArray[i] = myArray[j];
                myArray[j] = temp;
                i++;
                j--;
            }
        }
        if (left < j) sort(myArray, left, j);
        if (i < right) sort(myArray, i, right);
        return i;
    }
}
Quicksort's behaviour when sorting already sorted arrays can be O(n^2). After your code sorts the array the first time, every subsequent run sorts an already-sorted array, which gives you quicksort's worst case. You need to generate a completely new random array for each run to really test this.
QuickSort does not perform well on already sorted data.
See here:
http://en.wikipedia.org/wiki/Quicksort
You should randomize your array within the for loop in order to get more accurate results.
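A minimal sketch of that fix: build a fresh random array inside the timing loop so each run sorts unsorted data. The class name is illustrative, and the library sort stands in for the poster's sort method:

```java
import java.util.Arrays;
import java.util.Random;

public class QuicksortTiming {
    // Times one sort of a freshly generated random array, so a previously
    // sorted array is never re-sorted.
    public static long timeOneRun(int size, Random rand) {
        int[] a = new int[size];
        for (int i = 0; i < size; i++) {
            a[i] = rand.nextInt(100) + 1;
        }
        long start = System.nanoTime();
        Arrays.sort(a); // stand-in for the sort() under test
        return System.nanoTime() - start;
    }
}
```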
What is the runtime/memory complexity of the maximum subarray problem using brute force?
Can it be optimized further, especially the memory complexity?
Thanks!
Brute force is Omega(n^2). Using divide and conquer you can do it with Theta(n lg n) complexity. Further details are available in many books, such as Introduction to Algorithms, or in various resources on the web, such as this lecture.
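For reference, the quadratic brute force keeps a running sum so each subarray is evaluated in O(1), using only O(1) extra memory (the class name is illustrative):

```java
public class MaxSubarrayBrute {
    // O(n^2) brute force: for each start index keep a running sum, so each
    // subarray's sum costs O(1) to compute. Uses O(1) extra space.
    public static int maxSubarraySum(int[] a) {
        int best = a[0];
        for (int i = 0; i < a.length; i++) {
            int sum = 0;
            for (int j = i; j < a.length; j++) {
                sum += a[j];
                best = Math.max(best, sum);
            }
        }
        return best;
    }
}
```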
As suggested in this answer, you can use Kadane's algorithm, which has O(n) complexity. An implementation in Java:
public int[] kadanesAlgorithm(int[] array) {
    int best = array[0];     // best sum seen so far
    int current = array[0];  // best sum ending at the current index
    int start = 0;           // start of the current window
    int bestStart = 0;
    int bestEnd = 0;
    for (int i = 1; i < array.length; i++) {
        // either extend the current window or start a new one at i
        if (current + array[i] < array[i]) {
            current = array[i];
            start = i;
        } else {
            current += array[i];
        }
        if (current > best) {
            best = current;
            bestStart = start;
            bestEnd = i;
        }
    }
    return Arrays.copyOfRange(array, bestStart, bestEnd + 1);
}