Assume I have N items in an unordered array that I need to sort, and I support a comparison operation between items in the array, but it is very expensive, so I want to perform as few comparisons as possible. That is, I don't just want to pick a sorting algorithm with a guaranteed O(n log n) comparison count, which can hide a constant factor; I want to pick the algorithm that uses the fewest comparisons of all. What algorithm do I pick?
Related
Given an integer array nums, return the largest perimeter of a triangle with a non-zero area formed from three of these lengths. If it is impossible to form any triangle of a non-zero area, return 0.
I can think of the brute-force method. Another method is to sort the given array, iterate over it backward, check the triangle condition, and return the sum. Is there any other way to solve this?
The best algorithm that I can think of is to put the numbers into a max-heap instead of sorting the array. It usually takes O(n) to build the heap and O(1) expected pops to find the triangle. The worst case is when no numbers satisfy the triangle condition, which takes O(n log(n)) comparisons. That worst case can only be hit if you allow big integers into the mix; with 64-bit integers, say, the worst case is O(n).
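A minimal sketch of this heap approach, assuming Python; `heapq` is a min-heap, so values are negated to simulate a max-heap, and the function name `largest_perimeter` is my own:

```python
import heapq

def largest_perimeter(nums):
    # Build a max-heap in O(n) by negating values (heapq is a min-heap).
    heap = [-x for x in nums]
    heapq.heapify(heap)
    if len(heap) < 3:
        return 0
    a = -heapq.heappop(heap)
    b = -heapq.heappop(heap)
    while heap:
        c = -heapq.heappop(heap)
        # Popped in descending order, so a >= b >= c and
        # b + c > a is the only triangle inequality we need to check.
        if b + c > a:
            return a + b + c
        a, b = b, c
    return 0
```

In the typical case only a few pops are needed before a valid triple is found, so the pops cost O(log n) each rather than paying the full O(n log n) of a complete sort.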
Given that you have to look at every element, you can't do better than O(n) average time, and all of the tricks for fixed-size integers, like radix sort, won't help much.
Even in the arbitrary-integer situation, it is still very good.
Suppose that we had a comparison-based algorithm with a worst case of o(n log(n)). Then, by tracking the comparisons made and applying Kahn's algorithm for a topological sort, we would obtain a comparison-based sort using o(n log(n)) comparisons, which is well known to be impossible.
I'm having trouble finding a non-comparison based algorithm for integers of arbitrary size that might do better.
Is there any sorting algorithm with an average time complexity of O(log n)?
Example: [8, 2, 7, 5, 0, 1]
Can we sort the given array with time complexity O(log n)?
No; this is, in fact, impossible for an arbitrary list. We can prove this fairly simply: the absolute minimum a sort must do is look at each element of the list at least once. After all, an element may belong anywhere in the sorted list; if we don't even look at an element, it's impossible for us to sort the array. This means any sorting algorithm has a lower bound of n, and since n > log(n), an O(log n) sort is impossible.
Although n is the lower bound, most sorts (like merge sort and quicksort) take n*log(n) time. In fact, while we can sort purely numerical lists in n time in some cases with radix sort, we have no way to sort arbitrary objects, such as strings, in less than n*log(n).
That said, there may be times when the list is not arbitrary; e.g., we have a list that is entirely sorted except for one element, and we need to put that element in place. In that case, structures like a binary search tree can let you insert in log(n), but this is only possible because we are operating on a single element. Building up the tree (i.e., performing n inserts) takes n*log(n) time.
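To illustrate the single-element case, here is a small sketch using Python's `bisect` module (the example list is my own): the insertion *position* is found in O(log n) comparisons, though the underlying array insert still shifts elements in O(n).

```python
import bisect

# An already-sorted list with one new element to place.
sorted_list = [0, 1, 2, 5, 7, 8]

# bisect.insort locates the insertion point by binary search,
# O(log n) comparisons, then inserts (O(n) element moves).
bisect.insort(sorted_list, 4)
```

After the call, `sorted_list` is `[0, 1, 2, 4, 5, 7, 8]`.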
As #dominicm00 also mentioned, the answer is no.
In general, when you see an algorithm with a time complexity of log N (base 2), it means you are repeatedly dividing the input into two sets and discarding one of them. A sorting algorithm must put every element in its appropriate place; discarding half of the list in each iteration is incompatible with that.
The most efficient sorting algorithms have a time complexity of O(n), but with some limitations. The three most famous O(n) algorithms are:
Counting sort, with time complexity O(n + k), where k is the maximum value in the given list. Assuming n >> k, you can consider its time complexity to be O(n).
Radix sort, with time complexity O(d*(n + k)), where d is the maximum number of digits in the input and k is the base (the range of a single digit). Similar to counting sort, assuming n >> k and n >> d, the time complexity becomes O(n).
Bucket sort, with average time complexity O(n) when the input is uniformly distributed.
But in general, due to the limitations of each of these algorithms, most implementations rely on O(n log n) algorithms such as merge sort, quicksort, and heapsort.
There are also sorting algorithms with time complexity O(n^2), such as insertion sort, selection sort, and bubble sort, which are recommended for lists of smaller sizes.
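As a concrete instance of the O(n + k) case above, here is a minimal counting sort sketch in Python (assuming non-negative integers; the function name is my own):

```python
def counting_sort(nums):
    # Counting sort: O(n + k), where k is the maximum value.
    # Assumes all inputs are non-negative integers.
    if not nums:
        return []
    k = max(nums)
    counts = [0] * (k + 1)
    for x in nums:          # O(n): tally each value
        counts[x] += 1
    out = []
    for value, count in enumerate(counts):  # O(k): emit values in order
        out.extend([value] * count)
    return out
```

Note that no element-to-element comparisons are made; the O(n log n) comparison lower bound does not apply because the algorithm exploits the bounded integer range.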
Using a PLA it might be possible to implement counting sort for a few elements with a low range of values.
count each value in parallel and sum the counts in lg2(N) steps
find the offset of each element in lg2(N) steps
write the array out in O(1)
Only massively parallel computation could do this; general-purpose CPUs would not, unless they implemented it in silicon as part of their SIMD units.
Suppose you are given a sorted list of n elements followed by f(n) randomly ordered elements. How would you sort the list if f(n) = O(log n)? I feel the best algorithm would be merge sort, but I am not sure of the resulting time complexity.
You should first sort the f(n) elements with any sort method and then use a merge for the final phase. The time complexity would be O(n), as the O((log n)^2) cost of sorting the tail is negligible compared to the linear scan of the sorted portion.
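A sketch of this sort-the-tail-then-merge approach, assuming Python (function name and signature are my own; `f` is the length of the unsorted tail):

```python
def sort_with_random_tail(arr, f):
    # The first len(arr) - f elements are already sorted;
    # the last f elements are in random order.
    head = arr[:len(arr) - f]
    tail = sorted(arr[len(arr) - f:])  # cheap when f = O(log n)

    # Standard two-pointer merge: one O(n) linear pass.
    out = []
    i = j = 0
    while i < len(head) and j < len(tail):
        if head[i] <= tail[j]:
            out.append(head[i]); i += 1
        else:
            out.append(tail[j]); j += 1
    out.extend(head[i:])
    out.extend(tail[j:])
    return out
```

The merge dominates, so the whole procedure is O(n) when f(n) = O(log n).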
If by "list" you mean an array, you could reduce the number of comparisons to O((log n)^2) by looking for each insertion point in the sorted portion using binary search. It would still take O(n) copying operations, so depending on the relative costs of copying vs. comparing, the time might stay effectively sub-linear even for moderately large values of n.
Consider this problem:
A comparison-based sorting algorithm sorts an array with n items. For what fraction of the n! permutations can the number of comparisons be cn, where c is a constant?
I know the best worst-case time complexity for sorting an array with arbitrary items is O(n log n), and it doesn't depend on the input order, right? So there is no fraction that leads to cn comparisons. Please guide me if I am wrong.
This depends on the sorting algorithm you use.
Optimized bubble sort, for example, compares all neighboring elements of an array and swaps them when the left element is larger than the right one. This is repeated until no swaps are performed.
When you give bubble sort a sorted array, it won't perform any swaps in the first iteration and thus sorts in O(n).
On the other hand, Heapsort will take O(n log n) independent of the order of the input.
Edit:
Answering your question for a given sorting algorithm may be non-trivial. Only one out of n! permutations is sorted (assuming no duplicates, for simplicity). However, for the bubble sort example you could, starting from the sorted array, swap each pair of neighboring elements. This input takes bubble sort two iterations, which is also O(n).
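A small sketch to make the comparison counts concrete (assuming Python; the instrumented function is my own):

```python
def bubble_sort_comparisons(arr):
    # Optimized bubble sort: stop as soon as a full pass makes no swaps.
    # Returns (sorted copy, number of comparisons performed).
    a = list(arr)
    n = len(a)
    comparisons = 0
    swapped = True
    while swapped:
        swapped = False
        for i in range(n - 1):
            comparisons += 1
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
    return a, comparisons
```

On an already-sorted array one pass of n-1 comparisons suffices; on the "each neighboring pair swapped" input described above, two passes are made, which is still linear in n.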
If I have an unsorted array A[1..n] and I want to search for a number x, I can either:
1. use linear search to search for x, or
2. use bubble sort to sort A in ascending order, then use binary search to find x in the sorted array.
Which way is more efficient, 1 or 2? How do I justify it?
If you need to search for a single number, nothing can beat a linear search: sorting cannot run faster than O(n), and even that is achievable only in special cases. Moreover, bubble sort is extremely inefficient, taking O(n^2) time. Binary search is faster than that, so the overall timing is dominated by the O(n^2) sort.
Hence you are comparing O(n) against O(n^2); obviously, O(n) wins.
The picture would be different if you needed to search for k different numbers, where k is larger than n^2. In that case the comparison may very well come out the other way.
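A sketch of the two strategies in Python (function names are my own; the second uses `sorted`, which is O(n log n), but the same trade-off holds with an O(n^2) bubble sort, just with a larger up-front cost):

```python
import bisect

def linear_search(arr, x):
    # O(n) per query, no preprocessing needed.
    for i, v in enumerate(arr):
        if v == x:
            return i
    return -1

def search_many(arr, queries):
    # For many queries it can pay to sort once up front and then
    # answer each query with an O(log n) binary search.
    s = sorted(arr)
    results = []
    for q in queries:
        i = bisect.bisect_left(s, q)
        results.append(i < len(s) and s[i] == q)
    return results
```

With k queries, repeated linear search costs O(k*n), while sort-then-binary-search costs the one-time sorting cost plus O(k log n); once k is large enough, the second approach wins.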