Heap sort and Quick sort stability - algorithm

I tried an array with 2 or 3 equal elements; of course it's not stable. But if we have an array where all elements are equal, like [2,2,2,2,2,2,2], will it be stable or not?
Of course, using heap sort or quick sort.
Thanks.

Stability for a given data set depends on implementation details. For example, look at the Lomuto partition from the Wikipedia page:
algorithm partition(A, lo, hi) is
    pivot := A[hi]
    i := lo - 1
    for j := lo to hi - 1 do
        if A[j] < pivot then
            i := i + 1
            swap A[i] with A[j]
    swap A[i + 1] with A[hi]
    return i + 1
For an all-equal array, A[j] < pivot is never true, so i never increments, and the last swap exchanges the first and the last items of the range, breaking the initial order.
The Hoare partition implementation from the same page does not seem to break the order.
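To see this concretely, here is a small Python sketch (my own illustration, not taken from the answer above): each key 2 is tagged with its original position, and a single Lomuto partition pass is enough to show the order of equal keys being broken.

def lomuto_partition(a, lo, hi):
    # Lomuto scheme from the pseudocode above; only the first tuple field (the key) is compared.
    pivot = a[hi]
    i = lo - 1
    for j in range(lo, hi):
        if a[j][0] < pivot[0]:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]
    return i + 1

items = [(2, k) for k in range(7)]        # (key, original position): all keys equal
lomuto_partition(items, 0, len(items) - 1)
print(items)
# The comparison a[j][0] < pivot[0] is never true, so i stays at lo - 1 and the final
# swap exchanges the first and last items: the output starts with (2, 6) and ends with
# (2, 0), so the original order of the equal keys is already broken after one pass.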

Why do we only include the pivot 1/2 of the time in Hoare's Quicksort?

According to Wikipedia, Hoare's partition (partial code) looks like:
// Sorts a (portion of an) array, divides it into partitions, then sorts those
algorithm quicksort(A, lo, hi) is
    if lo >= 0 && hi >= 0 && lo < hi then
        p := partition(A, lo, hi)
        quicksort(A, lo, p) // Note: the pivot is now included
        quicksort(A, p + 1, hi)
I was curious why the pivot is included in the lo...p call but not in the p + 1...hi call (whereas they are both excluded in Lomuto's partitioning).
Wikipedia wrote:
With this formulation it is possible that one sub-range turns out to be the whole original range, which would prevent the algorithm from advancing. Hoare therefore stipulates that at the end, the sub-range containing the pivot element (which still is at its original position) can be decreased in size by excluding that pivot, after (if necessary) exchanging it with the sub-range element closest to the separation; thus, termination of quicksort is ensured.
Why are we allowed to include the pivot in the lo...p subrange, but not in the p + 1...hi subrange? By the same logic in the above Wikipedia page, if the lo...p subrange is exactly the original range, wouldn't we run into the same infinite recursion problems?
The index p may not be the pivot index. Elements equal to the pivot, or the pivot itself, can end up anywhere during a partition step. After a partition step, elements <= pivot are to the left of or at p, and elements >= pivot are to the right of p. By allowing the pivot or elements equal to the pivot to be swapped, the inner loops do not need to do bounds checking. Another advantage of the Hoare partition scheme is that it becomes faster as the number of duplicates increases (despite often swapping equal elements), while Lomuto becomes slower, degrading to O(n^2) time complexity when all elements are equal.
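As a small illustration (a Python sketch of my own, using the middle-element Hoare partition from the Wikipedia pseudocode), the returned index often does not hold the pivot at all:

def hoare_partition(a, lo, hi):
    # Hoare scheme with the middle element as pivot, as in the Wikipedia pseudocode.
    pivot = a[(lo + hi) // 2]
    i, j = lo - 1, hi + 1
    while True:
        i += 1
        while a[i] < pivot:
            i += 1
        j -= 1
        while a[j] > pivot:
            j -= 1
        if i >= j:
            return j
        a[i], a[j] = a[j], a[i]

a = [3, 8, 2, 5, 1, 4, 7, 6]
p = hoare_partition(a, 0, len(a) - 1)
print(p, a)   # prints: 3 [3, 4, 2, 1, 5, 8, 7, 6]
# The pivot value was 5, but a[3] is 1: everything in a[0..3] is <= 5 and everything in
# a[4..7] is >= 5, while the pivot itself landed on the right side. That is why the
# recursive calls are quicksort(A, lo, p) and quicksort(A, p + 1, hi), without excluding p.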

Hoare's vs Lomuto's Partition

Can you give an example where the two partition schemes give different results?
With Lomuto's we have to write:
    quicksort(A, l, p)
    quicksort(A, p+1, h)
While with Hoare's:
    quicksort(A, l, p+1)
    quicksort(A, p+1, h)
(Operations performed on the range [low, high).)
What's the difference?
The basic Lomuto partition scheme swaps the pivot out of the way, does the partition, swaps the pivot into place, and then returns an index to the pivot at its sorted position. In this case, the pivot can be excluded from the recursive calls.
The basic Hoare partition scheme scans from both ends towards some point within the partition, putting all elements less than the pivot to the left of all elements greater than the pivot. Any elements equal to the pivot, including the pivot itself, can end up anywhere in the partition, and the returned index is the split point between the left part (elements <= pivot) and the right part (elements >= pivot), so the calling code cannot exclude the element at that index from the recursive calls. If the Hoare scheme is modified to be similar to Lomuto, where it swaps the pivot to one end, does the partition, and then swaps the pivot to the split index, the calling code can exclude the pivot, but this ends up being slower.
The difference between these partitions is not in the recursive calls.
Really, any partition (that supports the correct interface) can be used with the same implementation of the main routine.
A partition function usually returns the index of the pivot. The pivot already stands at its final place, so there is no need to treat this index again.
So for the case when low is included in the treatment but high is not, we can write:
pivotindex = partition(arr, low, high);
// Separately sort elements before pivotindex and after pivotindex
quickSort(arr, low, pivotindex);
quickSort(arr, pivotindex + 1, high);
To understand the difference, we also need to focus on the partition method and not just on the calls to quicksort.
Lomuto partition scheme:
algorithm partition(A, lo, hi) is
    pivot := A[hi]
    i := lo
    for j := lo to hi - 1 do
        if A[j] < pivot then
            swap A[i] with A[j]
            i := i + 1
    swap A[i] with A[hi]
    return i
Hoare partition scheme:
algorithm partition(A, lo, hi) is
    pivot := A[lo + (hi - lo) / 2]
    i := lo - 1
    j := hi + 1
    loop forever
        do
            i := i + 1
        while A[i] < pivot
        do
            j := j - 1
        while A[j] > pivot
        if i >= j then
            return j
        swap A[i] with A[j]
Also, Hoare’s scheme is more efficient than Lomuto’s partition scheme because it does three times fewer swaps on average, and it creates efficient partitions even when all values are equal.
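To make the duplicate-handling claim concrete, here is a rough Python sketch (my own, not part of this answer) that runs both schemes, exactly as given in the pseudocode above, on an all-equal array and reports the deepest recursion level reached:

def lomuto_partition(a, lo, hi):
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def hoare_partition(a, lo, hi):
    pivot = a[(lo + hi) // 2]
    i, j = lo - 1, hi + 1
    while True:
        i += 1
        while a[i] < pivot:
            i += 1
        j -= 1
        while a[j] > pivot:
            j -= 1
        if i >= j:
            return j
        a[i], a[j] = a[j], a[i]

def quicksort_lomuto(a, lo, hi, depth=0):
    # Returns the deepest recursion level reached.
    if lo >= hi:
        return depth
    p = lomuto_partition(a, lo, hi)
    return max(quicksort_lomuto(a, lo, p - 1, depth + 1),
               quicksort_lomuto(a, p + 1, hi, depth + 1))

def quicksort_hoare(a, lo, hi, depth=0):
    if lo >= hi:
        return depth
    p = hoare_partition(a, lo, hi)
    return max(quicksort_hoare(a, lo, p, depth + 1),       # pivot side stays included
               quicksort_hoare(a, p + 1, hi, depth + 1))

equal = [2] * 64
print(quicksort_lomuto(equal[:], 0, 63))   # 63: every call peels off a single element
print(quicksort_hoare(equal[:], 0, 63))    # 6: the range is halved at every level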
I have just mentioned the key differentiating points; I suggest reading more about the two schemes above if you want to gain deeper knowledge of this topic.
Comment if you have any further doubts and we will help you solve them.

Replacing elements of one array with elements of another to minimize the sum of the first array [closed]

We are given two arrays: A[] of size n, and another array B[] of size m. We can replace any number of elements in A[] with elements of B[]; each element of B[] can be used only once to replace an element of array A[].
What will be the minimum sum of array A[] after doing such replacements?
My approach was to:
sort arrays A and B
replace elements from the end of A with elements from the front of B until the element at the front of B is no longer less than the element at the end of A
However, I am getting a wrong answer (WA) with this approach.
Your algorithm won't work. There is a possibility that relatively larger elements will remain in the final output array. I will post the relevant test case here.
For now, here is what will work.
Solution #1
After sorting, this problem is just a sub-problem of merge sort. It is exactly the same as the merging step of merge sort.
Sort A and B
Merge A and B until the output array consists of n elements
The pseudo-code for merging will look like this:
function merge(int[] A, int[] B):
    n := length(A)
    m := length(B)
    int[] output := new int[n]
    i := 0
    j := 0
    k := 0
    while i < n and j < m and k < n do
        if A[i] <= B[j]
            output[k] := A[i]
            i := i + 1
        else if A[i] > B[j]
            output[k] := B[j]
            j := j + 1
        k := k + 1
    end
    while i < n and k < n do
        output[k] := A[i]
        i := i + 1
        k := k + 1
    end
    while j < m and k < n do
        output[k] := B[j]
        j := j + 1
        k := k + 1
    end
    return output
Sorting takes O(n log n) time; merging requires O(n) time and O(n) space.
Solution #2 [Faster approach]
Sort arrays A and B
Take the n smallest elements with a binary search approach. This is similar to finding the median of two sorted arrays; instead of the median, you search for the split that yields the n smallest elements and take all elements within this range into your output array (see the sketch below).
Time complexity is O(n log n) for the sort and O(log(n + m)) for the second step.
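A possible Python sketch of this approach (my own; one of several ways to set up the binary search): search for how many of the n smallest elements come from A, as in the classic median-of-two-sorted-arrays technique, then sum that prefix of A and the matching prefix of B.

def min_sum_after_replacements(A, B):
    # Minimum possible sum of A after replacing elements of A with unused elements of B.
    A, B = sorted(A), sorted(B)
    n, m = len(A), len(B)
    lo, hi = max(0, n - m), n              # i = how many of the n smallest come from A
    while True:
        i = (lo + hi) // 2
        j = n - i                          # the remaining n - i come from B
        if i > 0 and j < m and A[i - 1] > B[j]:
            hi = i - 1                     # taking too many from A
        elif j > 0 and i < n and B[j - 1] > A[i]:
            lo = i + 1                     # taking too few from A
        else:
            # A[:i] + B[:j] are exactly the n smallest of the combined arrays.
            # Finding the split is O(log(min(n, m))); summing it is still O(n).
            return sum(A[:i]) + sum(B[:j])

print(min_sum_after_replacements([5, 2, 9], [1, 8, 4]))   # 7: keep 2, take 1 and 4 from B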
The answer would essentially be the smallest n numbers among the n + m numbers of A and B.
Sort A in O(n log n) and B in O(m log m). Then, using an approach similar to the merge step of merge sort, find the minimum n elements among these two sorted lists in O(n) and fill array A with them.
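As a sanity check, the same idea fits in a few lines of Python with the standard library (my sketch, not from the answer): the minimum achievable sum is simply the sum of the n smallest values among A and B combined.

import heapq

def min_sum(A, B):
    return sum(heapq.nsmallest(len(A), A + B))   # n smallest of the n + m values

print(min_sum([5, 2, 9], [1, 8, 4]))             # 7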

Understanding quicksort

I'm having a hard time understanding quicksort; most of the demonstrations and explanations leave out what actually happens (http://me.dt.in.th/page/Quicksort/ for example).
Wikipedia says:
Pick an element, called a pivot, from the array. Partitioning: reorder the array so that all elements with values less than the pivot come before the pivot, while all elements with values greater than the pivot come after it (equal values can go either way). After this partitioning, the pivot is in its final position. This is called the partition operation. Recursively apply the above steps to the sub-array of elements with smaller values and separately to the sub-array of elements with greater values.
How would that work with an array of 9,1,7,8,8, for example, with 7 as the pivot? The 9 needs to move to the right of the pivot; all quicksort implementations seem to be in-place operations, so we can't just append it after the 8,8, which means the only option is to swap the 9 with the 7.
Now the array is 7,1,9,8,8. The idea behind quicksort is that now we have to recursively sort the parts to the left and right of the pivot. The pivot is now at position 0 of the array, meaning there's no left part, so we can only sort the right part. This is of no use, as 7 > 1, so the pivot ended up in the wrong place.
In this image 4 is the pivot; then why does 5 go almost all the way to the left? It's bigger than 4! After a lot of swapping it ends up sorted, but I don't understand how that happened.
Quicksort
The Quicksort steps are:
Pick an element, called a pivot, from the list.
Reorder the list so that all elements with values less than the pivot come before the pivot, while all elements with values greater than the pivot come after it (equal values can go either way). After this partitioning, the pivot is in its final position. This is called the partition operation.
Recursively sort the sub-list of lesser elements and the sub-list of greater elements.
The base cases of the recursion are lists of size zero or one, which never need to be sorted.
Lomuto partition scheme
This scheme chooses a pivot which is typically the last element in the array.
The algorithm maintains in the variable i the index at which the pivot will be placed; each time it finds an element less than or equal to the pivot, this index is incremented and that element is placed before the pivot.
As this scheme is more compact and easy to understand, it is frequently used in introductory material.
It is less efficient than Hoare's original scheme.
Partition algorithm (using Lomuto partition scheme)
algorithm partition(A, lo, hi) is
    pivot := A[hi]
    i := lo // place for swapping
    for j := lo to hi - 1 do
        if A[j] ≤ pivot then
            swap A[i] with A[j]
            i := i + 1
    swap A[i] with A[hi]
    return i
Quicksort algorithm (using Lomuto partition scheme)
algorithm quicksort(A, lo, hi) is
    if lo < hi then
        p := partition(A, lo, hi)
        quicksort(A, lo, p - 1)
        quicksort(A, p + 1, hi)
Hoare partition scheme
Uses two indices that start at the ends of the array being partitioned, then move toward each other, until they detect an inversion: a pair of elements, one greater than the pivot and one smaller, that are in the wrong order relative to each other. The inverted elements are then swapped.
There are many variants of this algorithm, for example, selecting the pivot from A[hi] instead of A[lo].
partition algorithm (using Hoare partition scheme)
algorithm partition(A, lo, hi) is
    pivot := A[lo]
    i := lo - 1
    j := hi + 1
    loop forever
        do
            i := i + 1
        while A[i] < pivot
        do
            j := j - 1
        while A[j] > pivot
        if i >= j then
            return j
        swap A[i] with A[j]
Quicksort algorithm (using Hoare partition scheme)
algorithm quicksort(A, lo, hi) is
    if lo < hi then
        p := partition(A, lo, hi)
        quicksort(A, lo, p)
        quicksort(A, p + 1, hi)
Hoare partition scheme vs Lomuto partition scheme
The pivot selection
The execution speed of the algorithm depends largely on how this mechanism is implemented; a poor implementation can make the algorithm run slowly.
The choice of pivot determines how the data list is partitioned; therefore, this is the most critical part of a Quicksort implementation. It is important to select the pivot so that the left and right partitions are as close in size as possible.
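As one concrete illustration of pivot selection (my own addition, not part of the answer above): a common strategy is median-of-three, which takes the median of the first, middle and last elements as the pivot and thereby avoids the worst-case split on already-sorted input.

def median_of_three(a, lo, hi):
    # Index of the median of a[lo], a[mid], a[hi]; use it (e.g. swap it into the
    # pivot position) before partitioning.
    mid = (lo + hi) // 2
    return sorted([(a[lo], lo), (a[mid], mid), (a[hi], hi)])[1][1]

a = [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(median_of_three(a, 0, len(a) - 1))   # 4: the middle element, so already-sorted
                                           # input still splits into two equal halves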
Best and worst case
Worst case
The most unbalanced partition occurs when the pivot divides the list into two sublists of sizes 0 and n - 1. This may occur if the pivot happens to be the smallest or largest element in the list, or in some implementations when all the elements are equal.
Best Case
In the most balanced case, each time we perform a partition we divide the list into two nearly equal pieces. This means each recursive call processes a list of half the size.
Formal analysis
Worst-case analysis = O(n²), from the recurrence T(n) = T(n - 1) + O(n)
Best-case analysis = O(n log n), from the recurrence T(n) = 2T(n/2) + O(n)
Average-case analysis = O(n log n)
Examples source
Using additional memory
def quicksort(array):
    less = []
    equal = []
    greater = []
    if len(array) > 1:
        pivot = array[0]
        for x in array:
            if x < pivot:
                less.append(x)
            if x == pivot:
                equal.append(x)
            if x > pivot:
                greater.append(x)
        # Recurse on the two outer lists; elements equal to the pivot are already in place.
        return quicksort(less) + equal + quicksort(greater)
    else:
        return array
Usage:
quicksort([12,4,5,6,7,3,1,15])
Without additional memory
def partition(array, begin, end):
    pivot = begin
    for i in range(begin + 1, end + 1):
        if array[i] <= array[begin]:
            pivot += 1
            array[i], array[pivot] = array[pivot], array[i]
    array[pivot], array[begin] = array[begin], array[pivot]
    return pivot

def quicksort(array, begin=0, end=None):
    if end is None:
        end = len(array) - 1
    if begin >= end:
        return
    pivot = partition(array, begin, end)
    quicksort(array, begin, pivot - 1)
    quicksort(array, pivot + 1, end)
Usage:
quicksort([97, 200, 100, 101, 211, 107])
In your example, you can debug the Lomuto-style partition above step by step.
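Here is a rough Python sketch of that trace (my addition; the original answer showed it as an image), reusing the in-place quicksort from above with a print after each partition call. Note that this implementation picks the first element of each range as its pivot, so the very first pivot is 9, not 7:

def partition(array, begin, end):
    pivot = begin                               # array[begin] is the pivot value
    for i in range(begin + 1, end + 1):
        if array[i] <= array[begin]:
            pivot += 1
            array[i], array[pivot] = array[pivot], array[i]
    array[pivot], array[begin] = array[begin], array[pivot]
    return pivot

def quicksort(array, begin=0, end=None):
    if end is None:
        end = len(array) - 1
    if begin >= end:
        return
    pivot = partition(array, begin, end)
    print("pivot value", array[pivot], "placed at index", pivot, ":", array)
    quicksort(array, begin, pivot - 1)
    quicksort(array, pivot + 1, end)

a = [9, 1, 7, 8, 8]
quicksort(a)
print(a)   # [1, 7, 8, 8, 9]

Each printed line shows one pivot landing in its final position; after four partition calls the array is sorted.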
References:
http://www.cs.bilkent.edu.tr/~atat/473/lecture05.pdf
http://codefap.com/2012/08/the-quick-sort-algorithm/
http://visualgo.net/sorting
https://en.wikipedia.org/wiki/Quicksort
One day I found this jewel, which animates the different sorting algorithms and helped me a lot in understanding them! But this is just a graphical explanation; the poster prior to me (#Hydex) already answered in an academic way ;-)

Partition method

I am trying to understand exactly what this method does; it is said to "keep swapping the outer-most wrongly-positioned pairs". I put this into a program and tried different arrays, but the results make no sense to me. What exactly does this do?
partition(A, p)
A: array of size n, p: integer s.t. 0 <= p < n
    swap(A[0], A[p])
    i <- 1, j <- n - 1
    while i < j do
        while A[i] <= A[0] and i < n do
            i <- i + 1
        while A[j] > A[0] and j > 0 do
            j <- j - 1
        if i < j then
            swap(A[i], A[j])
    swap(A[0], A[j])
    return j
The algorithm this pseudocode implements is the partitioning phase of the quicksort sorting algorithm. It will arrange the array so that all values smaller than or equal to A[p] are at the left and all larger values at the right. It returns the index j that is the last index of the left side for which A[j] equals A[p].
If you are not familiar with quicksort, the intent is to use this partition algorithm to split the array into "small" and "large" parts and recursively sort each part. Since the small ones had been arranged to come before the large ones in the array, the array gets sorted. If p is picked appropriately so that A[p] is close to the middle of the values in A, this is a very fast sorting method.
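If it helps to experiment, here is a direct Python transcription of the pseudocode (my sketch; the only change is that the inner loops check the bounds before indexing, which the pseudocode's condition order would not allow in a real language):

def partition(A, p):
    # Partition A around the value originally at A[p]; return its final index.
    n = len(A)
    A[0], A[p] = A[p], A[0]                   # move the chosen pivot to the front
    i, j = 1, n - 1
    while i < j:
        while i < n and A[i] <= A[0]:         # scan right past elements <= pivot
            i += 1
        while j > 0 and A[j] > A[0]:          # scan left past elements > pivot
            j -= 1
        if i < j:                             # outer-most wrongly-positioned pair
            A[i], A[j] = A[j], A[i]
    A[0], A[j] = A[j], A[0]                   # drop the pivot between the two sides
    return j

A = [9, 1, 7, 8, 8]
print(partition(A, 2), A)   # 1 [1, 7, 9, 8, 8]: values <= 7 end up at or left of
                            # index 1, larger values to its right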

Resources