Is there a way to find the log n greatest elements in an array with n elements in O(n) time?
I would create an array-based HeapPriorityQueue, because if all elements are available up front, the heap can be built in O(n) time using bottom-up heap construction.
Then removing the first element of this priority queue should take O(1) time, shouldn't it?
That will be O(log n), since you also remove the first element; merely peeking at it without removing is O(1). Repeating this removal log n times will be O(log^2 n), which is still in O(n), so this solution will indeed meet the requirements.
Another option is to use a selection algorithm to find the log(n)-th largest element directly, which will be O(n) as well.
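As a rough sketch of the heap route in Python (assuming the standard heapq module, whose heapify is exactly the bottom-up O(n) construction and whose heappop is the O(log n) removal):

    import heapq
    import math

    def greatest_log_n(arr):
        # how many elements to extract: log2(n), at least 1
        k = max(1, int(math.log2(len(arr))))
        heap = [-x for x in arr]      # negate to simulate a max-heap
        heapq.heapify(heap)           # bottom-up construction: O(n)
        # log n pops at O(log n) each: O(log^2 n), still within O(n)
        return [-heapq.heappop(heap) for _ in range(k)]

    print(greatest_log_n([5, 1, 9, 3, 7, 8, 2, 6]))  # [9, 8, 7]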
Basically, yes. The creation of the heap takes O(n) and this dominates the algorithm.
Removing the first element takes either O(1), if the heap does not update its keys after the removal, or O(log n) if it does. Either way, the complexity of removing log(n) elements from the heap, with and without updating, would be O(log n * log n) and O(log n) respectively, both of which are sublinear.
For the following question:
Question 3
You are given a heap with n elements that supports Insert and Extract-Min. Which of the following tasks can you achieve in O(log n) time?
Find the median of the elements stored in the heap.
Find the fifth-smallest element stored in the heap.
Find the largest element stored in the heap.
Why is "Find the largest element stored in the heap."not correct, my understanding here is that you can use logN time to go to the bottom of the heap, and one of the element there must be the largest element.
"Find the fifth-smallest element stored in the heap." this should take constant time right, because you only need to go down 5 layers at most?
"Find the median of the elements stored in the heap. " should this take O(n) time? because we extract min for the n elements to get a sorted array, and take o(1) to find the median of it?
It depends on the running times of the insert and extract-min operations. In traditional heaps, both take Θ(log n) time. However, in finger-tree-based heaps, only insert takes Θ(log n) time, while extract-min takes O(1) time. There, you can find the fifth-smallest element in O(5) = O(1) time and the median in O(n/2) = O(n) time. You can also find the largest element in O(n) time.
Why is "Find the largest element stored in the heap."not correct, my understanding here is that you can use logN time to go to the bottom of the heap, and one of the element there must be the largest element.
The lowest level of the heap contains half of the elements. More precisely, half of the elements of the heap are leaves, i.e. they have no children. The largest element in the heap is one of those, so finding the largest element will require that you examine n/2 items. Except that the heap only supports insert and extract-min, so you end up having to call extract-min on every element. Finding the largest element therefore takes O(n log n) time.
"Find the fifth-smallest element stored in the heap." this should take constant time right, because you only need to go down 5 layers at most?
This can be done in log(n) time. Actually 5*log(n), because you have to call extract-min five times, but we ignore constant factors. However, it's not constant time, because the complexity of extract-min depends on the size of the heap.
"Find the median of the elements stored in the heap." should this take O(n) time? because we extract min for the n elements to get a sorted array, and take o(1) to find the median of it?
The median is the middle element, so you only have to remove n/2 elements from the heap. But removing an item from the heap is an O(log n) operation, so the complexity is O(n/2 * log n), and since we ignore constant factors in algorithmic analysis, it's O(n log n).
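To make the counting above concrete, here is a small Python sketch (my own illustration, with heapq.heappop standing in for extract-min) of all three tasks; each works on a copy so the original heap survives:

    import heapq

    def fifth_smallest(heap):
        h = list(heap)                 # work on a copy
        for _ in range(4):
            heapq.heappop(h)           # 4 pops, then peek: O(5 log n) = O(log n)
        return h[0]

    def median(heap):
        h = list(heap)
        for _ in range(len(h) // 2):
            heapq.heappop(h)           # n/2 pops: O(n log n)
        return h[0]

    def largest(heap):
        h = list(heap)
        while len(h) > 1:
            heapq.heappop(h)           # n - 1 pops: O(n log n)
        return h[0]

    h = list(range(1, 10))
    heapq.heapify(h)
    print(fifth_smallest(h), median(h), largest(h))  # 5 5 9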
Given a binary heap, how can I convert it to a binomial queue in linear time, O(n)? I thought of splitting the heap, but I got stuck because the time for deletion is O(lg n).
Assuming that you have access to the backing array that contains the binary heap and you can iterate over it in O(n) time, then you can create your binomial heap simply by doing n inserts. As the Wikipedia article says:
Inserting a new element to a heap can be done by simply creating a new heap containing only this element and then merging it with the original heap. Due to the merge, insert takes O(log n) time. However, across a series of n consecutive insertions, insert has an amortized time of O(1) (i.e. constant).
In other words, doing n inserts into the binomial heap will require O(n) time.
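For illustration, here is a minimal binomial-heap sketch in Python (my own, not from the article) that does exactly those n inserts; the list of trees behaves like a binary counter, which is where the amortized O(1) per insert comes from:

    class Node:
        def __init__(self, key):
            self.key = key
            self.children = []   # roots of the attached smaller trees

    def link(a, b):
        # merge two binomial trees of the same order; the smaller key wins
        if b.key < a.key:
            a, b = b, a
        a.children.append(b)
        return a

    def insert(trees, key):
        # trees[i] holds the tree of order i (or None), like a binary counter
        carry = Node(key)
        i = 0
        while True:
            if i == len(trees):
                trees.append(None)
            if trees[i] is None:
                trees[i] = carry
                return
            carry = link(carry, trees[i])   # "carry" into the next order
            trees[i] = None
            i += 1

    def binomial_from_array(arr):
        # n inserts, each amortized O(1), so the whole build is O(n)
        trees = []
        for x in arr:
            insert(trees, x)
        return trees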
You cannot do this in O(n) time by using the standard binary heap remove operation. As you noted, that would be O(log n) for each removal, resulting in O(n log n) complexity.
So we were given a problem: given an array of n elements, we need to extract the k smallest elements from it. Our solution should use heaps, and the complexity should be O(n + k log n). I think I may have figured out the solution, but I'd like to be sure about it.
I'd say that the array must first be built into a heap using a typical buildHeap() function, which starts at half the length of the array and calls a minHeapify() function to ensure each parent is no greater than its children. That would be O(n) complexity all in all. Since we need to extract k times, we would use an extractMin() function, which removes the minimum value and calls minHeapify() on what remains to keep the min-heap property. Each extractMin() would be O(log n), and since it is done k times, this supports the overall complexity of O(n + k log n).
Does this check out? Someone told me it should also be sorted with a heapSort() function, but this didn't make sense to me, because heapSort() would add an O(n log n) to the overall complexity, and you're still able to extract the min without sorting...
Yes, you are right. You don't need heapSort(), just heapify() to re-order your heap.
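As a sanity check, here is how the described solution looks with Python's heapq (heapify playing the role of buildHeap, heappop the role of extractMin):

    import heapq

    def k_smallest(arr, k):
        heap = list(arr)
        heapq.heapify(heap)   # buildHeap: O(n)
        # k extractions at O(log n) each: O(k log n), so O(n + k log n) overall
        return [heapq.heappop(heap) for _ in range(k)]

    print(k_smallest([9, 4, 7, 1, 8, 2], 3))  # [1, 2, 4]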
Can we say that, when all elements are identical in an array A of size n, the running time of heapsort is O(n)?
--> If this is the case, is O(n) the best-case running time of heapsort?
When all elements are equal, building the heap takes O(n) steps, because when an element gets added to the heap, after one compare, O(1), we see it is already in the correct position.
Removing the root is also O(1): when we swap the tail and the root, the heap property is still satisfied.
All elements get added to the heap in O(n) and removed in O(n). So, yes, in this case heapsort is O(n). I can't think of a better case, so heapsort's best case must be O(n).
'Heapsort's best case is O(n)' means, in plain English, something like: there exist arrays of size n such that heapsort needs at most k*n compares to sort them. That's nice in theory, but in practice it doesn't say much about how good or fast heapsort is.
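If you want to see the linear behavior empirically, here is an instrumented heapsort sketch (my own) that counts comparisons; on an all-equal array every sift-down stops after O(1) compares, so the total grows roughly like 3n instead of n log n:

    def heapsort_compares(a):
        compares = 0

        def sift_down(end, i):
            nonlocal compares
            while True:
                child = 2 * i + 1
                if child >= end:
                    return
                if child + 1 < end:          # pick the larger child
                    compares += 1
                    if a[child + 1] > a[child]:
                        child += 1
                compares += 1
                if a[i] >= a[child]:         # equal keys stop immediately
                    return
                a[i], a[child] = a[child], a[i]
                i = child

        n = len(a)
        for i in range(n // 2 - 1, -1, -1):  # bottom-up build: O(n)
            sift_down(n, i)
        for end in range(n - 1, 0, -1):      # move the max to the tail
            a[0], a[end] = a[end], a[0]
            sift_down(end, 0)
        return compares

    for n in (1000, 2000, 4000):
        print(n, heapsort_compares([7] * n))  # count roughly doubles with n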
Is there a set-like data structure that supports merging in O(log n) time and k-th element search in O(log n) time? Here n is the size of the set.
You might try a Fibonacci heap which does merge in constant amortized time and decrease key in constant amortized time. Most of the time, such a heap is used for operations where you are repeatedly pulling the minimum value, so a check-for-membership function isn't implemented. However, it is simple enough to add one using the decrease key logic, and simply removing the decrease portion.
If k is a constant, then any meldable heap will do, including leftist heaps, skew heaps, pairing heaps and Fibonacci heaps. Both merging and getting the first element in these structures typically take O(1) or O(lg n) amortized time, so O(k lg n) at most.
Note, however, that getting to the k-th element may be destructive in the sense that the first k-1 items may have to be removed from the heap.
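As a concrete instance of one of these meldable heaps, here is a minimal leftist-heap sketch in Python (my own illustration); meld only walks the right spines, so it runs in O(lg n), and the k-th smallest comes from popping the minimum k-1 times:

    class Node:
        def __init__(self, key, left=None, right=None):
            self.key = key
            self.left = left
            self.right = right
            # rank: distance to the nearest missing child
            self.rank = 1 + (right.rank if right else 0)

    def meld(a, b):
        if a is None:
            return b
        if b is None:
            return a
        if b.key < a.key:          # keep the smaller root on top
            a, b = b, a
        left, right = a.left, meld(a.right, b)
        # leftist invariant: the left child must have the larger rank
        if (left.rank if left else 0) < right.rank:
            left, right = right, left
        return Node(a.key, left, right)

    def kth_smallest(h, k):
        # "destructive" as noted above: pop the minimum k - 1 times
        for _ in range(k - 1):
            h = meld(h.left, h.right)
        return h.key

    h = None
    for x in [5, 1, 9, 3, 7]:
        h = meld(h, Node(x))
    print(kth_smallest(h, 2))  # 3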
If you're willing to accept amortization, you can achieve the desired bounds of O(lg n) time for both meld and search by using a binary search tree to represent each set. Melding two trees of sizes m and n, where m < n, requires O(m log(n / m)) time. If you use amortized analysis and charge the cost of the merge to the elements of the smaller set, at most O(lg n) is charged to each element over the course of all of the operations. Selecting the k-th element of each set takes O(lg n) time as well.
I think you could also use a collection of sorted arrays to represent each set, but the amortization argument is a little trickier.
As stated in the other answers, you can use heaps, but getting O(lg n) for both meld and select requires some work.
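For the select half of the tree-based approach, here is a sketch that assumes each node is augmented with its subtree size (an augmentation the answer above leaves implicit); on a balanced tree each step descends one level, which is O(lg n):

    class BST:
        def __init__(self, key, left=None, right=None):
            self.key = key
            self.left = left
            self.right = right
            # assumed augmentation: cached subtree size
            self.size = 1 + (left.size if left else 0) + (right.size if right else 0)

    def select(node, k):
        # k is 1-based; compare k against the size of the left subtree
        left_size = node.left.size if node.left else 0
        if k <= left_size:
            return select(node.left, k)
        if k == left_size + 1:
            return node.key
        return select(node.right, k - left_size - 1)

    # a small balanced tree over the keys 1..7
    t = BST(4, BST(2, BST(1), BST(3)), BST(6, BST(5), BST(7)))
    print(select(t, 5))  # 5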
Finger trees can do this and some more operations:
http://en.wikipedia.org/wiki/Finger_tree
There may be something even better if you are not restricted to purely functional, a.k.a. "persistent", data structures; here "persistent" does not mean "backed up on non-volatile disk storage", but "all previous versions of the data structure remain available even after new elements are added".