What is meant by re-buffering issue in a dequeue operation - algorithm

I was going through a post on circular queues, and it mentioned a re-buffering problem in other queue data structures:
In a standard queue data structure the re-buffering problem occurs on each dequeue operation. The problem can be solved by joining the front and rear ends of the queue, making it a circular queue. A circular queue is a linear data structure that follows the FIFO principle.
Can someone explain to me what the re-buffering problem is and how it happens during a dequeue operation?

In a standard queue implemented with an array, when we delete an element only the front index is incremented by 1, and that slot is never used again. So as we perform many add and delete operations, the wasted space at the start of the array keeps growing. In a circular queue, by contrast, a slot freed by a delete is reused later, because the indices wrap around the array.
This re-buffering problem only arises when the queue is implemented with an array. A circular queue implemented with an array does not have the re-buffering issue on dequeue.
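The wrap-around behavior described above can be sketched as follows. This is a minimal, illustrative array-backed circular queue (class and method names are my own, not from the post): `front` and a modulo index let dequeued slots be reused, so no elements ever need to be shifted or copied.

```java
// Minimal sketch of an array-backed circular queue.
class CircularQueue {
    private final int[] items;
    private int front = 0;   // index of the oldest element
    private int size = 0;    // number of stored elements

    CircularQueue(int capacity) {
        items = new int[capacity];
    }

    boolean enqueue(int value) {
        if (size == items.length) return false;       // full
        items[(front + size) % items.length] = value; // wraps into reused slots
        size++;
        return true;
    }

    Integer dequeue() {
        if (size == 0) return null;                   // empty
        int value = items[front];
        front = (front + 1) % items.length;           // just advance the index
        size--;
        return value;
    }
}
```

After a dequeue frees slot 0, the next enqueue can land back in slot 0 once the rear wraps around, which is exactly the reuse a plain array queue lacks.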

Related

how do queues provide O(1) performance on takes/puts?

I assume the constant-time performance of takes/puts is achieved by allowing consumers and producers to access the tail/head of the queue without locking each other. How is this achieved for in-memory queues? Does the answer change for durable queues (probably)? How is this solved in a system that imposes a limit of one producer and one consumer? How about when the system allows concurrent access?
Queue in Java is an interface; a common general-purpose implementation backs it with a doubly linked list. In fact, a queue in Java is often declared like this:
Queue<SomeClass> q = new LinkedList<>();
LinkedList in Java is a doubly linked list.
offer() appends at the tail, which is O(1) because LinkedList keeps a direct reference to its last node, so no traversal is needed; poll() removes the head and returns it, which is also O(1).
As far as concurrent access is concerned, it does not change the asymptotic complexity of these operations, though locking and contention can affect real-world throughput.
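A short sketch of the two O(1) end operations described above, using the declaration from the answer:

```java
import java.util.LinkedList;
import java.util.Queue;

class QueueEndsDemo {
    public static void main(String[] args) {
        Queue<String> q = new LinkedList<>();
        // offer() appends at the tail; LinkedList holds a direct reference
        // to its last node, so no traversal is needed: O(1).
        q.offer("a");
        q.offer("b");
        // poll() removes and returns the head, also O(1).
        System.out.println(q.poll()); // a
        System.out.println(q.poll()); // b
    }
}
```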

Why can't a priority queue wrap around like an ordinary queue?

I know that, in order to improve efficiency, array-based queues use the wrap-around method to avoid moving every element down each time we delete one.
However, I do not understand why priority queues can't wrap around like ordinary queues. From my point of view, a priority queue behaves more like a stack than a queue; how is that possible?
The most common priority queue implementation is a binary heap, which would not benefit from wrapping around. You could create a priority queue that's implemented in a circular buffer, but performance would suffer.
It's important to remember that priority queue is an abstract data type. It defines the operations, but not the implementation. You can implement a priority queue as a binary heap, a sorted array, an unsorted array, a binary tree, a skip list, a linked list, etc.
Binary heap, on the other hand, is a specific implementation of the priority queue abstract data type.
As for stack vs queue: in actuality, stacks and queues are just specializations of the priority queue. If you consider time as the priority, then what we call a queue (a FIFO data structure), is actually a priority queue in which the oldest item is the highest priority. A stack (a LIFO data structure) is a priority queue in which the newest item is the highest priority.
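The "queue and stack as specializations of a priority queue" idea can be demonstrated with java.util.PriorityQueue by using an arrival sequence number as the priority (the Entry record and variable names here are illustrative, not from the answer):

```java
import java.util.Comparator;
import java.util.PriorityQueue;

class FifoLifoViaPriority {
    // Each item carries the sequence number it arrived with.
    record Entry(long seq, String value) {}

    public static void main(String[] args) {
        // Oldest item = highest priority -> behaves like a FIFO queue.
        PriorityQueue<Entry> fifo =
                new PriorityQueue<>(Comparator.comparingLong(Entry::seq));
        // Newest item = highest priority -> behaves like a LIFO stack.
        PriorityQueue<Entry> lifo =
                new PriorityQueue<>(Comparator.comparingLong(Entry::seq).reversed());

        long seq = 0;
        for (String s : new String[] {"first", "second", "third"}) {
            fifo.add(new Entry(seq, s));
            lifo.add(new Entry(seq, s));
            seq++;
        }
        System.out.println(fifo.poll().value()); // first (oldest wins)
        System.out.println(lifo.poll().value()); // third (newest wins)
    }
}
```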

Why insert from front of queue (deque)

Is there any reason why people would want to insert something at the front of a queue? I'm writing a report on double-ended queues and this is bugging me.
I assume that maybe something of higher importance would be inserted at the front when needed, but then a priority queue would seem more relevant.
One example where a deque can be used is the work-stealing job scheduling algorithm. This algorithm implements task scheduling for several processors. A separate deque of threads to be executed is maintained for each processor. To execute the next thread, the processor takes the first element from its deque (using the "remove first element" deque operation). If the current thread forks, it is put back on the front of the deque ("insert element at front") and a new thread is executed. When one of the processors finishes executing its own threads (i.e. its deque is empty), it can "steal" a thread from another processor: it takes the last element from that processor's deque ("remove last element") and executes it. The work-stealing algorithm is used by Intel's Threading Building Blocks (TBB) library for parallel programming.
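The deque operations named above can be sketched with java.util.ArrayDeque. This is a simplified, single-threaded illustration of the access pattern only; a real work-stealing scheduler adds synchronization and per-worker threads:

```java
import java.util.ArrayDeque;
import java.util.Deque;

class WorkStealingSketch {
    public static void main(String[] args) {
        Deque<String> workerA = new ArrayDeque<>();

        workerA.addFirst("task1");          // forked task goes back on the front
        workerA.addFirst("task2");
        String next = workerA.pollFirst();  // owner takes from the front
        String stolen = workerA.pollLast(); // an idle worker steals from the back

        System.out.println(next);   // task2
        System.out.println(stolen); // task1
    }
}
```

The owner working from one end and thieves stealing from the other is what keeps contention between them low.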
Note that a priority queue is inherently different from a deque, with elements in a PQ being processed according to their priority while in a deque you can only remove and insert at the front or back of the queue. A possible application of a deque I can quickly think of is an "undo" feature enabling you to fall back to a previous state.

Are Priority Queues really Queues?

In a priority queue, elements are inserted and deleted according to their priority, so the insertion and deletion code for any priority queue must take each element's priority into account.
Suppose you have a queue with the elements 1, 5, 6, where each element's priority is its own value, and now you need to insert an element with priority 3; the element is inserted at the second position, giving the new queue 1, 3, 5, 6.
But a queue is defined as a data structure in which elements are inserted at the end and deleted at the beginning, never in the middle; yet in the case above the element is inserted at the second position, i.e. in the middle of the queue. If priority queues don't obey the definition of a queue, are priority queues really queues?
Kindly explain.
Priority queues are "queues" in one sense of the word, in that elements wait their turn. They are not a subtype of the Queue abstract data type.
Yes, a priority queue is still a queue in the sense that items are being served in the order in which they are located in the queue. However, in this case a priority is associated with each item and they are served accordingly.
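The question's 1, 5, 6 example maps directly onto java.util.PriorityQueue. Note that the backing binary heap does not literally store the elements in sorted order, so the "inserted at the second position" picture is conceptual; only poll() is guaranteed to serve elements by priority:

```java
import java.util.PriorityQueue;

class PriorityOrderDemo {
    public static void main(String[] args) {
        // The element's value is its priority, as in the question.
        PriorityQueue<Integer> pq = new PriorityQueue<>();
        pq.add(1);
        pq.add(5);
        pq.add(6);
        pq.add(3); // conceptually lands "in the middle"

        // Draining the queue serves the elements in priority order.
        StringBuilder served = new StringBuilder();
        while (!pq.isEmpty()) {
            served.append(pq.poll()).append(' ');
        }
        System.out.println(served.toString().trim()); // 1 3 5 6
    }
}
```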
A priority queue is a queue in the sense of the English word queue, not as a strict subtype of the other data structure named 'queue'. There is no inheritance going on there, they're just names that describe their purpose.

Name for a circular data structure delivery system?

Let's say I have a data structure which is essentially a circularly linked list. This list is meant to be walked continuously, and at each node the data at that node is delivered to consumers. Therefore, the more frequently the same item appears in the list, the more frequently it will be delivered to a consumer.
Is there a name for this data structure?
Circular queue, circular buffer, cyclic buffer or ring buffer. I think circular buffer is the most common name I've heard.
Ringbuffer.
