How to implement a stack using a priority queue? - data-structures

A priority queue is used to implement a stack that stores characters.
Push(C) is implemented as Insert(Q, C, K), where K is an appropriate key chosen by the implementation.
Pop is implemented as Delete_Min(Q). For a sequence of operations, in what order must the keys be chosen: strictly decreasing or strictly increasing?

Let me begin by saying that a priority queue and a stack are two completely different data structures with different uses and applications. One cannot always be used to implement the other.
Yes, there are instances where a data structure can be defined in terms of another: for example you can create a stack or queue using a linked list (quite trivially actually), however implementing a stack using a priority queue will not always work. Why?
Because a stack is first in last out. The last thing you push on a stack WILL be the first thing to pop out. A stack's sole job is to keep the order of pushed items intact and pop in the reverse order.
A priority queue however, will always give you the minimum (or maximum based on implementation) with a pop. A priority queue will have to -by definition- restructure itself to always maintain the "heap property". This means the original order in which you pushed will not necessarily be preserved.
Now, your question should be phrased as "In what situation will a priority queue and a stack behave the same way?"
You mentioned that your priority queue's pop() deletes the minimum value from the queue, which indicates you have a min-heap at hand. In this scenario the only case where a series of pops from the priority queue will resemble those of a stack is when the items were all pushed in non-increasing key order. It does not have to be strictly decreasing (think about pushing all of the same values).
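To make the scenario concrete, here is a minimal sketch (my own illustration, not from the question) that backs the character stack with a C++ std::priority_queue configured as a min-heap and chooses each key from a strictly decreasing counter, which satisfies the non-increasing requirement:

#include <cstdio>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

// Hypothetical sketch: Push(C) becomes Insert(Q, C, K) with a strictly
// decreasing key K, and Pop becomes Delete_Min(Q).
class StackViaMinPQ {
    using Entry = std::pair<long long, char>;  // (key, character)
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> pq;  // min-heap
    long long next_key = 0;                    // decreases on every push

public:
    void push(char c) { pq.push({next_key--, c}); }  // insert with a smaller key each time
    char pop() {                                     // Delete_Min returns the newest character
        char c = pq.top().second;
        pq.pop();
        return c;
    }
};

int main() {
    StackViaMinPQ s;
    s.push('a'); s.push('b'); s.push('c');
    std::printf("%c\n", s.pop());  // prints c
    std::printf("%c\n", s.pop());  // prints b
}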

Related

Container of fixed size where items exist within it according to their demand

I'm trying to implement a container with the following characteristics:
The container has a fixed size n.
When an item is inserted into the container, if the item is already in the data structure it will be moved to the front of it. If not, it will be inserted at the front of the data structure, but the last item at the back of the container will be removed to respect the fixed size n.
Building on point 2, it is necessary to check whether an item already exists in the container in order to know whether to insert a new item or move an existing one to the front.
The reasoning behind this container, is to keep frequently accessed items in the container. The cost of inserting a new item into the container is large thus it is in my interest to keep it in the container for as long as it is in demand.
Is there an existing container/data structure that achieves something similar to what I've described? If not, can you provide any advice on how to implement it? I'm using C++, but any examples or pseudocode will be equally appreciated.
Edit:
I suppose what I need is a kind of queue with no duplicate items. The queue needs to be searched to see if an item exists within it, and if so the item is moved to the front of the queue. A fixed size isn't that difficult to adhere to (just check the size before insertion and, if it would go over, remove the last item in the queue). Basically this post, but without allowing any duplicates in the container, and with fast search to check whether an item is in it.
I'm not sure I'm following all the requirements you gave, but this seems like it can be implemented as a double-ended queue (C++ deque or Java Deque). Each access implies a linear search (which can't be avoided); then the found element is moved to the front (constant time), and the last element is removed (also constant time). This should result in the most frequently accessed elements moving to the front of the queue over time, decreasing the real cost of the linear search.
A double-ended queue can be implemented as a ring-buffer or as a doubly-linked-list. Since you stated a fixed number of elements, the ring buffer seems like the better option.
However, I can't vouch for the implementations of the C++ or Java deque; you may want to look at the source code to see whether it's backed by an array or by a linked node structure.
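For illustration, here is a rough sketch of that deque-based idea (the class and function names are mine, not part of any standard API): a std::deque with a linear search, move-to-front on a hit, and eviction from the back when the fixed size n would otherwise be exceeded.

#include <algorithm>
#include <cstdio>
#include <deque>

// Hypothetical sketch of the move-to-front container described above.
class MoveToFrontContainer {
    std::deque<int> d;
    std::size_t capacity;

public:
    explicit MoveToFrontContainer(std::size_t n) : capacity(n) {}

    void access(int value) {
        auto it = std::find(d.begin(), d.end(), value);  // linear search
        if (it != d.end()) {
            d.erase(it);                                 // hit: move it to the front
        } else if (d.size() == capacity) {
            d.pop_back();                                // miss on a full container: evict the back
        }
        d.push_front(value);
    }

    bool contains(int value) const {
        return std::find(d.begin(), d.end(), value) != d.end();
    }
};

int main() {
    MoveToFrontContainer c(3);
    c.access(1); c.access(2); c.access(3);
    c.access(1);  // hit: 1 moves to the front
    c.access(4);  // miss: 4 is inserted at the front, 2 is evicted from the back
    std::printf("%d %d\n", c.contains(1), c.contains(2));  // prints 1 0
}

Note that erasing from the middle of a std::deque is linear rather than constant time; a doubly linked list paired with a hash map would make the move-to-front and lookup cheaper, at the cost of more bookkeeping.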
Maybe wrap a priority queue with elements having a last-accessed-time attribute?
You may want to look at the Splay Tree: whenever you perform an operation on an element X, that element moves to the root.

Can you reverse a queue by only reversing its head and tail pointers?

While attempting to reverse a queue, I found a generally agreed upon way:
You dequeue each value from the queue and push it onto a stack; then you pop each value from the stack and enqueue it back into the queue.
By agreed upon, I mean most of my Google searches on reversing a queue end up taking me to that solution.
Even though that way is correct and reasonably performant at linear time, I believe there's a better way that is simpler and runs in constant time.
Assuming that a queue is implemented using a doubly-linked list, can't you reverse it in O(1) time by just reversing the head and tail pointers?
If you want to treat a doubly linked list as a queue, then it's only by convention which end is the head and which is the tail, i.e. which way you iterate it. But the point of a queue is its interface. So given any arbitrary queue, implemented in an unknown way (there are MANY things that implement queues, including queues that distribute themselves across many computers), the question is: how can you reverse it? And that means you cannot rely on an underlying implementation.
A specific implementation might implement optimizations for certain operations.
No, you cannot just swap the head and tail pointers of a doubly linked list. You also need to swap the next and previous pointers in each node. This will still take O(n) time.
Short Answer : NO
LONG ANSWER/REASON
The queue is an abstract data type. That means there is no physical existence of such a data structure. A queue can be implemented in many ways. The most basic implementation is the one that uses arrays.
struct Queue{
    int elements[50]; // the actual/physical array that houses the data
    int max = 50;     // the maximum number of elements that can be stored in this queue
    int front, rear;  // indices of the front and the rear
};
Now, I am sure you know how to define operations on this kind of a Queue.
void enQ(Queue &Q,int new_element);
int deQ(Queue &Q);
int getFront(Queue Q);
That means if I have to add the element 7 to the queue identified by Q, I need to execute enQ(Q,7). Let us say I have added 10 items like that by calling enQ 10 times. Then I add the number 89 to the queue. Now if I have to get this number 89 (assume that all numbers are unique), I will first have to deQ the first 10 items and then call the deQ function again to get 89. I am sure you will agree that this is the principle of a queue.
Now time for some magic. If I knew that 89 is the 11th number I added, I can get it directly by Q.elements[(Q.front+11)%Q.max]. Also if I knew that 89 is the number I just added, I can also get it by using Q.elements[Q.rear].
Wow! Does that mean that the principles of the queue got violated? No. It just means that I am not using a queue anymore. I am actually using an array, but trying to fool myself by doing it in a sophisticated manner (by putting it in a structure and so on).
If you are using a queue, you can only use the three methods I mentioned above. You are not allowed to meddle with the inner workings of the Queue. You might be thinking that your case is different because you only want to change the front and rear values and not the actual data. But no: in a real queue you are not even allowed to access front and rear. All you have access to are the three methods I defined above.
That is why the actual implementation of a queue should be
class Queue{
    int elements[50]; // the actual/physical array that houses the data
    int max = 50;     // the maximum number of elements that can be stored in this queue
    int front, rear;  // indices of the front and the rear
public:
    void enQ(int new_element);
    int deQ();
    int getFront();
};
Now we are upholding the real essence of a Queue. A similar layout should be used if you are implementing a queue using a Doubly-linked-list. The front and rear pointers should be private and inaccessible to the user.
Therefore it is not possible to reverse a QUEUE faster than O(n).
So the bottom line is: if you want to change the queue's pointers by using a doubly linked list, by all means you can do it. But you cannot call it reversing a queue, because then you would not be using a queue. In fact, that would be a different data structure: a deque (double-ended queue). If you really want reversal in O(1) time, I suggest you go ahead with your method, but you will have to stop calling it a queue, because it is a deque (by the way, there is nothing wrong with using a deque; by all means, use it). Or, if you don't like the sound of a deque, you can define your own data structure called a reversible queue.
You can define your data structure like
class ReversibleQueue{
    DLLNode *front, *rear; // pointers into a doubly linked list
public:
    void enQ(int new_element);
    int deQ();
    int getFront();
    void reverse();
};
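As an illustration of that last idea, here is a minimal sketch (the identifiers are mine) of such a reversible queue backed by a doubly linked list, where reverse() merely flips a direction flag and therefore runs in O(1):

#include <cstdio>

// Hypothetical sketch: a queue-like structure whose reverse() is O(1) because
// it only toggles which physical end is treated as the logical front.
struct Node {
    int value;
    Node *prev;
    Node *next;
};

class ReversibleQueue {
    Node *head = nullptr;  // physical start of the list
    Node *tail = nullptr;  // physical end of the list
    bool flipped = false;  // when true, the logical front is the physical tail

public:
    void enQ(int v) {
        Node *n = new Node{v, nullptr, nullptr};
        if (!head) { head = tail = n; return; }
        if (!flipped) { n->prev = tail; tail->next = n; tail = n; }  // logical rear = physical tail
        else          { n->next = head; head->prev = n; head = n; }  // logical rear = physical head
    }

    int deQ() {  // caller must ensure the queue is non-empty
        Node *n = flipped ? tail : head;
        int v = n->value;
        if (head == tail)  { head = tail = nullptr; }
        else if (!flipped) { head = head->next; head->prev = nullptr; }
        else               { tail = tail->prev; tail->next = nullptr; }
        delete n;
        return v;
    }

    int getFront() const { return (flipped ? tail : head)->value; }

    void reverse() { flipped = !flipped; }  // O(1): no nodes are touched
};

int main() {
    ReversibleQueue q;
    q.enQ(1); q.enQ(2); q.enQ(3);
    q.reverse();
    std::printf("%d\n", q.deQ());  // prints 3
    std::printf("%d\n", q.deQ());  // prints 2
}

(A real implementation would also free the remaining nodes in a destructor; that is omitted to keep the sketch short.)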

Unique Queue Overflow Condition (having less elements than its size)

Got this competitive question:
Which of the following data structure may give overflow error, even though the current number of elements in it is less than its size?
a. Stack
b. Circular queue
c. Simple queue
d. None of the above
I tried to Google the answer for a proper explanation; however, several sources marked (c) and others marked (b) as the answer, which confused me even more. What's the proper answer, and what's the explanation?
Thanks.
This question seems somewhat strange because if you implement any of these structures correctly, there will be no such premature overflow.
With that in mind, the circular queue does seem like the most sensible answer. Here's why:
Note: in my explanation, the queue adds at the back and removes from the front.
After a certain number of insertions and deletions, the front and back pointers of the circular queue (implemented as an array) may end up on either side of each other.
This means that when adding elements to this queue, on top of the standard checks, we also have to be aware of the relative positions of the front and back pointers.
Once the back pointer has reached the end of the array, adding to the back has to continue at the beginning of the array; in other words, insertion has to "circle around". If we don't implement this check properly, we end up reporting an overflow even though there's still room in the queue.
The answer is (c), simple queue. Assume that elements are inserted at the rear and deleted at the front, and that the maximum size of the queue is 10. If rear points at index 9, the queue holds 10 elements and front is at index 0. Now remove one element from the other end, so front becomes 1. The queue is not actually full, because index 0 is now empty, but since the overflow check only tests whether rear has reached max - 1, the queue reports that it is full. The solution to this is the circular queue.
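To make the premature overflow concrete, here is a small hypothetical sketch of such a simple (non-circular) array-backed queue whose "full" test only checks the rear index:

#include <cstdio>

// Sketch of a naive array queue: overflow is reported once rear reaches the
// last index, even if deQ has freed slots at the front.
struct SimpleQueue {
    int elements[10];
    int front = 0;  // index of the next element to dequeue
    int rear = -1;  // index of the last enqueued element

    bool enQ(int v) {
        if (rear == 9) return false;     // "full" check ignores slots freed at the front
        elements[++rear] = v;
        return true;
    }
    bool deQ(int &out) {
        if (front > rear) return false;  // empty
        out = elements[front++];
        return true;
    }
};

int main() {
    SimpleQueue q;
    for (int i = 0; i < 10; ++i) q.enQ(i);  // fill all 10 slots
    int x;
    q.deQ(x);                               // frees index 0; only 9 elements remain
    bool ok = q.enQ(42);                    // fails: rear is still stuck at index 9
    std::printf("enqueue succeeded? %s\n", ok ? "yes" : "no");  // prints "no"
}

A circular queue avoids this by advancing rear as (rear + 1) % max and tracking the element count (or the front/rear relationship) instead of just comparing rear with max - 1.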

Implement a priority queue using only ONE stack

Implementing a queue or a priority queue using two stacks is not hard.
Now the questions are
How about using only ONE stack to implement a priority queue?
How about using only ONE stack to implement a normal queue?
Are they even possible?
P.S. Of course, other than the ONE stack, you may use only a constant amount of extra space if necessary.
No, it's not possible using only methods provided through a stack interface (i.e. using only push and pop methods) with constant extra space. [1]
Consider trying to simulate a queue using a stack.
When enqueueing, if we simply push onto the stack, we'll end up with another element we need to do something with to get to the front of the queue for a dequeue. It's easy to see that a bunch of enqueues will make it impossible for the next dequeue to take a constant amount of space, as all these enqueued elements need to be popped to get to the front of the queue. We could also put the enqueued element a constant number of elements from the top of the stack, but this doesn't really help much either - the elements below it will need to be dequeued first, so we run into the same problem. Trying to put the enqueued element further than a constant number of elements from the top of the stack will of course take more than a constant amount of space already.
Now consider a priority queue where each new item has lower priority than all items already in the queue. This is equivalent to a simple queue, which, by the argument above, can't be simulated using a single stack with constant space.
[1]: If the stack was implemented as an array or linked-list, as it typically is, it would of course be possible using the functionality for those, but I'm sure that's not what you're asking.
Say you push n elements onto your stack. Now you want the element at the bottom of it (if you want to implement a queue). You're going to need extra O(n) space to keep your other n-1 elements so you can access the bottom one.
=> Under your constraints and only the stack interface methods, there's no way you're going to implement a queue using a single stack and constant space.
It is impossible to perform the task if you only allow standard stack operations (push, pop), as enqueueing an item would mean popping all the items from the stack, pushing the new item, and then pushing back all the items. With only O(1) extra memory, you can't pop all the items and keep track of them.
Let's assume you implement the stack with a doubly linked list and allow a reverse method, meaning you flip the order of the stack (the tail of the linked list becomes the head).
Now, for a regular queue, you can perform:
enqueue: perform reverse, push the new item, and reverse again.
dequeue: perform pop.
Note that once you allow a reverse action, it's not a standard stack anymore.
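A rough sketch of that idea (identifiers are mine; the internal std::deque exists only so that reverse() can be expressed, and the reversal itself takes linear time):

#include <algorithm>
#include <cstdio>
#include <deque>

// Hypothetical stack that also exposes reverse(), and a queue built from
// exactly one such stack.
class ReversibleStack {
    std::deque<int> d;  // the top of the stack is d.back()
public:
    void push(int v) { d.push_back(v); }
    int pop() { int v = d.back(); d.pop_back(); return v; }
    void reverse() { std::reverse(d.begin(), d.end()); }
};

class QueueFromOneStack {
    ReversibleStack s;  // invariant: the oldest element sits on top of the stack
public:
    void enqueue(int v) {
        s.reverse();  // the newest element is now on top
        s.push(v);    // the new element is the newest of all, so it goes on top
        s.reverse();  // restore the invariant: oldest element back on top
    }
    int dequeue() { return s.pop(); }  // the oldest element is on top
};

int main() {
    QueueFromOneStack q;
    q.enqueue(1); q.enqueue(2); q.enqueue(3);
    std::printf("%d\n", q.dequeue());  // prints 1
    std::printf("%d\n", q.dequeue());  // prints 2
}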
A priority queue is still impossible to implement even if you allow reverse, as you'll still have to perform various comparisons when enqueueing or dequeueing, and you will still run into the problem of having no space to store the items you want to compare.
Here's a solution that is technically correct, but almost certainly useless for your purposes.
We use a stack S of queues (regular or priority). We initialise S to hold a single empty queue. For any queue operation, we peek at the top element of S and apply the queue operation to it in place. This way, S will only ever hold a single element.
This uses zero memory outside of the stack, if you take the viewpoint that the stack "owns" the memory of all its elements.
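For what it's worth, here is a tiny sketch of that trick using the standard containers (a priority queue is used as the element type, matching the original question):

#include <cstdio>
#include <queue>
#include <stack>

int main() {
    // The stack S holds exactly one element: a priority queue. Every
    // priority-queue operation is applied to S.top() in place.
    std::stack<std::priority_queue<int>> S;
    S.push(std::priority_queue<int>{});

    S.top().push(5);  // insert through the top of the stack
    S.top().push(9);
    S.top().push(2);

    std::printf("%d\n", S.top().top());  // prints 9 (std::priority_queue is a max-heap)
    S.top().pop();                       // delete-max
    std::printf("%d\n", S.top().top());  // prints 5
}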

stack using two queues [duplicate]

Possible Duplicate:
Implement Stack using Two Queues
I want the most efficient way, in terms of time complexity, to implement a stack using two queues.
I don't get why you'd ever need two queues. I've seen the answers here and in the dupe thread, and you can do it with just one queue.
From the dupe thread, this has two versions, one that optimizes push, and one that optimizes pop.
Push optimized:
push:
    enqueue in queue
pop:
    n = queue size
    dequeue an object and enqueue it again immediately; do this n - 1 times
    dequeue an object and return it
Pop optimized:
push:
    enqueue in queue
    n = queue size
    dequeue an object and enqueue it again immediately; do this n - 1 times
pop:
    dequeue from queue and return it
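As a concrete illustration, here is a sketch of the pop-optimized variant using a single std::queue (the class name is mine):

#include <cstddef>
#include <cstdio>
#include <queue>

// Hypothetical one-queue stack: push is O(n) because the queue is rotated so
// that the most recently pushed element ends up at the front; pop is O(1).
class StackFromOneQueue {
    std::queue<int> q;
public:
    void push(int v) {
        q.push(v);
        for (std::size_t i = 1; i < q.size(); ++i) {  // rotate the n - 1 older elements behind it
            q.push(q.front());
            q.pop();
        }
    }
    int pop() {  // caller must ensure the stack is non-empty
        int v = q.front();
        q.pop();
        return v;
    }
};

int main() {
    StackFromOneQueue s;
    s.push(1); s.push(2); s.push(3);
    std::printf("%d\n", s.pop());  // prints 3
    std::printf("%d\n", s.pop());  // prints 2
}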
Then again, I don't get why you'd ever want to do this. Lambast your professor for making you waste your time with pointless programming questions.
I may be wrong, but to me this does not compute.
A Queue is (typically) a FIFO structure; a Stack is a LIFO structure. I cannot, off the top of my head, envisage any simple combination of two FIFOs that will yield a LIFO, though I may simply not have had enough coffee yet today.
It may be possible, but I suspect that an implementation of a stack involving 2 queues is almost certainly going to take longer to implement and be more error prone than a simple implementation of a stack.
However, having said that...
If you already have a Queue implementation, and if that Queue allows you to remove items from its TAIL rather than from the HEAD (actual terms may differ in your implementation), then you can simply use the Queue as if it were a stack by retrieving items from the TAIL.
It is simple.
Let's say you have Queue A and Queue B.
You use one queue to hold the data and the other as a temporary container, BUT A and B interchange roles all the time:
When you first insert data, you insert into Queue A.
To POP the last item you inserted, you DEQUEUE all of Queue A's elements except the last one and ENQUEUE them into Queue B.
DEQUEUE the only remaining element from Queue A and you've got what you want: the TOP of the stack, the last element inserted.
Now to POP the next item, you redo the same work, but with A and B's roles interchanged.
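Here is a sketch of that two-queue procedure (the class name is mine), where the roles of A and B swap after every pop:

#include <cstdio>
#include <queue>
#include <utility>

// Hypothetical two-queue stack: queue a holds the data; on pop, all but the
// last element are drained into b, the remaining element is returned, and the
// two queues swap roles.
class StackFromTwoQueues {
    std::queue<int> a, b;  // a always holds the current contents
public:
    void push(int v) { a.push(v); }

    int pop() {  // caller must ensure the stack is non-empty
        while (a.size() > 1) {  // move all but the last element into b
            b.push(a.front());
            a.pop();
        }
        int v = a.front();      // the most recently pushed element
        a.pop();
        std::swap(a, b);        // A and B interchange roles
        return v;
    }
};

int main() {
    StackFromTwoQueues s;
    s.push(10); s.push(20); s.push(30);
    std::printf("%d\n", s.pop());  // prints 30
    std::printf("%d\n", s.pop());  // prints 20
}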
