Is there a data structure with the following properties?

Given n elements, the data structure has the following runtime complexities:
Finding the minimum element is Θ(1),
Deleting the minimum element is Θ(lg n),
Inserting an element is Θ(lg n).
I did some research, but I don't know of a data structure this fast.

From Wikipedia:
http://en.wikipedia.org/wiki/Heap_(data_structure)
Operation      Binary      Binomial     Fibonacci
find-min       Θ(1)        Θ(1)         Θ(1)
delete-min     Θ(log n)    Θ(log n)     O(log n)*
insert         Θ(log n)    O(log n)     Θ(1)
decrease-key   Θ(log n)    Θ(log n)     Θ(1)*
merge          Θ(n)        O(log n)**   Θ(1)
(*) Amortized time
(**) Where n is the size of the larger heap
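
Every column of that table satisfies your three requirements; the plain binary heap is the simplest. A minimal sketch using Python's heapq module (which implements a binary min-heap on top of a list):

    import heapq

    h = []
    for x in [5, 1, 9, 3]:
        heapq.heappush(h, x)    # insert: O(log n) sift-up

    smallest = h[0]             # find-min: O(1), the minimum sits at index 0
    heapq.heappop(h)            # delete-min: O(log n) sift-down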

Related

Is the theta bound of an algorithm unique?

For example, the tightest bound for binary search is Θ(log n), but we can also say it is O(n²) and Ω(1).
However, I'm confused about whether we can say something like "binary search has a Θ(n) bound", since Θ(n) lies between O(n²) and Ω(1).
The worst-case execution of binary search on an array of size n uses Θ(log n) operations.
Any execution of binary search on an array of size n uses O(log n) operations.
Some "lucky" executions of binary search on an array of size n use O(1) operations.
The sentence "The complexity of binary search has a Θ(n) bound" is so ambiguous and misleading that most people would call it false. In general, I advise you not to use the word "bound" in the same sentence as one of the notations O( ), Θ( ), Ω( ).
It is true that log n < n.
It is false that log n = Θ(n).
The statement log n < Θ(n) is technically true, but so misleading that you should never write it.
It is true that log n = O(n).
The "because" is wrong. Θ(n) is indeed compatible with O(n²) and Ω(1), but so is Θ(log n).
In the case of binary (dichotomic) search, you can establish both bounds O(log n) and Ω(log n); together they are tight, which is summarized by Θ(log n).
You may not choose complexities "randomly"; you have to prove them.
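
To make the distinction between these cases concrete, here is a standard binary search sketch (an illustrative implementation, not code from the question):

    def binary_search(a, target):
        # Worst case: Θ(log n) iterations (target absent, or found last).
        # Best case:  Θ(1) (target sits at the very first midpoint).
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if a[mid] == target:
                return mid
            elif a[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1               # not found, after Θ(log n) halvings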

How to compute the big O of a sort and then a merge

If I sort in O(m + n) time and then mergesort in O(n log n) time, is the overall complexity the sum, or just the most significant term?
They are both independent of each other, so the costs add: (m + n) + n log n. Assuming m = O(n), this is O(n log n), since n log n grows faster than linear time.
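A short worked bound, assuming m = O(n) (i.e., m ≤ c*n for some constant c, which the answer implicitly relies on):

    (m + n) + n log n ≤ c*n + n + n log n
                      ≤ (c + 2) * n log n    (for n ≥ 2, since then log n ≥ 1)
                      = O(n log n)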

Why is Merge Sort's time complexity not O(n)?

Merge Sort's time complexity is O(n log n). Here n dominates log n, so is Merge Sort O(n)?
Thanks.
O(n log n) is the best you can get using comparison-based sort algorithms.
You can't say that O(n log n) == O(n), even if n dominates log n, because the two factors are multiplied, not added.
If you had n + log n, where n dominates log n, then you could say the complexity is O(n).
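
A quick sanity check: the ratio (n log n) / n = log n grows without bound, so no constant c can make n log n ≤ c * n hold for all n. For instance:

    import math

    # The ratio of n*log2(n) to n is just log2(n), which keeps growing:
    for n in [2**10, 2**20, 2**30]:
        print(n, math.log2(n))   # prints 10.0, 20.0, 30.0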

Asymptotic time complexity of inserting n elements into a binary heap already containing n elements

Suppose we have a binary heap of n elements and wish to insert n more elements (not necessarily one after another). What would be the total time required for this?
I think it's Θ(n log n), as one insertion takes O(log n) time.
Given: a heap of n elements and n more elements to be inserted, so in the end there will be 2n elements. A heap can be created in two ways: 1. successive insertion, and 2. the build-heap method. Of these, the build-heap method takes O(n) time to construct the heap, as explained in
How can building a heap be O(n) time complexity?. So the total time required is O(2n), which is the same as O(n).
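A sketch of both strategies using Python's heapq (the sample values are made up for illustration):

    import heapq

    old = [4, 1, 7, 3]    # the n elements already in the heap
    new = [6, 0, 5, 2]    # the n elements to insert

    # Strategy 1: n successive inserts into the existing heap -- O(n log n).
    h1 = old[:]
    heapq.heapify(h1)
    for x in new:
        heapq.heappush(h1, x)

    # Strategy 2: throw everything together and rebuild -- build-heap is O(n).
    h2 = old + new
    heapq.heapify(h2)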
Assume we are given:
a priority queue implemented as a standard binary heap H (backed by an array),
n, the current size of the heap.
We have the following properties for a single insertion:
W(n) = WorstCase(n) = Θ(lg n), i.e. W(n) = Ω(lg n) and W(n) = O(lg n)
A(n) = AverageCase(n) = Θ(lg n), i.e. A(n) = Ω(lg n) and A(n) = O(lg n)
B(n) = BestCase(n) = Θ(1), i.e. B(n) = Ω(1) and B(n) = O(1)
So for every case, we have
T(n) = Ω(1) and T(n) = O(lg n).
The worst case occurs when we insert a new minimum, so the up-heap pass has to travel the whole branch.
The best case occurs when, into a min-heap (minimum on top), we insert a value bigger than everything on the updated branch, so the up-heap pass stops immediately.
You asked about a series of n operations on a heap already containing n elements;
its size will grow
from n to 2n,
which asymptotically is
n = Θ(n)
2n = Θ(n).
This simplifies our equations: we don't have to worry about the growth of n, since it grows only by a constant factor.
So, writing Xi(n) for the cost of n insertions:
Wi(n) = Θ(n lg n)
Ai(n) = Θ(n lg n)
Bi(n) = Θ(n)
which implies, for every case:
Ti(n) = Ω(n) and Ti(n) = O(n lg n).
It's not Θ(n log n), it's O(n log n), since some of the insertions can take less than the full log n time. Therefore n insertions take time ≤ n log n,
=> time complexity = O(n log n).
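
A hand-rolled sift-up makes the worst-/best-case split above concrete (a minimal sketch, not a production heap):

    def sift_up(heap, i):
        # Bubble heap[i] up toward the root of a min-heap.
        # Inserting a new minimum walks the whole branch: Θ(lg n).
        # Inserting a value larger than its parent exits after one
        # comparison: Θ(1).
        while i > 0:
            parent = (i - 1) // 2
            if heap[parent] <= heap[i]:
                break           # heap property already holds; best case ends here
            heap[parent], heap[i] = heap[i], heap[parent]
            i = parent

    def insert(heap, x):
        heap.append(x)          # put the new element at the end
        sift_up(heap, len(heap) - 1)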

Balanced Search Tree Query, Asymptotic Analysis

The situation is as follows:
We have n numbers and we have to print them in sorted order. We have access to a balanced dictionary data structure, which supports the operations search, insert, delete, minimum, and maximum, each in O(log n) time.
We want to retrieve the numbers in sorted order in O(n log n) time using only the insert operation and an in-order traversal.
The answer to this is:
Sort()
    initialize(t)
    while (not EOF)
        read(x)
        insert(x, t)
    Traverse(t)
Now the query is: if we read the elements in time n and then traverse the elements in log n time (in-order traversal), then the total time for this algorithm is (n + log n), according to me. Please explain how the time for this algorithm is calculated. How does it sort the list in O(n log n) time?
Thanks.
Each insert is O(log n). You are doing n inserts, so that gives n * O(log n) = O(n log n) asymptotic time complexity. Traversing the tree is O(n), because there are n nodes. That adds up to O(n + n log n), and since the n term is dominated by the n log n term, the final asymptotic complexity is O(n log n).
"...and then traverse the elements in log n time..."
Traversal is O(n), not O(log n). An insertion is O(log n), and you're doing n such insertions.
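
For illustration, here is a minimal tree-sort sketch. It uses a plain, unbalanced BST for brevity; the balanced dictionary in the question guarantees the O(log n) insert bound even in the worst case, but the counting argument is identical (n inserts at O(log n) each, plus one O(n) traversal):

    class Node:
        def __init__(self, key):
            self.key, self.left, self.right = key, None, None

    def insert(root, key):                 # O(log n) per insert in a balanced tree
        if root is None:
            return Node(key)
        if key < root.key:
            root.left = insert(root.left, key)
        else:
            root.right = insert(root.right, key)
        return root

    def tree_sort(xs):
        root = None
        for x in xs:                       # n inserts: n * O(log n) = O(n log n)
            root = insert(root, x)
        out = []
        def inorder(node):                 # O(n): visits each node exactly once
            if node is not None:
                inorder(node.left)
                out.append(node.key)
                inorder(node.right)
        inorder(root)
        return out

    print(tree_sort([3, 1, 4, 1, 5]))      # [1, 1, 3, 4, 5]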
