Data Structure: Big O time cost - algorithm

1) The time cost to add n elements to an initially empty singly linked list by
inserting at the front of the list.
The answer seems to be either O(n) or O(1).
I think it is O(1), because inserting an element into an empty list is just a single operation,
for example Node element = 1;
But I'm still not sure about this.
2) What would be the best-case time cost to find a data element in a linked list with n elements?
The answer also seems to be either O(1) or O(n).
I think it's O(n), because it has to traverse the list to find the element.

The time cost to add n elements to an initially empty singly linked list by
inserting at the front of the list.
It is O(1) per insertion, but you have n of those, so O(n) in total.
The best-case time cost to find a data element in a linked list with n elements.
It is O(1): in the best case, the searched element is the first one, so there is no need to traverse the list. After examining the first element (which takes constant time) you can halt.
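To make both answers concrete, here is a minimal sketch in Java, assuming a hand-rolled Node class rather than java.util.LinkedList: a front insert only relinks the head, so each of the n inserts is O(1) and the whole build is O(n), while a best-case search stops at the first node.

    class Node {
        int value;
        Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    class FrontInsertDemo {
        public static void main(String[] args) {
            Node head = null;                 // initially empty list
            for (int i = 1; i <= 10; i++) {   // n = 10 inserts
                head = new Node(i, head);     // O(1): no traversal, just relink the head
            }
            // Best case for search: the target is the first node,
            // so a single comparison suffices, i.e. O(1).
            System.out.println(head.value);   // 10: the last value inserted is at the front
        }
    }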

A heap with n elements that supports Insert and Extract-Min: which of the following tasks can you achieve in O(log n) time?

Question 3: You are given a heap with n elements that supports Insert and Extract-Min. Which of the following tasks can you achieve in O(log n) time?
Find the median of the elements stored in the heap.
Find the fifth-smallest element stored in the heap.
Find the largest element stored in the heap.
Why is "Find the largest element stored in the heap."not correct, my understanding here is that you can use logN time to go to the bottom of the heap, and one of the element there must be the largest element.
"Find the fifth-smallest element stored in the heap." this should take constant time right, because you only need to go down 5 layers at most?
"Find the median of the elements stored in the heap. " should this take O(n) time? because we extract min for the n elements to get a sorted array, and take o(1) to find the median of it?
It depends on what the running times of the Insert and Extract-Min operations are. In traditional heaps, both take Θ(log n) time. However, in finger-tree-based heaps, only insert takes Θ(log n) time, while extract-min takes O(1) time. There, you can find the fifth-smallest element in O(5) = O(1) time and the median in O(n/2) = O(n) time. You can also find the largest element in O(n) time.
Why is "Find the largest element stored in the heap."not correct, my understanding here is that you can use logN time to go to the bottom of the heap, and one of the element there must be the largest element.
The lowest level of the heap contains half of the elements. More precisely, half of the elements of the heap are leaves, i.e. they have no children. The largest element in the heap is one of those, so finding it would require you to examine n/2 items. Except that the heap only supports Insert and Extract-Min, so you end up having to call extract-min on every element. Finding the largest element therefore takes O(n log n) time.
"Find the fifth-smallest element stored in the heap." this should take constant time right, because you only need to go down 5 layers at most?
This can be done in O(log n) time, actually 5 * log(n) because you have to call extract-min five times, but we ignore constant factors. However, it's not constant time, because the complexity of extract-min depends on the size of the heap.
"Find the median of the elements stored in the heap." should this take O(n) time? because we extract min for the n elements to get a sorted array, and take o(1) to find the median of it?
The median is the middle element, so you only have to remove n/2 elements from the heap. But removing an item from the heap is an O(log n) operation, so the complexity is O(n/2 * log n), and since we ignore constant factors in algorithmic analysis, it's O(n log n).
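To make the counting concrete, here is a small sketch using java.util.PriorityQueue as a stand-in for the heap in the question (my assumption; the question only guarantees Insert and Extract-Min). k calls to extract-min cost O(k log n): O(log n) for the constant k = 5, and O(n log n) for k = n/2.

    import java.util.PriorityQueue;

    class KthSmallestDemo {
        // Returns the kth-smallest element by calling extract-min k times.
        // Each poll() is O(log n), so the total is O(k log n). Note this
        // consumes the heap; re-inserting the extracted elements afterwards
        // would restore it without changing the asymptotics.
        static int kthSmallest(PriorityQueue<Integer> heap, int k) {
            int result = 0;
            for (int i = 0; i < k; i++) {
                result = heap.poll(); // extract-min, O(log n); assumes k <= heap size
            }
            return result;
        }

        public static void main(String[] args) {
            PriorityQueue<Integer> heap = new PriorityQueue<>();
            for (int x : new int[] {7, 3, 9, 1, 5, 8, 2, 6, 4}) heap.add(x);
            System.out.println(kthSmallest(heap, 5)); // 5: the fifth-smallest of 1..9
        }
    }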

What is the time complexity of sorting all elements of ⌈log n⌉ sorted lists of ⌊n/log n⌋ elements each?

Suppose there are ⌈log n⌉ sorted lists of ⌊n/log n⌋ elements each. The time complexity of producing a single sorted list of all these elements is: (Hint: use a heap data structure)
A. O(n log log n)
B. Θ(n log n)
C. Ω(n log n)
D. Ω(n^(3/2))
My understanding:
There are ⌈log n⌉ lists, each containing ⌊n/log n⌋ elements, so we can apply the build-min-heap procedure to each list in O(n/log n) time. Now we have log n lists that each satisfy the min-heap property. I don't understand how to proceed from here and am really confused. Please help me visualize it.
[I assume we're sorting into increasing order]
Build a heap out of the smallest (i.e. first) element of each list (and for each, along with the value, keep a record of which list it came from and at which index). Repeatedly remove the smallest element of this heap, and then insert the next element from the list it came from (if that list hasn't already been consumed). This gives you the sorted list of all the elements.
This heap has ⌈log n⌉ elements, so the initial cost of building it is O(log n), and each remove and insert takes O(log log n) time. So overall, the cost of this sort is O(log n + n log log n) = O(n log log n).
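A sketch of that merge, assuming the input arrives as k already-sorted int arrays (k = ⌈log n⌉ in the question). Each heap entry records a value, its source list, and its index within that list, as the answer describes.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.PriorityQueue;

    class KWayMerge {
        // Heap entries are {value, listIndex, elementIndex}; the heap never
        // holds more than k entries, so every push and pop is O(log k).
        static List<Integer> mergeSorted(int[][] lists) {
            PriorityQueue<int[]> heap =
                new PriorityQueue<>((a, b) -> Integer.compare(a[0], b[0]));
            for (int i = 0; i < lists.length; i++) {
                if (lists[i].length > 0) heap.add(new int[] {lists[i][0], i, 0});
            }
            List<Integer> out = new ArrayList<>();
            while (!heap.isEmpty()) {
                int[] top = heap.poll();            // smallest of the k fronts
                out.add(top[0]);
                int list = top[1], next = top[2] + 1;
                if (next < lists[list].length) {    // refill from the same list
                    heap.add(new int[] {lists[list][next], list, next});
                }
            }
            return out; // n elements, each pushed and popped once: O(n log k)
        }

        public static void main(String[] args) {
            int[][] lists = { {1, 4, 7}, {2, 5, 8}, {3, 6, 9} };
            System.out.println(mergeSorted(lists)); // [1, 2, 3, 4, 5, 6, 7, 8, 9]
        }
    }

With k = ⌈log n⌉ lists, O(n log k) is exactly the O(n log log n) bound derived above.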

existence of a certain data structure

I'm wondering: can there exist a data structure meeting the following criteria and running times (it might be complicated)?
If we take an unsorted list L and build a data structure out of it like this:
Build(L, S) - in O(n) time, build the structure S from an unsorted list of n elements
Insert(y, S) - in O(lg n) time, insert y into the structure S
DEL-MIN(S) - in O(lg n) time, delete the minimal element from S
DEL-MAX(S) - in O(lg n) time, delete the maximal element from S
DEL-MED(S) - in O(lg n) time, delete the upper median (ceiling) element from S
The problem is that the list L is unsorted. Can such a data structure exist?
DEL-MIN and DEL-MAX are easy: keep both a min-heap and a max-heap of all the elements. The only trick is that you have to keep cross-indices between the heaps, so that when (for example) you remove the max, you can also find it and remove it in the min-heap.
For DEL-MED, you can keep a max-heap of the elements less than the median and a min-heap of the elements greater than or equal to the median. The full description is in this answer: Data structure to find median. Note that in that answer the floor-median is returned, but that's easily fixed. Again, you need the cross-indexing trick to refer to the other data structures, as in the first part. You will also need to think about how this handles repeated elements, if those are possible in your problem formulation. (If necessary, you can store repeated elements as (count, value) pairs in your heap, but this complicates rebalancing the heaps on insert/remove a little.)
Can this all be built in O(n)? Yes: you can find the median of n things in O(n) time (using the median-of-medians algorithm), and heaps can be built in O(n) time.
So overall, the data structure is four heaps (a min-heap of all the elements, a max-heap of all the elements, a max-heap of the floor(n/2) smallest elements, and a min-heap of the ceil(n/2) largest elements), all with cross-indices to each other.
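Here is a minimal sketch of just the two median heaps (Insert and DEL-MED), assuming distinct elements, with java.util.PriorityQueue standing in for the heaps. The full four-heap structure also needs the cross-indices, which PriorityQueue cannot provide efficiently (its remove(Object) is O(n)), so a real implementation would use hand-rolled indexed heaps.

    import java.util.Collections;
    import java.util.PriorityQueue;

    class MedianHeaps {
        // Invariant: 'lower' (a max-heap) holds the floor(n/2) smallest elements,
        // 'upper' (a min-heap) holds the rest, so upper.peek() is the upper median.
        private final PriorityQueue<Integer> lower = new PriorityQueue<>(Collections.reverseOrder());
        private final PriorityQueue<Integer> upper = new PriorityQueue<>();

        void insert(int x) {                  // O(log n)
            if (upper.isEmpty() || x >= upper.peek()) upper.add(x);
            else lower.add(x);
            rebalance();
        }

        int deleteMedian() {                  // O(log n): one poll plus a rebalance
            int median = upper.poll();
            rebalance();
            return median;
        }

        private void rebalance() {            // restore |upper| - |lower| in {0, 1}
            if (lower.size() > upper.size()) upper.add(lower.poll());
            else if (upper.size() > lower.size() + 1) lower.add(upper.poll());
        }

        public static void main(String[] args) {
            MedianHeaps m = new MedianHeaps();
            for (int x : new int[] {5, 1, 9, 3, 7, 4}) m.insert(x);
            System.out.println(m.deleteMedian()); // 5: upper median of {1,3,4,5,7,9}
        }
    }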

Incorrect Worst-case time complexity of Insertion Sort

Hi, I am new to algorithms and quite fascinated by them.
I am trying to figure out the worst-case time complexity of insertion sort, and it is stated everywhere as O(n^2). Instead, I think the time complexity can be O(n log n).
Here is my explanation:
Insertion sort looks at the 1st element and assumes it is sorted. Next it looks at the 2nd element and compares it with the predecessor sorted sublist of 1 element, inserting it based on comparisons with the elements in that sublist. This process then repeats for each remaining element.
Everywhere it is mentioned that inserting an element into the predecessor sorted sublist (basically a linear search) takes O(n) time, and since we do this operation for n elements, the total is O(n^2).
However, if we use binary search to insert the element into the predecessor sublist, it should take O(log n) time, where n is the length of the sublist: compare the new element with the middle element of the predecessor sorted sublist, and if it is greater than the middle element, then the new element lies between the middle element and the last element of the sublist.
As we repeat this operation for n items, it should take O(n log n). We can use the binary search approach because we know the predecessor sublist is sorted.
So shouldn't the worst-case time complexity be O(n log n) instead of O(n^2)?
Yes, you can find the insertion point in O(log n), but then you have to make space to insert the item. That takes O(n) time.
Consider this partially-sorted array:
[1,2,3,5,6,7,9,4]
You get to the last item, 4, and you do a binary search to locate the position where it needs to be inserted. But now you have to make space, which means moving items 9, 7, 6, and 5 down one place in the array. That's what makes insertion sort O(n^2).
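A sketch of binary insertion sort that makes the point visible: the binary search really does find the slot in O(log n) comparisons, but opening the slot still shifts up to O(n) items, so the worst case remains O(n^2).

    import java.util.Arrays;

    class BinaryInsertionSort {
        static void sort(int[] a) {
            for (int i = 1; i < a.length; i++) {
                int key = a[i];
                // Binary search for the insertion point in a[0..i-1]: O(log i)
                int lo = 0, hi = i;
                while (lo < hi) {
                    int mid = (lo + hi) >>> 1;
                    if (a[mid] <= key) lo = mid + 1;
                    else hi = mid;
                }
                // Shift to open the slot: O(i) moves, which dominates the search
                System.arraycopy(a, lo, a, lo + 1, i - lo);
                a[lo] = key;
            }
        }

        public static void main(String[] args) {
            int[] a = {1, 2, 3, 5, 6, 7, 9, 4}; // the partially-sorted example above
            sort(a);
            System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5, 6, 7, 9]
        }
    }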

Find nth Smallest element from Binary Search Tree

How do you find the nth smallest element in a binary search tree?
Constraints are:
Time complexity must be O(1).
No extra space should be used.
I have already tried two approaches:
Doing an inorder traversal and finding the nth element - time complexity O(n).
Maintaining, at each node, the number of elements smaller than it, and finding the element with m smaller elements - time complexity O(log n).
The only way I can think of is to change the data structure that holds the BST in memory. It should be simple if you consider every node as a structure itself (value, left_child, right_child): instead of storing the nodes in an unordered array, you can store them in an ordered array. Then the nth smallest element is simply the nth element of your array. The extra computation moves to insertion and deletion. Even so, it would be more effective to use, for example, a C++ set (O(log n) for both insertion and deletion).
It mainly depends on your use case.
If you do not use a data structure that supports positional indexing over the tree, I don't think you can do better than O(log n).
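For reference, here is a sketch of the asker's second approach (subtree-size counts), which is what positional indexing over the tree amounts to: each node caches the size of its subtree, so selecting the nth smallest follows a single root-to-target path, O(log n) on a balanced tree. The names here are illustrative, and the insert shown does no rebalancing.

    class OrderStatBST {
        static class Node {
            int key, size = 1; // size = number of nodes in this subtree
            Node left, right;
            Node(int key) { this.key = key; }
        }

        static Node insert(Node root, int key) { // plain BST insert, maintaining sizes
            if (root == null) return new Node(key);
            if (key < root.key) root.left = insert(root.left, key);
            else root.right = insert(root.right, key);
            root.size++;
            return root;
        }

        static int nthSmallest(Node root, int n) { // 1-based rank select, O(height)
            int leftSize = (root.left == null) ? 0 : root.left.size;
            if (n == leftSize + 1) return root.key;
            if (n <= leftSize) return nthSmallest(root.left, n);
            return nthSmallest(root.right, n - leftSize - 1);
        }

        public static void main(String[] args) {
            Node root = null;
            for (int x : new int[] {8, 3, 10, 1, 6, 14, 4, 7}) root = insert(root, x);
            System.out.println(nthSmallest(root, 3)); // 4 (sorted: 1,3,4,6,7,8,10,14)
        }
    }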
