Deletion complexity of last element in double linked list - data-structures

If you want to delete Node A, then you have to traverse only one node, and the complexity will be O(1).
If you want to delete Node C, then you have to traverse two nodes, and the complexity will be O(n).
If you want to delete Node D, then you have to traverse three nodes, and the complexity might be O(n).
However, the deletion complexity of the last node in a double linked list is O(1)
I don't understand this point. How does it work?
I checked this link but I did not find or understand the answer there.

The complexity isn't in removing the item, but locating it.
In a doubly-linked list, you typically have a pointer to the last element in the list so that you can append. So if somebody asks you to delete the last element, you can just remove it.
If somebody asks you to delete the kth element of the list, you have to start at the beginning and traverse k links to find the element before you can delete it. That's going to be O(k), which in the worst case is O(n).

The only case where deletion of the last node from a doubly linked list is O(1) is when you have direct access to that node, e.g. via a tail pointer. Otherwise you have to traverse the whole list, which takes O(n).
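The tail-pointer idea above can be sketched in Python (a minimal illustration, not any particular library's implementation; class and method names are made up for the example):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def append(self, value):
        node = Node(value)
        if self.tail is None:           # empty list
            self.head = self.tail = node
        else:
            node.prev = self.tail
            self.tail.next = node
            self.tail = node

    def delete_last(self):
        """O(1): the tail pointer gives direct access to the last node."""
        if self.tail is None:
            raise IndexError("delete from empty list")
        value = self.tail.value
        self.tail = self.tail.prev
        if self.tail is None:           # list became empty
            self.head = None
        else:
            self.tail.next = None
        return value
```

Because `delete_last` only touches `self.tail` and its `prev` link, its cost does not depend on the list length. Without the tail pointer, the same operation would first need an O(n) walk from the head.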

Related

Linked List v.s. Binary Search Tree Insertion Time Complexity

Linked List
A linked list’s insertion time complexity is O(1) for the actual operation, but requires O(n) time to traverse to the proper position. Most online resources list a linked list’s average insertion time as O(1):
https://stackoverflow.com/a/17410009/10426919
https://www.bigocheatsheet.com/
https://www.geeksforgeeks.org/time-complexities-of-different-data-structures/
BST
A binary search tree’s insertion requires the traversal of nodes, taking O(log n) time.
Problem
Am I mistaken to believe that insertion in a BST also takes O(1) time for the actual operation?
Similar to the nodes of a linked list, inserting a node into a BST simply points the current node’s pointer at the inserted node, and the inserted node points at the current node’s child node.
If my thinking is correct, why do most online resources list the average insert time for a BST to be O(log n), as opposed to O(1) like for a linked list?
It seems that for linked list, the actual insertion time is listed as the insertion time complexity, but for BST, the traversal time is listed as the insertion time complexity.
It reflects the usage. It's O(1) and O(log n) for the operations you'll actually request from them.
With a BST, you'll likely let it manage itself while you stay out of the implementation details. That is, you'll issue commands like tree.insert(value) or queries like tree.contains(value). And those things take O(log n).
With a linked list, you'll more likely manage it yourself, at least the positioning. You wouldn't issue commands like list.insert(value, index), unless the index is very small or you don't care about performance. You're more likely to issue commands like insertAfter(node, newNode) or insertBeginning(list, newNode), which only take O(1) time.

Note that I took these two from Wikipedia's Linked list operations > Singly linked lists section, which doesn't even have an operation for inserting at a certain position given as an index. Because in reality, you'll manage the "position" (in the form of a node) with the algorithm that uses the linked list, and the time to manage the position is attributed to that algorithm instead. That can, by the way, also be O(1). Examples:
You're building a linked list from an array. You'll do this by keeping a variable referencing the last node. To append the next value/node, insert it after that last node (an O(1) operation indeed), and update your variable to reference the new last node instead (also O(1)).
Imagine you don't find a position with a linear scan but with a hash map, storing references directly to linked list nodes. Then looking up the reference takes O(1) and inserting after the looked-up node also again only takes O(1) time.
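The hash-map idea in the second example can be sketched like this (a made-up `IndexedList` class for illustration; the point is that the dictionary lookup replaces the linear scan):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class IndexedList:
    """Singly linked list plus a hash map from value -> node, so
    'insert after the node holding x' needs no linear scan."""
    def __init__(self):
        self.head = None
        self.index = {}

    def push_front(self, value):                        # O(1)
        node = Node(value)
        node.next = self.head
        self.head = node
        self.index[value] = node

    def insert_after(self, existing_value, new_value):  # O(1) expected
        node = self.index[existing_value]               # O(1) hash lookup
        new_node = Node(new_value)
        new_node.next = node.next
        node.next = new_node
        self.index[new_value] = new_node

    def to_list(self):
        out, node = [], self.head
        while node:
            out.append(node.value)
            node = node.next
        return out
```

This is essentially the trick behind structures like an LRU cache: the map finds the node in O(1), and the linked list splices in O(1).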
If you had shown us some of those "most online resources" that list a linked list’s average insertion time as O(1), we'd likely see that they're indeed showing insertion operations like insertAfterNode, not insertAtIndex.

Edit, now that you included some links in the question, my thoughts on those sources regarding the O(1) insertion for linked lists: The first one does point out that it's O(1) only if you already have something like an "iterator to the location". The second one in turn refers to the same Wikipedia section I showed above, i.e., with insertions after a given node or at the beginning of a list. The third one is, well, the worst site about programming I know, so I'm not surprised they just say O(1) without any further information.
Put differently, as I like real-world analogies: If you ask me how much it costs to replace part X inside a car motor, I might say $200, even though the part only costs $5. Because I wouldn't do that myself. I'd let a mechanic do that, and I'd have to pay for their work. But if you ask me how much it costs to replace the bell on a bicycle, I might say $5 when the bell costs $5. Because I'd do the replacing myself.
A binary search tree is ordered, and it's typically balanced (to avoid O(n) worst-case search times), which means that when you insert a value some amount of shuffling has to be done to balance out the tree. That rebalancing takes an average of O(log n) operations, whereas a Linked List only needs to update a fixed number of pointers once you've found your place to insert an item between nodes.
To insert into a linked list, you just need to maintain the end node of the list (assuming you are inserting at the end).
To insert into a binary search tree (BST) and maintain the BST property after insertion, there is no way you can do that in O(1), since the tree might need to re-balance. This operation is not as simple as inserting into a linked list.
Check out some of the examples here.
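To make the contrast concrete, here is a minimal (unbalanced) BST insert in Python. It is a sketch, not a production tree: the pointer update at the end is O(1), but reaching it walks one node per level, so the total is O(height) — O(log n) when balanced, O(n) when skewed:

```python
class TreeNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def bst_insert(root, value):
    """Insert into a BST. The splice itself is a single pointer
    update, but finding the insertion point costs O(height)."""
    if root is None:
        return TreeNode(value)
    node = root
    while True:
        if value < node.value:
            if node.left is None:
                node.left = TreeNode(value)
                return root
            node = node.left
        else:
            if node.right is None:
                node.right = TreeNode(value)
                return root
            node = node.right
```

A self-balancing variant (AVL, red-black) would add rotations on the way back up, which is what keeps the height, and thus the insertion cost, at O(log n).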
The insertion time of a linked list actually depends on where you are inserting and the type of linked list.
For example consider the following cases:
You are using a singly linked list and you insert at the end or in the middle: you have a running time of O(n) to traverse the list to the end node or middle node.
You are using a doubly linked list (with two pointers: the first pointing to the head element and the second to the last element) and you insert in the middle: you still have O(n) time complexity, since you need to traverse to the middle of the list using either the first or the second pointer.
You are using a singly linked list and you insert at the first position of the list: this takes O(1), since you don't need to traverse any nodes at all. The same is true for a doubly linked list when inserting at the end of the list.
So you can see that in the worst-case scenario a linked list takes O(n) instead of O(1).
Now in the case of a BST, you get O(log n) time if your BST is balanced and not skewed. If your tree is skewed (where every element is greater than the previous one), you need to traverse all the nodes to find the insertion position. For example, consider the tree 1->2->4->6 and inserting node 9: you need to visit all the nodes to find the insertion position.
1
 \
  2
   \
    4
     \
      6  (last node, after which the new node will be inserted)
       \
        9  (position of the new node)
Therefore you can see that you need to visit all the nodes to find the proper place; if you have n nodes, you get O(n+1) => O(n) running time complexity.
But if your BST is balanced and not skewed, the situation changes dramatically, since every comparison eliminates the nodes that cannot contain the insertion position.
PS: Working out exactly which nodes get eliminated is left as homework!
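The skewed case above can be demonstrated by counting the nodes visited during an insert (an illustrative sketch; `insert_counting` is a made-up helper that wraps a plain BST insert):

```python
class TreeNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert_counting(root, value):
    """Plain BST insert that also counts the existing nodes visited
    on the way down, illustrating the cost of a skewed tree."""
    visited = 0
    if root is None:
        return TreeNode(value), visited
    node = root
    while True:
        visited += 1
        if value < node.value:
            if node.left is None:
                node.left = TreeNode(value)
                return root, visited
            node = node.left
        else:
            if node.right is None:
                node.right = TreeNode(value)
                return root, visited
            node = node.right

# Skewed tree: inserting the ascending values 1, 2, 4, 6 produces a
# right-leaning chain, so inserting 9 visits every existing node.
root = None
for v in [1, 2, 4, 6]:
    root, _ = insert_counting(root, v)
root, visited = insert_counting(root, 9)
```

Here `visited` equals the number of nodes already in the tree, i.e. the O(n) behavior described above; in a balanced tree of the same size it would be on the order of log n.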

Data Structure: Best and worst running time of an algorithm

I'm confused about which option is correct and why. So here are the questions:
What is the best-case running time of removing a node from the last position in a singly linked list?
(a) O(1)
(b) Ω(n)
(c) O(log n)
(d) Θ(n²)
What I think:
I think the solution is (b), because I know that when you remove the last element of a linked list it's O(n), since you have to traverse through all the elements of the linked list.
What is the worst case running time of pushing an element onto a stack implemented in a doubly-linked list?
(a) O(1)
(b) Θ(n)
(c) O(n log n)
(d) Ω(n)
What I think:
I think the solution is (d), because the big O of inserting an element into the linked list is O(n), where n is the number of elements you want to insert.
I'm really confused with this topic, so if someone can correct my solutions and explain why their solution is correct, I would appreciate it. Thanks.
For the first one you are right: in order to delete the last element you need to traverse the list and modify the pointer there, so it is Ω(n).
For the second one, however, it is O(1). A stack is LIFO. You have a pointer to the head of the stack. All you need to do is create a node for the new element, make its next the current head, and set the head to the newly created node. Therefore the number of operations is not a function of the size of the stack. It can be done in constant time, which is O(1).
EDIT: The above answer assumes the data structure is not implemented with some array implementation which might again be O(n) due to resizing.
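The constant-time push described in the answer can be sketched in a few lines of Python (a minimal linked-list stack for illustration, not a specific library API):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class Stack:
    """Stack backed by a linked list: push and pop only touch the
    head node, so both are O(1) regardless of stack size."""
    def __init__(self):
        self.head = None

    def push(self, value):
        node = Node(value)
        node.next = self.head   # new node points at the old head
        self.head = node        # head now points at the new node

    def pop(self):
        if self.head is None:
            raise IndexError("pop from empty stack")
        value = self.head.value
        self.head = self.head.next
        return value
```

Neither method ever follows more than one link, which is why the cost is independent of how many elements the stack holds, in contrast to an array-backed stack that occasionally pays O(n) to resize.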

Time complexity of deletion in a linked list

I'm having a bit of trouble understanding why the time complexity of deletion in linked lists is O(1) according to this website. From what I understand, if you want to delete an element, surely you must traverse the list to find out where the element is located (if it even exists at all)? Shouldn't it be O(n), or am I missing something completely?
No, you are not missing something.
If you want to delete a specific element, the time complexity is O(n) (where n is the number of elements) because you have to find the element first.
If you want to delete an element at a specific index i, the time complexity is O(i) because you have to follow the links from the beginning.
The time complexity of insertion is only O(1) if you already have a reference to the node you want to insert after. The time complexity for removal is only O(1) for a doubly-linked list if you already have a reference to the node you want to remove. Removal for a singly-linked list is only O(1) if you already have references to the node you want to remove and the one before. All this is in contrast to an array-based list where insertions and removal are O(n) because you have to shift elements along.
The advantage of using a linked list rather than a list based on an array is that you can efficiently insert or remove elements while iterating over it. This means for example that filtering a linked list is more efficient than filtering a list based on an array.
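The filtering advantage mentioned above can be sketched for a singly linked list: each removal is an O(1) pointer update because the iteration already holds a reference to the previous node (illustrative code; `remove_if` and the dummy-head trick are just one common way to write this):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def remove_if(head, predicate):
    """Filter a singly linked list in one O(n) pass. Each individual
    removal is an O(1) unlink, since we already hold the previous node."""
    dummy = Node(None, head)    # dummy head simplifies removing the first node
    prev, node = dummy, head
    while node:
        if predicate(node.value):
            prev.next = node.next   # unlink in O(1)
        else:
            prev = node
        node = node.next
    return dummy.next
```

With an array-based list, each removal would shift all later elements, making the same filter O(n²) in the worst case unless it is rewritten as a compacting pass.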
You are correct.
Deletion:
1. If a pointer to the node is given, the time complexity is O(1).
2. If you DON'T have a pointer to the node to be deleted (search and delete), the time complexity is O(n).

Data Structure: Big O time cost

1) The time cost to add n elements to an initially empty singly linked list by
inserting at the front of the list.
The answer seems to be either O(n) or O(1).
I think it is O(1) because inserting an element into an empty list is just, for example, Node element = 1;
But I'm still not sure about this.
2)What would be the best-case time cost to find a data element in a linked list with n elements.
The answer also seems to be either O(1) or O(n).
I think it's O(n) because it has to traverse through the list to find the element.
The time cost to add n elements to an initially empty singly linked list by
inserting at the front of the list.
It is O(1) per insertion, but you have n of those, so O(n) in total.
the best-case time cost to find a data element in a linked list with
n elements
It is O(1), because in the best case the searched element is the first one, so there is no need to traverse the list; after checking the first element (which takes constant time) you can halt.
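The "n push-fronts cost O(n) in total" point can be sketched directly (a minimal illustration; note that building this way leaves the values in reverse order):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def build_by_push_front(values):
    """Each push-front is a constant number of pointer updates (O(1)),
    so building the list from n values costs O(n) in total."""
    head = None
    for v in values:
        head = Node(v, head)    # O(1): one allocation, one link
    return head

def to_list(head):
    out = []
    while head:
        out.append(head.value)
        head = head.next
    return out
```

If the original order matters, you either reverse the result afterwards (another O(n) pass) or keep a tail pointer and append instead, both of which preserve the O(n) total.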

Time complexity for Insertion and deletion of elements from an ordered list

Is the time complexity for both operations equal to O(log n)?
Remember: the list is ordered, always ordered, and not doubly linked.
Both insertion and deletion in an ordered linked list are O(n), since you first need to find the position in question (for deletion, the relevant node; for insertion, its correct location), which is O(n) even though the list is ordered, because you have to reach that place by iterating from the head.
An efficient special type of list that allows fast insertion, deletion and lookup is called a skip list; it uses extra nodes to jump quickly between non-adjacent nodes.
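The O(n)-find / O(1)-splice split for an ordered singly linked list can be sketched like this (an illustrative helper, not any particular library's API):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def sorted_insert(head, value):
    """Insert into an ordered singly linked list: the walk to find
    the spot is the O(n) part; splicing the node in is O(1)."""
    if head is None or value <= head.value:
        return Node(value, head)        # new head
    node = head
    while node.next is not None and node.next.value < value:
        node = node.next                # O(n) scan from the head
    node.next = Node(value, node.next)  # O(1) splice
    return head
```

This is exactly why the list being ordered does not help here: without random access there is no way to binary-search it, which is the gap a skip list's extra levels are designed to close.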
