Asymptotic running time of insertion and searching in an AVL tree - algorithm

I am learning about AVL trees. AVL trees are binary search trees that balance themselves through rotations. Because they are balanced, the query time is O(log n). But it seems the order in which the entries are added is also important, to avoid a worst case of O(log n) rotations per insertion.
What is the asymptotic running time of:
(a) adding n entries with consecutive keys to an AVL tree (the time to insert all of them, not each one);
(b) searching for a key that is not in the tree.
What I understand is this: the height is O(log N), so insertion into an AVL tree has a worst case of O(log N). Searching an AVL tree is completely unchanged from an ordinary BST, and so also takes time proportional to the height of the tree, i.e. O(log N).
Is this correct?

Insertion in an AVL tree requires at most one rotation, not O(log n) rotations (two if you count the rotations of a double rotation individually). Asymptotically, the order of insertion does not matter, since a rotation takes constant time.
a) With n insertions, the cost = n × (cost of finding the proper place to insert + actual creation and insertion of the node + rotation if needed) = n × (O(log n) + O(1) + O(1)) = O(n log n).
b) Searching for an element is O(log n), since the tree is balanced.
c) Deleting a single element requires at most O(log n) rotations, so the complexity of deletion is also O(log n).
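The point about insertions can be checked empirically. Below is a minimal AVL insertion sketch (all class and helper names are my own, not from the question) that counts rotations while inserting n consecutive keys, which would degenerate a plain BST into a list. Each insertion triggers at most one rebalancing step, so the rotation total stays linear and the height stays logarithmic.

```python
class AVLNode:
    def __init__(self, key):
        self.key, self.left, self.right, self.height = key, None, None, 1

def h(node):
    return node.height if node else 0

def update(node):
    node.height = 1 + max(h(node.left), h(node.right))

def balance(node):
    return h(node.left) - h(node.right)

rotations = 0  # counts individual rotations (a double rotation counts twice)

def rotate_right(y):
    global rotations
    rotations += 1
    x = y.left
    y.left, x.right = x.right, y
    update(y)
    update(x)
    return x

def rotate_left(x):
    global rotations
    rotations += 1
    y = x.right
    x.right, y.left = y.left, x
    update(x)
    update(y)
    return y

def insert(node, key):
    if node is None:
        return AVLNode(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)
    update(node)
    b = balance(node)
    if b > 1:                          # left-heavy
        if balance(node.left) < 0:     # left-right case needs a double rotation
            node.left = rotate_left(node.left)
        return rotate_right(node)
    if b < -1:                         # right-heavy
        if balance(node.right) > 0:    # right-left case needs a double rotation
            node.right = rotate_right(node.right)
        return rotate_left(node)
    return node

root = None
n = 1000
for key in range(n):          # consecutive keys: worst case for a plain BST
    root = insert(root, key)

print(h(root))       # O(log n): far below n
print(rotations)     # at most one rebalancing per insertion
```

Note that even with maximally adversarial (sorted) input, the tree height stays below 1.44·log2(n) and the rotation count stays below 2n, consistent with the O(n log n) total insertion cost being dominated by the searches, not the rotations.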

Related

Tree Sort Performance

I have an AVL tree implementation where the insertion method runs in O(log n) time and the method that returns an in-order list representation runs in O(n^2) time. If I have a list that needs to be sorted, I can use a for-loop to iterate through the list and insert each element into the AVL tree, which takes O(n log n) time in total. So what is the performance of this entire sorting algorithm (i.e. iterate through the list, insert each element, then use the in-order traversal to return a sorted list)?
You correctly say that adding n elements to the tree takes O(n log n) time. A simple in-order traversal of a BST can be performed in O(n) time, so it is possible to get a sorted list of the elements in O(n log n + n) = O(n log n) time. If the time complexity of your algorithm for generating the sorted list from the tree is quadratic (i.e. in O(n^2) but not always in O(n)), then the worst-case time complexity of the procedure you describe is O(n log n + n^2) = O(n^2), which is not optimal.

AVL tree worst case number of rotations during insertion and deletion

In an AVL tree, what is the worst-case number of rotations during insertion and deletion of n elements?
I think for insertion it should be O(n) and for deletion it should be O(n log n). However, I am not so sure about deletion.
Am I correct?
For deletion of a node x, there are cases that require rotations on all nodes from x up to the root; since the height of a tree with n nodes is O(log n), a single deletion takes O(log n) rotations in the worst case, and n deletions take O(n log n). Insertion is cheaper: a single rotation (or one double rotation) always restores the balance, so a single insertion takes O(1) rotations and n insertions take O(n) rotations in total.

Is building a BST with N given elements O(n lg n)?

What would be the worst-case time complexity of building a binary search tree with N arbitrary given elements?
I think there is a difference between being given all N elements up front and having the elements arrive one by one, building up a BST of N elements incrementally.
In the former case it is O(n log n), and in the latter it is O(n^2). Am I right?
If the binary search tree (BST) is not self-balancing, the worst-case time complexity is O(n^2). Generally a BST is built by repeated insertion, so the worst case is O(n^2). But if you can sort the input first (in O(n log n)), the tree can be built in O(n), for an overall complexity of O(n log n).
If the BST is self-balancing, then the worst-case time complexity is O(n log n) even with repeated insertion.
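The O(n) build from sorted input mentioned above works by taking the middle element as the root and recursing on each half. A small sketch (my own helper names; indices are passed instead of list slices so the build really is linear):

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def build_balanced(keys, lo=0, hi=None):
    # keys must already be sorted; each element is touched once -> O(n).
    if hi is None:
        hi = len(keys)
    if lo >= hi:
        return None
    mid = (lo + hi) // 2
    root = Node(keys[mid])              # middle element becomes the root
    root.left = build_balanced(keys, lo, mid)
    root.right = build_balanced(keys, mid + 1, hi)
    return root

def height(node):
    return 0 if node is None else 1 + max(height(node.left), height(node.right))

root = build_balanced(list(range(1023)))   # 2^10 - 1 sorted keys
print(height(root))                        # 10: a perfectly balanced tree
```

The same 1023 keys inserted one by one in sorted order into a plain BST would produce a path of height 1023, which is exactly the O(n^2) degenerate case.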

Time complexity to find an element if it is already present in a binary heap?

What will be the time complexity of finding an element if it is already in a binary heap?
I think traversal operations are not possible in heap trees!
In the worst case, one would end up traversing the entire binary heap to search for an element, so the time complexity is O(N), i.e. linear in the number of elements in the heap.
Binary heaps are not meant to be used as a search data structure. They are used for implementing priority queues, and they handle all of the frequently used operations in better than linear time:
insert: O(log n)
delete: O(log n)
increase_key: O(log n)
decrease_key: O(log n)
extract_max or extract_min: O(log n)
If you want the search operation to always be O(log n), use balanced search trees like AVL or red-black trees.
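This is easy to see with Python's heapq module, which maintains a binary min-heap on a plain list: push and pop are O(log n), but the heap order says nothing about where an arbitrary key lives below the root, so membership testing has no better option than a linear scan (`heap_contains` below is my own illustrative helper, not part of heapq).

```python
import heapq

heap = []
for key in [7, 2, 9, 4, 1]:
    heapq.heappush(heap, key)        # O(log n) per insert

def heap_contains(heap, key):
    # No heap-order shortcut exists for arbitrary keys: worst case O(n).
    return any(x == key for x in heap)

print(heap_contains(heap, 9))        # True
print(heap_contains(heap, 3))        # False
print(heapq.heappop(heap))           # 1: extract_min is O(log n)
```

In contrast, a balanced search tree can answer the same membership query in O(log n) by following key comparisons down a single root-to-leaf path.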

AVL tree rotation and red-black tree color flips

As we all know, insertion and deletion both require O(log n).
An AVL tree requires O(log n), because it needs O(log n) to insert and O(log n) for the rotations that restore balance.
A red-black tree also requires O(log n): it needs O(log n) to insert, and in Introduction to Algorithms (third edition), RB-INSERT-FIXUP needs O(log n) for case 1 (color flips) and at most 2 rotations.
So it seems that an AVL tree requires 2·O(log n), but an RB tree requires 2·O(log n) + C.
Why do we consider RB trees faster than AVL trees for insertion? Just because a rotation needs more time than a color flip? A rotation and a color flip both take O(1), so why is a rotation more time-consuming than a color flip?
Thanks! :)
If I understand your question correctly: yes, it is true that RB trees and AVL trees both offer lookup, insertion, and deletion in O(log n) time.
AVL trees are more rigidly balanced than RB trees. Maintaining this stricter balance requires more rotations, which are time-consuming. RB trees are allowed to be slightly unbalanced, as they have weaker balancing rules, so they need fewer operations for insertion and deletion. As a consequence, lookup in AVL trees is faster than in RB trees, but insertion and deletion are faster in RB trees.
EDIT
Please read this blog post. The point is that RB trees rebalance faster than AVL trees after insertion. Yes, a rotation takes O(1) time in an AVL tree and at most two rotations are needed, but the point at which to rotate still has to be found, so rebalancing after insertion takes O(log n) time overall. In RB trees, by contrast, rebalancing after insertion runs in amortized constant time: O(1) amortized for the color flips, not O(log n).
