What is the index of a child node in the array representation of a heap?
In my lecture notes and in this article
it is given as 2k and/or 2k+1
But in an array, indexes start from 0, not 1, right?
Therefore shouldn't the children of node k be 2k+1 and/or 2k+2?
Going by the general convention that array indices start from 0, the root is assigned index 0. In that case the children of node k are at 2k+1 and 2k+2.
But even in the article they have clearly mentioned that
The root of the tree is A[1], and given the index i of a node, the indices of
its parent, left child and right child can be computed as
PARENT (i)
return floor(i/2)
LEFT (i)
return 2i
RIGHT (i)
return 2i + 1
Hence, going as per the article, it should be 2k and 2k+1. Had the root been considered as index 0 of the array, the indices of the child nodes would be 2k+1 and 2k+2.
If the indices start from 0 it is 2k+1 and/or 2k+2.
If they start from 1, it is 2k and 2k + 1.
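To make the two conventions concrete, here is a small illustrative sketch (not from the article) of the index arithmetic in Python for both layouts:

# 0-based layout: root at index 0
def left_0(k): return 2 * k + 1
def right_0(k): return 2 * k + 2
def parent_0(k): return (k - 1) // 2

# 1-based layout: root at index 1, slot 0 unused
def left_1(k): return 2 * k
def right_1(k): return 2 * k + 1
def parent_1(k): return k // 2

print(left_0(0), right_0(0))   # children of the root: 1 2
print(left_1(1), right_1(1))   # children of the root: 2 3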
Each insert into a Python list is O(n), so for the below snippet of code, is the worst-case time complexity O(n + 2k) or O(nk)? Here k is the number of elements we move during an insert.
def bfs_binary_tree(root):
    queue = [root]                     # list used as a FIFO queue
    result = []
    while queue:
        node = queue.pop()             # take the oldest node from the back
        result.append(node.val)
        if node.left:
            queue.insert(0, node.left)     # shifts len(queue) elements
        if node.right:
            queue.insert(0, node.right)    # shifts len(queue) elements
    return result
I am using a list as a FIFO queue, but inserting an element at the start of the list has O(k) complexity, so I am trying to figure out the total complexity for n elements in the queue.
Since each node ends up in the queue at most once, the outer loop will execute n times (where n is the number of nodes in the tree).
Up to two inserts are performed during each iteration of the loop, and each insert requires about size_of_queue + 1 steps.
So we have n steps and size_of_queue steps as the two variables of interest.
The question is: the size of the queue changes, so what is the overall runtime complexity?
Well, the size of the queue grows until it is full of leaf nodes; the number of leaf nodes is therefore an upper bound on the size of the queue, and the queue will never be larger than that.
Therefore, we know that the algorithm will never take more than n * leaf nodes steps. This is our upper bound.
So let's find out what the relationship between n and leaf_nodes is.
Note: I am assuming a balanced complete binary tree
The number of nodes at any level of a balanced binary tree with a height of at least 1 (the root node) is: 2^level. The max level of a tree is called its depth.
For example, a tree with a root and two children has 2 levels (0 and 1) and therefore has a depth of 1 and a height of 2.
The total number of nodes in the tree is 2^(depth+1) - 1 (the sum 2^0 + 2^1 + ... + 2^depth).
n=2^(depth+1)-1
We can also use this relationship to identify the depth of the balanced binary tree, given the total number of nodes:
If n=2^(depth+1) - 1
n + 1 = 2^(depth+1)
log(n+1) = depth + 1 = the number of levels, including the root (logs here are base 2). Subtract 1 to get the depth, i.e., the max level: in a balanced tree with 4 levels, level 3 is the max level because the root is level 0.
What do we have so far
number_of_nodes = 2^(depth+1) - 1
depth = log(number_of_nodes)
number_of_nodes_at_level_k = 2^k
What we need
A way to derive the number of leaf nodes.
Since the depth == last_level and since the number_of_nodes_at_level_k = 2^k, it follows that the number of nodes at the last level (the leaf nodes) = 2^depth
So: leaf_nodes = 2^depth
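As a quick check of these relationships (an illustrative sketch of my own, assuming a perfect binary tree), here is how to recover the depth and the leaf count from n:

import math

def depth_and_leaves(n):
    # assumes a perfect binary tree, so n = 2^(depth+1) - 1
    depth = int(math.log2(n + 1)) - 1
    leaves = 2 ** depth
    return depth, leaves

print(depth_and_leaves(7))    # (2, 4)
print(depth_and_leaves(15))   # (3, 8)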
Your runtime complexity is n * leaf_nodes = n * 2^depth = n * 2^(log n) = n * n = n^2.
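To see the quadratic behaviour empirically, here is a small sketch (a hypothetical helper, not from the question) that counts how many element shifts the insert(0, ...) calls cause when BFS-ing a complete binary tree stored implicitly by 0-based index; the totals roughly quadruple each time n doubles:

def count_shifts(n):
    # BFS over the implicit complete binary tree with nodes 0..n-1,
    # counting how many elements each insert(0, ...) has to shift.
    queue = [0]
    shifts = 0
    while queue:
        node = queue.pop()             # pop from the back, insert at the front: FIFO
        for child in (2 * node + 1, 2 * node + 2):
            if child < n:
                shifts += len(queue)   # cost of shifting before this insert
                queue.insert(0, child)
    return shifts

for n in (15, 31, 63, 127):            # complete trees of depth 3..6
    print(n, count_shifts(n))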
In the heap data structure, do the two equations
left = 2i+1
right = 2i+2
apply to any given heap? If not, under what conditions do they apply?
According to D.S. Malik "A heap is a list in which each element contains a key, such that the key in the element at position k in the list is at least as large as the key in the element at position 2k + 1 (if it exists) and 2k + 2 (if it exists)."
You should also note the element's position in the list: "in general, for the node k, which is the (k-1)th element of the list, its left child is the 2kth (if it exists) element of the list, which is at position 2k-1 in the list, and the right child is the (2k+1)st (if it exists) element of the list, which is at position 2k in the list."
When studying I found it helpful to actually input some indexes and make sure you get the element you were hoping. Best of luck.
It depends on where the root is. If the root is at index 0 in the array, then your equations are correct. The left node is at index 2i + 1 and the right node is at index 2i + 2.
Many examples have the root at index 1 in the array. In that case, the left child is at index 2i, and the right node is at index 2i + 1.
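For example, Python's heapq module keeps the root at index 0, so the 0-based formulas apply; here is a quick check (illustrative only):

import heapq

data = [9, 4, 7, 1, 0, 8, 5, 2]
heapq.heapify(data)                 # heapq keeps the root at index 0

# verify the min-heap property using the 0-based formulas 2i+1 / 2i+2
n = len(data)
for i in range(n):
    left, right = 2 * i + 1, 2 * i + 2
    assert left >= n or data[i] <= data[left]
    assert right >= n or data[i] <= data[right]
print("heap property holds with 0-based child indices")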
In a B-tree with minimum degree t, each non-leaf node other than the root has at least t children and at most 2t children. Suppose that the keys {1, 2, 3, ..., n} are inserted into an empty B-tree with minimum degree 2 in the sequence 1, 2, 3, ..., n. How many nodes does the final B-tree have?
From what I have understood, I feel it would be n/t, since the minimum number of keys each node can have is k, and the total number of keys is n. Am I correct? If not, tell me where I am going wrong and how I should do this.
The answer is (n-2)*log(n-2) with t=2
We know that every node except the root must have at least t−1 = 1 key and at most 2t−1 = 3 keys. The final tree can have at most n−1 nodes when n ≥ 2; unless n = 1 there can never be n nodes, since we only ever insert a key into a non-empty node, so there will always be at least one node with 2 keys.
Next, observe that we will never have more than one key in a node which is not on the right spine of the B-tree. This is because every key we insert is larger than all keys already stored in the tree, so it is always inserted into the right spine.
The fewest possible number of nodes occurs when every node on the right spine except the deepest has 2 keys, the deepest node of the right spine has 3 keys, and every other node has 1 key. So at height 1 there is 1 node, at height 2 there are 3 nodes, ..., at height h there are 2^h − 1 nodes. In this case n = sum_{i=1}^{h} 2^i + 1 = 2^(h+1) − 1, where h is the height of the B-tree, and the number of nodes is #nodes = sum_{i=1}^{h} (2^i − 1) = 2^(h+1) − 2 − h = n − lg(n+1).
So for any n ≥ 2, the final B-tree must have n − ⌊lg(n+1)⌋ ≤ #nodes ≤ n − 1.
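As a small illustration (my own sketch, just evaluating the bounds derived above for t = 2):

import math

def node_count_bounds(n):
    # bounds from the answer above: n - floor(lg(n+1)) <= #nodes <= n - 1
    lower = n - math.floor(math.log2(n + 1))
    upper = n - 1
    return lower, upper

for n in (7, 15, 100):
    print(n, node_count_bounds(n))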
I stumbled upon the problem of finding the number of distinct elements to the left and less than the element for each position in array.
Example:
For the array 1 1 2 4 5 3 6 the answer would be 0 0 1 2 3 2 5
It's straightforward to solve the problem in O(n^2); I wish to know if the problem can be solved in O(n*lg(n)).
Yes, you can just insert the elements into a balanced (red-black, AVL, whatever) binary search tree, storing the total subtree node count in each node. Updates are O(log N), as you only update along the path to the root, and checking the number of distinct elements is also O(log N), as it requires summing the node counts of the left subtrees on the path from the new element to the root.
This is how a tree might look after inserting [0,1,2,3,5,6], the subtree nodecounts in parentheses.
      2(6)
     /    \
  1(2)    5(3)
  /       /  \
0(1)   3(1)  6(1)
While inserting 6 (assuming it's last), you add:
2 (node count of left subtree of 2)
1 (the node with 2, because you take the right path, so root is smaller)
1 (the left subtree of 5)
1 (the node with 5, same reason, no left subtree to add)
Total 5. The tree is a bit too small to see the savings from keeping the totals, but note that you don't need to visit the 0 node, it's accounted for in its parent - the 1 node.
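If you don't want to hand-roll an augmented balanced BST, the same prefix-count idea can be sketched with a Fenwick (binary indexed) tree over coordinate-compressed values. This is a different data structure from the one described above, but it gives the same O(n log n) behaviour; the names below are just for illustration:

def distinct_smaller_to_left(arr):
    # rank each distinct value 1..m (coordinate compression)
    ranks = {v: i + 1 for i, v in enumerate(sorted(set(arr)))}
    m = len(ranks)
    bit = [0] * (m + 1)            # Fenwick tree over value ranks

    def update(i):                 # mark rank i as seen
        while i <= m:
            bit[i] += 1
            i += i & -i

    def query(i):                  # how many distinct ranks <= i seen so far
        s = 0
        while i > 0:
            s += bit[i]
            i -= i & -i
        return s

    seen = set()
    out = []
    for v in arr:
        r = ranks[v]
        out.append(query(r - 1))   # distinct values strictly smaller, seen to the left
        if v not in seen:          # count each distinct value only once
            seen.add(v)
            update(r)
    return out

print(distinct_smaller_to_left([1, 1, 2, 4, 5, 3, 6]))  # [0, 0, 1, 2, 3, 2, 5]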
I found this in book:
Design a data structure for maintaining dynamic sequence x_1, x_2,... , x_n
which provides operations:
Insert(a,i) - inserts a as x_i, indexes from i+1 to n go up by 1
Delete(i) - deletes x_i, indexes from i+1 to n go down by 1
Find(i) - returns element x_i
Sum_even() - returns sum of the elements with even indexes
The sequence is dynamic, so I want to use an AVL tree to store it. I think I can use the standard trick for Find, Delete and Insert: keep in each node the size of the subtree rooted at that node. Then I can easily navigate the tree to find the ith element in O(log n) time. An in-order traversal then gives me x_1, x_2, ..., x_n.
But for Sum_even() I should probably also store, in each node, the sums of the elements at even and odd indexes. However, it's difficult to update these while inserting or deleting. How should it work? Can anyone help?
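For concreteness, the "standard trick" mentioned above might look like this (an illustrative sketch only; it assumes each node stores a size field for its subtree):

def find_ith(node, i):
    # Return the i-th element (1-based) of the subtree rooted at node,
    # using the stored subtree sizes to steer left or right.
    left_size = node.left.size if node.left else 0
    if i <= left_size:
        return find_ith(node.left, i)
    if i == left_size + 1:
        return node.value
    return find_ith(node.right, i - left_size - 1)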
You don't have to store the size of the subtree in each node; an AVL tree stores only the difference in heights of the left and right subtrees: -1, 0 or +1.
In order to maintain sum_even easily, you need to store sum_even and sum_odd in each node: the sums of the values at even/odd indexes within that node's subtree.
So each node will have the variables:
difference - the difference in height of the left and right subtrees
value - the element stored in the node
parity - the parity of the subtree size (0 or 1)
sum_even - sum of the subtree's values at even (local) indexes
sum_odd - sum of the subtree's values at odd (local) indexes
For inserts and deletes, use the standard algorithm: http://en.wikipedia.org/wiki/AVL_tree.
But for each affected node (the nodes on the way up to the root, and any rotated nodes), update the values:
parity := (left.parity + right.parity + 1) % 2
if left.parity == 0
    sum_even := left.sum_even + right.sum_odd
    sum_odd := left.sum_odd + right.sum_even + value
else
    sum_even := left.sum_even + right.sum_even + value
    sum_odd := left.sum_odd + right.sum_odd
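A minimal Python sketch of the per-node update described above (the names and structure are my own, assuming each node keeps links to its children plus the fields listed):

class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None
        self.parity = 1            # subtree size % 2; a single node has size 1
        self.sum_even = 0
        self.sum_odd = value       # a single node sits at local index 1 (odd)

def recompute(node):
    # Recompute parity / sum_even / sum_odd from the children.
    # Call this bottom-up on every node touched by an insert, a delete
    # or a rotation (i.e. along the path back to the root).
    lp = node.left.parity if node.left else 0
    le = node.left.sum_even if node.left else 0
    lo = node.left.sum_odd if node.left else 0
    rp = node.right.parity if node.right else 0
    re = node.right.sum_even if node.right else 0
    ro = node.right.sum_odd if node.right else 0

    node.parity = (lp + rp + 1) % 2
    if lp == 0:
        node.sum_even = le + ro               # the node itself lands on an odd local index
        node.sum_odd = lo + re + node.value
    else:
        node.sum_even = le + re + node.value  # the node itself lands on an even local index
        node.sum_odd = lo + ro
    return node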