Traversing complete binary tree to find min leaf sum - algorithm

I am trying to find the minimum path sum from the root node to any of the leaf nodes in a complete binary tree. I have the tree's level-order traversal in an array, and I have come up with the following recursion:
ans = tree[i] + min(rec_fn(2i+1), rec_fn(2i+2))
I have computed the time complexity of this recursive solution to be O(n), but I am not sure whether that is correct. Since I am traversing each node exactly once, I think the complexity should be O(n).
Also, if I store the results of the recursion in an array and use dynamic programming, will it help with the time complexity? Will it bring the complexity down further?
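For reference, the recursion above can be sketched like this in Python (a hypothetical `min_leaf_path_sum` helper, assuming the array holds the level-order traversal of a complete binary tree):

```python
def min_leaf_path_sum(tree, i=0):
    """Minimum root-to-leaf sum in a complete binary tree stored
    in level order: children of index i sit at 2i+1 and 2i+2."""
    n = len(tree)
    left, right = 2 * i + 1, 2 * i + 2
    if left >= n:                     # no children: i is a leaf
        return tree[i]
    if right >= n:                    # only a left child exists
        return tree[i] + min_leaf_path_sum(tree, left)
    return tree[i] + min(min_leaf_path_sum(tree, left),
                         min_leaf_path_sum(tree, right))
```

Each index is visited exactly once, so the running time is O(n). Memoization cannot lower this: in a tree the subproblems never overlap (each index is reached from exactly one parent), and every node must be read at least once anyway, so dynamic programming would add O(n) space without saving any time.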

Related

Time Efficiency of Binary Search Tree

For the time efficiency of inserting into a binary search tree,
I know that the best/average case of insertion is O(log n), whereas the worst case is O(n).
What I'm wondering is whether there is any way to ensure that we will always have the best/average case when inserting, besides implementing an AVL tree (balanced BST)?
Thanks!
There is no guaranteed log n complexity without balancing a binary search tree. While searching/inserting/deleting, you have to navigate through the tree in order to position yourself at the right place and perform the operation. The key question is: how many steps are needed to get to the right position? If the BST is balanced, you can expect about 2^(i-1) nodes at level i. This further means that if the tree has k levels (k is called the height of the tree), the expected number of nodes in the tree is 1 + 2 + 4 + ... + 2^(k-1) = 2^k - 1 = n, which gives k = log n, and that is the number of steps needed to navigate from the root to a leaf.
Having said that, there are various implementations of balanced BSTs. You mentioned AVL; another very popular one is the red-black tree, which is used e.g. in C++ for implementing std::map or in Java for implementing TreeMap.
The worst case, O(n), can happen when you don't balance the BST and your tree degenerates into a linked list. It is clear that in order to reach the end of the list (which is the worst case), you have to iterate through the whole list, and this requires n steps.
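To see the degenerate case concretely, here is a minimal sketch (a plain, unbalanced BST with hypothetical `insert`/`height` helpers) showing how sorted insertions produce a list-shaped tree:

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Plain, unbalanced BST insertion."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(root):
    if root is None:
        return 0
    return 1 + max(height(root.left), height(root.right))

root = None
for k in range(1, 17):        # sorted keys: the worst case
    root = insert(root, k)
# the tree degenerates into a right-leaning linked list of height 16
```

Inserting the same 16 keys in a balanced order would give height 4 instead, which is the log n behavior a self-balancing tree guarantees.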

Time complexity of tree search with three children at each node

In the case of a binary search tree, the time complexity of finding an element is O(log n), since on each iteration the search space is halved. But suppose there is a tree with a maximum of three children under each node, and we have some condition that tells us which of the three branches to search next; what will the time complexity be then? In this case, the search space is reduced to one third each time.
Read the answer here and understand why for a binary tree the recurrence relation is
T(n) = T(n/2) + O(1)
In the case where each node has 3 children, or more generally k children, the same relation becomes
T(n) = T(n/k) + O(1)
Follow along that answer and you will see that for any k-ary search tree, the search will take O(log_k n).
I would say that the complexity would be O(log_3 N), given that you have some condition to decide which of the three branches to search next.
With every iteration you reduce the remaining search space by a factor of 3 in the worst case.
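As an illustration of the T(n) = T(n/3) + O(1) recurrence, here is a hedged sketch that mimics a 3-ary search tree by splitting a sorted array into three parts on each step (the `three_way_search` helper and its step counter are assumptions made for the demonstration):

```python
def three_way_search(arr, target):
    """Search a sorted list by splitting it into three parts each
    step, mimicking a 3-ary search tree; returns (index, steps)."""
    lo, hi, steps = 0, len(arr) - 1, 0
    while lo <= hi:
        steps += 1
        third = (hi - lo) // 3
        m1, m2 = lo + third, hi - third      # the two split points
        if arr[m1] == target:
            return m1, steps
        if arr[m2] == target:
            return m2, steps
        if target < arr[m1]:                 # leftmost third
            hi = m1 - 1
        elif target > arr[m2]:               # rightmost third
            lo = m2 + 1
        else:                                # middle third
            lo, hi = m1 + 1, m2 - 1
    return -1, steps
```

On 27 elements the target is found in about log_3 27 = 3 to 4 steps, matching the recurrence: each comparison round discards two thirds of the remaining range.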

Can we reduce the time complexity to construct a Binary tree?

In my interview yesterday, I was asked the time complexity of constructing a binary tree from given inorder and preorder/postorder traversals.
I came up with the skewed-tree case, which needs O(N^2), and noted that if we can somehow guarantee a balanced binary tree, then we can do it in O(N log N).
But, in order to give a unique reply, I came up with the idea that it can be done in O(N) time. The reasoning I gave was:
Put all the nodes of the inorder traversal into a hash table, one by one, in O(N).
Searching the hash table for a particular node can be done in amortized O(1).
Total time complexity can therefore theoretically be reduced to O(N). (Actually, I haven't implemented it yet.)
So, was I correct in my reply? The results haven't been announced yet.
It can be done in O(N) time and O(N) space (for a skewed binary tree), but instead of storing the elements of inorder traversal, store the indices of the elements in the hash table. Then the following algorithm should work:
Function buildTree (in_index1, in_index2, pre_index1, pre_index2)
    if in_index1 > in_index2: return null
    root_index = hashEntry[pre_list[pre_index1]]
    root = createNode (pre_list[pre_index1])
    root->leftChild = buildTree (in_index1, root_index-1, pre_index1 + 1, pre_index1 + (root_index-in_index1))
    root->rightChild = buildTree (root_index+1, in_index2, pre_index1 + (root_index-in_index1) + 1, pre_index2)
    return root
Note: Above is the basic idea. For the recursive calls, you will need to get the correct indices more carefully.
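A hedged Python sketch of the same idea, with the inorder indices precomputed in a dict (names like `pre_list`/`in_list` are assumptions, and values are assumed unique):

```python
class Node:
    def __init__(self, val):
        self.val, self.left, self.right = val, None, None

def build_tree(pre_list, in_list):
    """Rebuild a binary tree from preorder + inorder in O(n),
    using a hash map from value to its inorder index."""
    idx = {v: i for i, v in enumerate(in_list)}   # assumes unique values

    def rec(in_lo, in_hi, pre_lo):
        if in_lo > in_hi:
            return None
        root = Node(pre_list[pre_lo])
        r = idx[root.val]                 # O(1) lookup, no linear scan
        left_size = r - in_lo             # nodes in the left subtree
        root.left = rec(in_lo, r - 1, pre_lo + 1)
        root.right = rec(r + 1, in_hi, pre_lo + 1 + left_size)
        return root

    return rec(0, len(in_list) - 1, 0)
```

Each node is created exactly once and every hash lookup is amortized O(1), so the whole construction is O(n) time and O(n) extra space, even for a skewed tree.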

What is the time and space complexity of a breadth first and depth first tree traversal?

Can someone explain with an example how we can calculate the time and space complexity of both these traversal methods?
Also, how does a recursive solution to depth-first traversal affect the time and space complexity?
BFS:
Time complexity is O(|V|), where |V| is the number of nodes. You need to traverse all nodes.
Space complexity is O(|V|) as well, since in the worst case you need to hold all vertices in the queue.
DFS:
Time complexity is again O(|V|), you need to traverse all nodes.
Space complexity - depends on the implementation, a recursive implementation can have a O(h) space complexity [worst case], where h is the maximal depth of your tree.
Using an iterative solution with a stack is actually the same as BFS, just using a stack instead of a queue - so you get both O(|V|) time and space complexity.
(*) Note that the space and time complexity are a bit different for a tree than for a general graph, because you do not need to maintain a visited set for a tree, and |E| = O(|V|), so the |E| factor is actually redundant.
DFS and BFS time complexity: O(n)
Because this is tree traversal, we must touch every node, making this O(n) where n is the number of nodes in the tree.
BFS space complexity: O(n)
BFS will have to store at least an entire level of the tree in the queue (sample queue implementation). With a perfect fully balanced binary tree, this would be (n/2 + 1) nodes (the very last level). Best Case (in this context), the tree is severely unbalanced and contains only 1 element at each level and the space complexity is O(1). Worst Case would be storing (n - 1) nodes with a fairly useless N-ary tree where all but the root node are located at the second level.
DFS space complexity: O(d)
Regardless of the implementation (recursive or iterative), the stack (implicit or explicit) will contain d nodes, where d is the maximum depth of the tree. With a balanced tree, this would be (log n) nodes. Worst Case for DFS will be the best case for BFS, and the Best Case for DFS will be the worst case for BFS.
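A small sketch that makes these space bounds visible, measuring the peak queue size of BFS against the peak stack size of iterative DFS on a perfect binary tree (the tuple-based tree representation and the helper names are assumptions):

```python
from collections import deque

def build_perfect(depth):
    """Perfect binary tree of the given depth; a node is a tuple
    of its children and a leaf is the empty tuple ()."""
    if depth == 0:
        return ()
    child = build_perfect(depth - 1)
    return (child, child)

def bfs_peak_queue(root):
    """Largest number of nodes ever held in the BFS queue."""
    q, peak = deque([root]), 1
    while q:
        node = q.popleft()
        q.extend(node)                # enqueue both children
        peak = max(peak, len(q))
    return peak

def dfs_peak_stack(root):
    """Largest number of nodes ever held on the DFS stack."""
    stack, peak = [root], 1
    while stack:
        node = stack.pop()
        stack.extend(node)            # push both children
        peak = max(peak, len(stack))
    return peak
```

For a perfect tree of depth d, BFS peaks at the width of the last level, 2^d nodes, while iterative DFS peaks at about d + 1 nodes (one pending sibling per level), matching the O(n) vs O(d) discussion above.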
There are two major factors of complexity
Time Complexity
Space complexity
Time Complexity
It is the amount of time needed to generate the nodes.
In DFS the amount of time needed is proportional to the depth and the branching factor. For DFS the total amount of time needed is given by
1 + b + b^2 + b^3 + ... + b^d ≈ b^d
Thus the time complexity is O(b^d).
Space complexity
It is the amount of space or memory required for getting to a solution.
DFS stores only the current path it is pursuing. Hence the space complexity is a linear function of the depth.
So the space complexity is O(d).

Median of BST in O(logn) time complexity

I came across a solution given at http://discuss.joelonsoftware.com/default.asp?interview.11.780597.8 using Morris in-order traversal, with which we can find the median in O(n) time.
But is it possible to achieve the same using O(logn) time? The same has been asked here - http://www.careercup.com/question?id=192816
If you also maintain the count of the number of left and right descendants of a node, you can do it in O(log n) time by doing a search for the median position. In fact, you can find the kth largest element in O(log n) time.
Of course, this assumes that the tree is balanced. Maintaining the count does not change the insert/delete complexity.
If the tree is not balanced, then you have Omega(n) worst case complexity.
See: Order Statistic Tree.
By the way, big-O and small-o are very different (your title says small-o).
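A minimal sketch of the order-statistic idea, assuming each node stores its subtree size (the `OSNode` class and helper names are hypothetical; each query walks a single root-to-node path, so on a balanced tree it is O(log n)):

```python
class OSNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right
        # subtree size; a real tree maintains this on insert/delete
        self.size = 1 + (left.size if left else 0) \
                      + (right.size if right else 0)

def kth_smallest(node, k):
    """k is 1-based; the left-subtree size tells us which way to go."""
    left_size = node.left.size if node.left else 0
    if k == left_size + 1:
        return node.key
    if k <= left_size:
        return kth_smallest(node.left, k)
    return kth_smallest(node.right, k - left_size - 1)

def median(root):
    return kth_smallest(root, (root.size + 1) // 2)
```

Maintaining `size` costs O(1) extra work per node touched during insert/delete, so it does not change those complexities, exactly as the answer states.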
Unless you guarantee some sort of balanced tree, it's not possible.
Consider a tree that's completely degenerate -- e.g., every left pointer is NULL (nil, whatever), so each node only has a right child (i.e., for all practical purposes the "tree" is really a singly linked list).
In this case, just accessing the median node (at all) takes linear time -- even if you started out knowing that node N was the median, it would still take N steps to get to that node.
We can find the median by using a rabbit pointer and a turtle pointer. The rabbit moves twice as fast as the turtle through the in-order traversal of the BST. This way, when the rabbit reaches the end of the traversal, the turtle is at the median of the BST.
Please see the full explanation.
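The rabbit-and-turtle idea can be sketched with two independent in-order generators; this is an assumption-laden illustration (the `TNode` class is hypothetical, and for an even number of nodes it returns the upper of the two middle values). Note that it is still O(n) time and O(h) space, not O(log n):

```python
class TNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def inorder(node):
    """Lazy in-order traversal; holds O(h) frames at a time."""
    if node is not None:
        yield from inorder(node.left)
        yield node.key
        yield from inorder(node.right)

def median_two_pointer(root):
    """Turtle advances one key while the rabbit advances two; when
    the rabbit runs off the end, the turtle is at the median."""
    turtle, rabbit = inorder(root), inorder(root)
    value = next(turtle)
    try:
        while True:
            next(rabbit)
            next(rabbit)
            value = next(turtle)
    except StopIteration:
        return value      # upper middle when the count is even
```

Because the rabbit must consume the entire traversal, this touches every node once, which is why it cannot beat the O(n) bound without the subtree-size augmentation described above.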
