Getting the time complexity of the recursive implementation below - algorithm

/* Function to get diameter of a binary tree */
int diameter(struct node *tree)
{
    /* base case where tree is empty */
    if (tree == NULL)
        return 0;

    /* get the height of left and right sub-trees */
    int lheight = height(tree->left);
    int rheight = height(tree->right);

    /* get the diameter of left and right sub-trees */
    int ldiameter = diameter(tree->left);
    int rdiameter = diameter(tree->right);

    return max(lheight + rheight + 1, max(ldiameter, rdiameter));
}

int height(struct node *node)
{
    /* base case: tree is empty */
    if (node == NULL)
        return 0;

    /* If tree is not empty then height = 1 + max of left
       height and right height */
    return 1 + max(height(node->left), height(node->right));
}
How is the time complexity of finding the diameter of a tree with this implementation O(n^2), where n is the number of nodes in the tree?

It is O(n^2) because the height calculation is itself recursive.
You can write a recurrence relation and solve it, or use the Master theorem:
You can see that f(n) is linear, hence c = 1. With a = 4 (the function recurses four times) and b = 2 (each call is on half of the tree), log_b a = 2 > c, so the complexity is Θ(n^(log_b a)) = Θ(n^2).
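In recurrence form (a sketch of the model this answer uses, not a tight bound for every tree shape):
T(n) = 4T(n/2) + c*n
Since n^(log_2 4) = n^2 grows faster than c*n, case 1 of the Master theorem gives T(n) = Θ(n^2).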

Let D() denote diameter() and H() denote height(). For convenience, let us assume the binary tree is a complete binary tree, so that the left subtree and the right subtree have an equal number of elements, and let us assume there are N elements in the tree. The time complexity of the diameter function can then be represented by the following recurrence relation.
D(N) = 2D(N/2) + 2H(N/2) + c1 ----- 1
Because of the following recursive calls in the diameter(),
int lheight = height(tree->left);
int rheight = height(tree->right);
int ldiameter = diameter(tree->left);
int rdiameter = diameter(tree->right);
Now let us analyze the height(),
The recurrence relation denoting the time complexity of height() is,
H(N) = 2H(N/2) + c2 ------ 2
Because of the following recursive calls in the height(),
return 1 + max(height(node->left), height(node->right));
Now H(N) = O(N) by applying the Master theorem to (2).
Substituting this in (1), we get
D(N) = 2D(N/2) + c3 N + c1 ------ 3
Solving (3) using the Master theorem, we get D(N) = O(N log N).
So the complexity of the recursive function diameter() is O(N log N).
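Note that this O(N log N) bound depends on the complete-tree assumption. For a degenerate tree (a single chain), the same style of analysis gives D(N) = D(N-1) + H(N-1) + c = D(N-1) + Θ(N), which sums to Θ(N^2); that skewed case is where the O(n^2) worst case mentioned in the question comes from.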

Related

What is the runtime of my recursive memoized solution?

I am given a binary search tree with n nodes with unique values 1 to n and need to compute how many structurally different binary search trees I can make from it. I use DFS with memoization to solve the problem. Basically, if we have n nodes, the root node can be any value from 1 to n, and then I recursively compute how many shapes each subtree can have. I also memoize the range of node values a subtree can have and how many different trees can be made with that range, so I don't recompute. I think the time and space are both O(n^2), as there can be n^2 different ranges for my tree node values. Can anyone comment on that?
class Solution {
    public int numTrees(int n) {
        // structurally unique BSTs with values from 1 to n
        // same structure but different number? no, one way to arrange node
        // from 1 to n start
        // left has num candid - 1 to 1
        // right has num candid + 1 to n
        Map<Integer, Integer> memo = new HashMap<>();
        return numWays(1, n, memo);
    }

    private int numWays(int low, int high, Map<Integer, Integer> memo) {
        if (memo.containsKey(low * 100 + high)) {
            return memo.get(low * 100 + high);
        }
        if (low >= high) return 1;
        int ans = 0;
        for (int i = low; i <= high; i++) {
            ans = ans + numWays(low, i - 1, memo) * numWays(i + 1, high, memo);
        }
        memo.put(low * 100 + high, ans);
        return ans;
    }
}
The time complexity is currently O(n^3). It is true that there are only O(n^2) ranges, and at most O(n^2) pairs of (low, high) appearing as inputs to the numWays function. However, the numWays function takes O(high-low+1) steps after memoization, which is another O(n) factor.
To speed this up, you might notice that the number of BSTs for [1,2,3,4] is the same as the number of BSTs for [2,3,4,5] or for [3,4,5,6]; only the length of the range matters (giving you an O(n^2) algorithm from a tiny change). Another possible speedup comes from noticing that for every rooted binary tree with n nodes, there is exactly one way to label the nodes with [1,2,...,n] to get a BST, so you are really looking for a recurrence to count rooted binary trees.
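A minimal sketch of that length-only idea (a bottom-up table instead of the original HashMap; dp and len are illustrative names):
class Solution {
    public int numTrees(int n) {
        // dp[len] = number of structurally unique BSTs over any len consecutive values
        int[] dp = new int[n + 1];
        dp[0] = 1; // the empty range gives exactly one (empty) tree
        for (int len = 1; len <= n; len++) {
            for (int root = 1; root <= len; root++) {
                // choosing the root leaves (root - 1) values on the left
                // and (len - root) values on the right
                dp[len] += dp[root - 1] * dp[len - root];
            }
        }
        return dp[n]; // O(n^2) time, O(n) space
    }
}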
We could also use the closed-form Catalan number recurrence, which computes the answer in O(n):
const f = n => n < 2 ? 1 : (4*n - 2) / (n + 1) * f(n - 1);
for (let i = 0; i < 20; i++)
    console.log(f(i));

Segment Tree Build?

I am getting conflicting evidence on the build time complexity for a recursive segment tree.
Some sources (Wikipedia) claim it's O(N*log(N)), while others claim it's O(N). My intuition says it's O(N), because we have about 2N nodes and 2N-1 edges.
Which one is it?
Note: we're building the segment tree with a function like this:
private int build(int[] a, int i, int l, int r) {
    if (l == r) {
        nodes[i] = a[l];
    } else {
        nodes[i] = Math.min(build(a, i*2, l, (l+r)/2),
                            build(a, i*2+1, (l+r)/2+1, r));
    }
    return nodes[i];
}
We're not doing a point update for each value in the array.
Notice that it's just a DFS, and DFS time complexity is O(|V| + |E|).
So the complexity is O(2n + 2n - 1) = O(n).
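Equivalently, you can write a recurrence for build() over a range of length n (a sketch, assuming the range is split in half at each step):
T(n) = 2T(n/2) + c
By the Master theorem (a = 2, b = 2, f(n) = O(1), and n^(log_2 2) = n dominates), T(n) = O(n).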

How to find the position of the rightmost node in the last level of a complete binary tree?

I am doing a problem on binary trees and came across this one: find the rightmost node in the last level of a complete binary tree. Doing it in O(n) is simple by traversing all the elements, but is there a way to do it in less than O(n) time? I have browsed the internet a lot and couldn't find anything about this.
Thanks in advance.
Yes, you can do it in O(log(n)^2) with a variation of binary search.
This can be done by first going to the leftmost element (1), then to the 2nd leftmost element, then to the 4th, the 8th, ... until you find there is no such element.
Let's say the last element you found was the i-th, and the first one you didn't find was the 2i-th.
Now you can simply do a binary search over that range.
This is O(log(n/2)) = O(log n) total iterations, and since each iteration walks down the entire tree, it's O(log(n)^2) time in total.
(1) Here and in what follows, the "x-th leftmost element" refers only to the nodes in the deepest level of the tree.
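A sketch of that idea in Java, assuming a TreeNode class with an int val and left/right child fields. Instead of the doubling step it binary searches directly over the 2^(d-1) possible positions of the last level, which has the same O(log(n)^2) cost; names are illustrative:
class RightmostInLastLevel {
    // Depth measured along the leftmost path (1 for a single node).
    private static int depth(TreeNode root) {
        int d = 0;
        while (root != null) {
            d++;
            root = root.left;
        }
        return d;
    }

    // Node at 0-based position pos of the last level, or null if that slot is empty.
    // One walk down the tree, choosing left/right by halving the position range: O(log n).
    private static TreeNode nodeAt(TreeNode root, int depth, int pos) {
        int lo = 0, hi = (1 << (depth - 1)) - 1;
        TreeNode cur = root;
        for (int level = 1; level < depth && cur != null; level++) {
            int mid = lo + (hi - lo) / 2;
            if (pos <= mid) { cur = cur.left;  hi = mid; }
            else            { cur = cur.right; lo = mid + 1; }
        }
        return cur;
    }

    // Assumes a non-empty complete binary tree.
    public static int rightmost(TreeNode root) {
        int d = depth(root);
        if (d == 1) return root.val;          // the root is the only level
        int lo = 0, hi = (1 << (d - 1)) - 1;  // position 0 always exists in a complete tree
        while (lo < hi) {                     // binary search for the last non-empty position
            int mid = lo + (hi - lo + 1) / 2;
            if (nodeAt(root, d, mid) != null) lo = mid;
            else hi = mid - 1;
        }
        return nodeAt(root, d, lo).val;
    }
}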
I assume that you know the number of nodes; let n be that number.
In a complete binary tree, level i has twice as many nodes as level i - 1.
So you can iteratively divide n by 2. If there is a remainder, then n is a right child; otherwise it is a left child. You store into a sequence, preferably a stack, whether there was a remainder or not.
Something such as:
Stack<Character> s = new Stack<>();
while (n > 1)
{
    if (n % 2 == 0)
        s.push('L');
    else
        s.push('R');
    n = n / 2; // n is an int, so the division is floor division
}
When the while loop finishes, the stack contains the path to the rightmost node (popping yields the directions in root-to-leaf order).
The loop executes about log_2(n) times.
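A minimal usage sketch in Java (assuming a TreeNode class with left and right fields, and root pointing to the root of the tree), following the recorded path:
TreeNode cur = root;
while (!s.isEmpty()) {
    // popping yields the directions in root-to-leaf order
    cur = (s.pop() == 'L') ? cur.left : cur.right;
}
// cur is now the rightmost node in the last level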
This is a recursive solution with O(lg n * lg n) time complexity and O(lg n) space complexity (for the recursion stack).
The space complexity can be reduced to O(1) using an iterative version of the code below.
// helper function
int getLeftHeight(TreeNode *node) {
    int c = 0;
    while (node) {
        c++;
        node = node->left;
    }
    return c;
}

int getRightMostElement(TreeNode *node) {
    int h = getLeftHeight(node);
    // base case: reached when the rightmost element (our answer) is found
    if (h == 1)
        return node->val;
    // ans lies in the right subtree
    else if ((h - 1) == getLeftHeight(node->right))
        return getRightMostElement(node->right);
    // ans lies in the left subtree
    else
        return getRightMostElement(node->left);
}
Time complexity derivation -
At each recursion step we consider either the left subtree or the right subtree, i.e. about n/2 elements, so there are at most lg n function calls, and calculating a left height takes at most lg n time:
T(n) = T(n/2) + c*lg n
     = T(n/4) + c*lg n + c*(lg n - 1)
     = ...
     = T(1) + c*[lg n + (lg n - 1) + (lg n - 2) + ... + 1]
     = O(lg n * lg n)
Since it's a complete binary tree, going down the right children until you reach the leaves takes O(logN), not O(N). In a regular binary tree it would take O(N), because in the worst case all the nodes are lined up to the right; but since this is a complete binary tree, that cannot happen.

Diameter of Binary Tree - Algorithm Complexity

In another question about finding an algorithm to compute the diameter of a binary tree, the following code is provided as a possible answer to the problem.
public static int getDiameter(BinaryTreeNode root) {
    if (root == null)
        return 0;

    int rootDiameter = getHeight(root.getLeft()) + getHeight(root.getRight()) + 1;
    int leftDiameter = getDiameter(root.getLeft());
    int rightDiameter = getDiameter(root.getRight());
    return Math.max(rootDiameter, Math.max(leftDiameter, rightDiameter));
}

public static int getHeight(BinaryTreeNode root) {
    if (root == null)
        return 0;

    return Math.max(getHeight(root.getLeft()), getHeight(root.getRight())) + 1;
}
In the comments section it is said that the time complexity of the above code is O(n^2). In a given call of the getDiameter function, the getHeight and getDiameter functions are each called for the left and right subtrees.
Let's consider the average case of a binary tree. The height can be computed in Θ(n) time (true for the worst case too). So how do we compute the time complexity of the getDiameter function?
My two theories:
T(n) = 4T(n/2) + Θ(1) = Θ(n^2), if the height computation is treated as the same subproblem.
T(n) = 2T(n/2) + n + Θ(1) = Θ(n log n), with n = 2*(n/2) accounting for the height computation?
Thank you for your time and effort!
One point of confusion is that you think the binary tree is balanced. Actually, it can be a line. In this case, we need n operations from the root to the leaf to find the height, n - 1 from the root's child to the leaf and so on. This gives O(n^2) operations to find the height alone for all nodes.
The algorithm could be optimised if the height of each node was calculated independently, before finding the diameter. Then we would spend O(n) time for finding all heights. Then the complexity of finding the diameter would be of the following type:
T(n) = T(a) + T(n - 1 - a) + 1
where a is the size of the left subtree. This relation gives linear time for finding the diameter as well, so the total time would be linear.
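A minimal sketch of that optimisation in Java, assuming the same BinaryTreeNode class (with getLeft()/getRight() accessors) as the snippet above; heights are cached in a map so each one is computed exactly once:
import java.util.HashMap;
import java.util.Map;

public class LinearDiameter {
    private static final Map<BinaryTreeNode, Integer> heights = new HashMap<>();

    // First O(n) pass: compute and cache the height of every node.
    private static int computeHeight(BinaryTreeNode root) {
        if (root == null)
            return 0;
        int h = Math.max(computeHeight(root.getLeft()), computeHeight(root.getRight())) + 1;
        heights.put(root, h);
        return h;
    }

    private static int cachedHeight(BinaryTreeNode root) {
        return root == null ? 0 : heights.get(root);
    }

    // Second pass: T(n) = T(a) + T(n - 1 - a) + O(1), i.e. linear overall.
    private static int diameter(BinaryTreeNode root) {
        if (root == null)
            return 0;
        int rootDiameter = cachedHeight(root.getLeft()) + cachedHeight(root.getRight()) + 1;
        return Math.max(rootDiameter, Math.max(diameter(root.getLeft()), diameter(root.getRight())));
    }

    public static int getDiameter(BinaryTreeNode root) {
        heights.clear();
        computeHeight(root);
        return diameter(root);
    }
}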

BinaryTree functions - complexity

I am just writing down different functions, i.e. operations that can be performed on a binary tree.
I am wondering what the running time of these functions is; I'm trying to get a feel for them:
getMaxDepth(Tree) // What can be the time complexity here?
    if Tree.root = NIL return 0 // base case
    leftDepth := 1 + getMaxDepth(Tree.root.left)
    rightDepth := 1 + getMaxDepth(Tree.root.right)
    if leftDepth > rightDepth then return leftDepth;
    else return rightDepth;

internalNodeCount(Node n) // And here?
    if isLeaf(n) then return 0
    return 1 + internalNodeCount(n.left) + internalNodeCount(n.right)

isLeaf(Node n)
    return n = NIL OR (n.left = NIL AND n.right = NIL);
For getMaxDepth I assume the time complexity is O(n), because I need to traverse the whole tree recursively, visiting every node... what would be a good explanation?
For internalNodeCount I guess it is the same complexity, O(n), for the same reason.
From what I understood, it looks like you are looking for a proof.
For getMaxDepth, here is the explanation:
T(1) = c1
T(n) = T(k) + T(n-k-1) + c2
where
T(n) = Time to process tree of n nodes
n = number of nodes
k = nodes in left subtree
n-k-1 = nodes in right subtree
c1, c2 = constants (not dependent upon n)
(Time to calculate the depth of the tree from given left and right subtree depth)
The same could be applied to internalNodeCount too, except the constants would be different.
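Unrolling the recurrence, each node of the tree contributes exactly one constant c2 term (plus a constant c1 term at the base cases), so T(n) = Θ(n). The same unrolling justifies the O(n) guess for internalNodeCount as well.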

Resources