Solving while loop time complexity algorithm

I am having quite a hard time figuring out the time complexity of my algorithm. I know that the "for" portion of the algorithm will run in O(n), but I am unsure of my while loop. The problem involves creating a binary tree from a given vector. Each node is evaluated for its value and its index in the vector, so, essentially, every following node must be to the right of the previous one and, depending on whether its value is greater or smaller, it will be a child node or a parent node. The children of parent nodes must be smaller in value.
I use the while loop for the case where a child node is smaller than the next node to be placed, and I follow up through the parents until I find the spot where the new node belongs. I believe this will run, in the worst case, k-1 times, k being the depth of the tree, but how would I represent this as a time complexity? O(kn)? Is that linear?
for (int i = 0; i < vecteur_source.size(); i++) {
    if (i == 0) {
        // do bla....
    } else if ((vecteur_source.at(i) > vecteur_source.at(i-1)) && (m_map_index_noeud.at(i-1)->parent)) {
        int v = m_map_index_noeud.at(i-1)->parent->index;
        while (vecteur_source.at(i) >= vecteur_source.at(v)) {
            v = m_map_index_noeud.at(v)->parent->index;
        }
    }
}

Allow me to simplify this into pseudocode:
# I'm assuming this takes constant or linear time
do the thing for i = 0
for i ← 1 to n-1:
    if source[i] > source[i-1] and nodes[i-1] is not the root node:
        v ← nodes[i-1].parent.index
        while source[i] >= source[v]:
            v ← nodes[v].parent.index
If I've understood your code correctly, then your analysis is right: the outer loop iterates O(n) times and the inner loop iterates up to O(h) times, where h is the height of the tree, so the overall time complexity is O(nh).
This is not linear time unless h is guaranteed to be at most a constant. More usually, h is O(log n) for balanced trees, or O(n) in the worst case for unbalanced trees.

Related

Binary Tree Line by Line Level Order Traversal Time Complexity

Here is the code for level-order traversal line by line. How come the time complexity is O(n) and not O(n2)?
def levelOrder(root):
    queue = [root]
    while len(queue):
        count = len(queue)
        while count:
            current = queue.pop(0)
            print(current.data, end='\t')
            if current.left:
                queue.append(current.left)
            if current.right:
                queue.append(current.right)
            count -= 1
        print()
I assume that by O(n2) you actually mean O(n^2).
Why should it be O(n^2)? Just because you have two nested loops doesn't mean that the complexity is O(n^2). It all depends on what you are iterating over and what you are doing inside the loops.
If you look at the execution of the code, you'll see that every node in the tree is inserted and popped exactly once, and every iteration of the loops is productive (there are no iterations that don't do anything). Therefore, the number of iterations is bounded by N, the number of nodes in the tree, and the overall complexity is O(N).
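As a concrete check of this counting argument, here is a hedged Python sketch of the same traversal (the Node class and function name are my own). It counts pop operations instead of printing, and uses collections.deque so each pop is O(1) (list.pop(0) is itself linear, which is a separate inefficiency from the iteration count):

```python
from collections import deque

class Node:
    def __init__(self, data, left=None, right=None):
        self.data, self.left, self.right = data, left, right

def level_order_count(root):
    # Same structure as the code above, but counts pops instead of printing.
    queue = deque([root])
    pops = 0
    levels = []
    while queue:
        count = len(queue)
        level = []
        while count:
            current = queue.popleft()
            pops += 1
            level.append(current.data)
            if current.left:
                queue.append(current.left)
            if current.right:
                queue.append(current.right)
            count -= 1
        levels.append(level)
    return pops, levels

# A 7-node complete tree: pops equals the node count, not nodes * levels.
root = Node(1, Node(2, Node(4), Node(5)), Node(3, Node(6), Node(7)))
pops, levels = level_order_count(root)
# pops == 7, levels == [[1], [2, 3], [4, 5, 6, 7]]
```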
No, this has O(N*L) complexity, where N is the number of nodes and L is the number of levels the tree has. Here is why:
Assume the tree has N nodes.
queue = [root]         | O(1)
while len(queue):      | one iteration per level of the tree: O(L)
    count = len(queue) | O(1)
    while count:       | roughly depends on the number of nodes left after
                       | processing the left and right subtrees of the
                       | current node: O(left subtree nodes) + O(right
                       | subtree nodes) => O(L+R) => O(N)
        count -= 1     | O(1)
In terms of an upper bound, this wraps into O(N * L * 1) => O(N*L), where N is the number of nodes and L is the number of levels the tree has.

Running time of algorithm with arbitrary sized recursive calls

I have written the following algorithm that, given a node x in a binary search tree T, sets the field s for all nodes in the subtree rooted at x, such that for each node, s is the sum of all odd keys in the subtree rooted at that node.
OddNodeSetter(T, x):
    if (T.x == NIL):
        return 0
    if (T.x.key mod 2 == 1):
        T.x.s = T.x.key + OddNodeSetter(T, x.left) + OddNodeSetter(T, x.right)
    else:
        T.x.s = OddNodeSetter(T, x.left) + OddNodeSetter(T, x.right)
    return T.x.s
I've thought of using the master theorem for this, with the recurrence
T(n) = T(k) + T(n-k-1) + 1 for 1 <= k < n
however, since the sizes of the two recursive calls vary depending on k and n-k-1 (i.e. the number of nodes in the left and right subtrees of x), I can't quite figure out how to solve this recurrence. For example, in the case where the left and right subtrees of x have equally many nodes, we can express the recurrence in the form
T(n) = 2T(n/2) + 1
which can be solved easily, but that doesn't prove the running time in all cases.
Is it possible to prove this algorithm runs in O(n) with the master theorem, and if not what other way is there to do this?
The algorithm visits every node in the tree exactly once, hence O(N).
Update:
And obviously, a visit takes constant time (not counting the recursive calls).
There is no need to use the Master theorem here.
Think of the problem this way: what is the maximum number of operations you have to do for each node in the tree? It is bounded by a constant. And what is the number of nodes in the tree? It is n.
Multiplying a constant by n still gives O(n).
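To make the one-visit-per-node argument concrete, here is a hedged Python sketch of the procedure (class and field names are my own, using child pointers instead of the T/x notation):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right
        self.s = 0

def odd_node_setter(node):
    # Returns the sum of odd keys in the subtree and stores it in node.s.
    # Each node triggers exactly one call (plus one per None child), and
    # each call does constant work, hence O(n) overall.
    if node is None:
        return 0
    subtotal = odd_node_setter(node.left) + odd_node_setter(node.right)
    if node.key % 2 == 1:
        subtotal += node.key
    node.s = subtotal
    return subtotal

# BST with keys 2, 1, 3: the odd keys are 1 and 3, so the root gets s == 4.
root = Node(2, Node(1), Node(3))
total = odd_node_setter(root)
```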

Big O Time Complexity for Recursive Pattern

I have question on runtime for recursive pattern.
Example 1
int f(int n) {
    if (n <= 1) {
        return 1;
    }
    return f(n - 1) + f(n - 1);
}
I can understand that the runtime of the above code is O(2^N), because if I pass 5, it calls 4 twice, then each 4 calls 3 twice, and so on until it reaches 1, i.e., something like O(branches^depth).
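To make the O(2^N) count concrete, here is a small Python port of Example 1 with a hypothetical call counter added; the call tree is a full binary tree of depth n, so f(n) triggers 2^n - 1 calls in total:

```python
# Hypothetical instrumentation: count how many times f is called.
calls = 0

def f(n):
    global calls
    calls += 1
    if n <= 1:
        return 1
    return f(n - 1) + f(n - 1)

result = f(5)
# calls == 31 == 2**5 - 1, matching the O(branches^depth) = O(2^N) picture
```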
Example 2
Balanced Binary Tree
int sum(Node node) {
    if (node == null) {
        return 0;
    }
    return sum(node.left) + node.value + sum(node.right);
}
I read that the runtime for the above code is O(2^log N) since it is balanced, but I still see it as O(2^N). Can anyone explain?
When the number of elements gets halved each time, the runtime is log N. But how does a binary tree work here?
Is it 2^log N just because it is balanced?
What if it is not balanced?
Edit:
We can simplify O(2^log N) = O(N), but I am seeing it as O(2^N).
Thanks!
A binary tree will have complexity O(n) here, like any other tree, because you are ultimately traversing all of the elements of the tree. By halving we are not doing anything special other than calculating the sum for each of the two subtrees separately.
The term comes up this way because, if the tree is balanced, then 2^(log_2(n)) is the number of elements in the tree (leaf + non-leaf), since the tree has log_2(n) levels.
Again, if it is not balanced it doesn't matter: we are doing an operation for which every element needs to be considered, making the runtime O(n).
Where could it have mattered? If we were searching for an element, then it would have mattered whether the tree is balanced or not.
I'll take a stab at this.
In a balanced binary tree, you should have half the child nodes to the left and half to the right of each parent node. The first layer of the tree is the root, with 1 element, then 2 elements in the next layer, then 4 elements in the next, then 8, and so on. So for a tree with L layers, you have 2^L - 1 nodes in the tree.
Reversing this, if you have N elements to insert into a tree, you end up with a balanced binary tree of depth L = log_2(N), so you only ever need to call your recursive algorithm for log_2(N) layers. At each layer, you are doubling the number of calls to your algorithm, so in your case you end up with 2^log_2(N) calls and O(2^log_2(N)) run time. Note that 2^log_2(N) = N, so it's the same either way, but we'll get to the advantage of a binary tree in a second.
If the tree is not balanced, you end up with depth greater than log_2(N), so you have more recursive calls. In the extreme case, when all of your children are to the left (or right) of their parent, you have N recursive calls, but each call returns immediately from one of its branches (no child on one side). Thus you would have O(N) run time, which is the same as before. Every node is visited once.
An advantage of a balanced tree is in cases like search. If the left-hand child is always less than the parent, and the right-hand child is always greater than, then you can search for an element n among N nodes in O(log_2(N)) time (not 2^log_2(N)!). If, however, your tree is severely imbalanced, this search becomes a linear traversal of all of the values and your search is O(N). If N is extremely large, or you perform this search a ton, this can be the difference between a tractable and an intractable algorithm.
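To see that the shape does not change the total work of the sum, here is a hedged Python sketch (names are my own) comparing a balanced tree and a fully right-skewed tree built from the same seven values; both traversals make exactly one call per node:

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def tree_sum(node):
    # One call per node (plus the null children), regardless of tree shape.
    if node is None:
        return 0
    return tree_sum(node.left) + node.value + tree_sum(node.right)

# Balanced: 7 nodes over 3 levels.
balanced = Node(4, Node(2, Node(1), Node(3)), Node(6, Node(5), Node(7)))

# Degenerate: the same 7 values chained down to the right.
skewed = Node(1)
tail = skewed
for v in range(2, 8):
    tail.right = Node(v)
    tail = tail.right

# Both sums equal 28; both traversals touch all 7 nodes.
```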

how to find the position of right most node in last level of complete binary tree?

I am doing a problem on binary trees: find the rightmost node in the last level of a complete binary tree. Doing it in O(n) is simple by traversing all the elements, but is there a way to do it in any complexity less than O(n)? I have browsed the internet a lot and couldn't find anything about this.
Thanks in advance.
Yes, you can do it in O(log(n)^2) by doing a variation of binary search.
This can be done by first going to the leftmost element [1], then to the 2nd leftmost element, then to the 4th, the 8th, ... until you find that there is no such element.
Let's say the last element you found was the i-th, and the first you didn't find was the 2i-th.
Now you can simply do a binary search over that range.
This is O(log(n/2)) = O(log n) iterations in total, and since each iteration walks down the entire tree, the total time is O(log(n)^2).
[1] Here and in what follows, the "x-th leftmost element" refers only to the nodes in the deepest level of the tree.
I assume that you know the number of nodes; let n be that number.
In a complete binary tree, a level i has twice the number of nodes of level i-1.
So you can iteratively divide n by 2. If there is a remainder, the current node is a right child; otherwise it is a left child. You store into a sequence, preferably a stack, whether or not there was a remainder.
Some such as:
Stack<char> s;
while (n > 1)
{
    if (n % 2 == 0)
        s.push('L');
    else
        s.push('R');
    n = n / 2; // n is an int, so the division floors
}
When the while finishes, the stack contains the path to the rightmost node.
The number of times that the while is executed is log_2(n).
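The loop above can be sketched in Python (the function name is mine): node n's parent in level-order numbering is floor(n/2), and even-numbered nodes are left children, so collecting the parities of n on the way down to 1 and reversing yields the root-to-node path:

```python
def path_to_rightmost(n):
    # n = number of nodes in the complete tree; node n (level-order,
    # 1-indexed) is the rightmost node in the last level.
    path = []
    while n > 1:
        path.append('L' if n % 2 == 0 else 'R')
        n //= 2
    path.reverse()  # moves were collected bottom-up, like popping the stack
    return ''.join(path)

# In a 6-node complete tree the answer is node 6: go right from the root
# to node 3, then left to node 6.
# path_to_rightmost(6) == 'RL'
```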
This is a recursive solution with O(lg n * lg n) time complexity and O(lg n) space complexity (counting the call-stack storage).
The space complexity can be reduced to O(1) with an iterative version of the code below.
// helper function
int getLeftHeight(TreeNode *node) {
    int c = 0;
    while (node) {
        c++;
        node = node->left;
    }
    return c;
}

int getRightMostElement(TreeNode *node) {
    int h = getLeftHeight(node);
    // base case: we have reached the rightmost element, which is our answer
    if (h == 1)
        return node->val;
    // answer lies in the right subtree
    else if ((h - 1) == getLeftHeight(node->right))
        return getRightMostElement(node->right);
    // answer lies in the left subtree
    else
        return getRightMostElement(node->left);
}
Time complexity derivation -
At each recursion step we descend into either the left or the right subtree, i.e. n/2 elements, for at most lg n function calls, and calculating a height takes lg n time:
T(n) = T(n/2) + c·lg n
     = T(n/4) + c·lg n + c·(lg n - 1)
     = ...
     = T(1) + c·[lg n + (lg n - 1) + (lg n - 2) + ... + 1]
     = O(lg n · lg n)
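Here is a hedged Python transcription of the recursion above (class and function names are my own), checked on small complete trees:

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def left_height(node):
    # Length of the leftmost path; in a complete tree this equals the height.
    h = 0
    while node:
        h += 1
        node = node.left
    return h

def rightmost_last_level(node):
    h = left_height(node)
    if h == 1:                            # a single level: this node is the answer
        return node.val
    if left_height(node.right) == h - 1:  # last level spills into the right subtree
        return rightmost_last_level(node.right)
    return rightmost_last_level(node.left)  # last level lives only in the left subtree

# Complete trees numbered in level order: with 6 nodes the answer is 6,
# with 5 nodes it is 5.
six = TreeNode(1, TreeNode(2, TreeNode(4), TreeNode(5)), TreeNode(3, TreeNode(6)))
five = TreeNode(1, TreeNode(2, TreeNode(4), TreeNode(5)), TreeNode(3))
```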
Since it's a complete binary tree, following the right children down to the leaves takes O(logN), not O(N). In a regular binary tree it takes O(N), because in the worst case all the nodes are lined up to the right, but since this is a complete binary tree, that can't happen.

How to build a binary tree in O(N ) time?

Following on from a previous question here I'm keen to know how to build a binary tree from an array of N unsorted large integers in order N time?
Unless you have some preconditions on the list that allow you to calculate each item's position in the tree in constant time, it is not possible to 'build' (that is, sequentially insert) items into a tree in O(N) time: each insertion has to make up to log M comparisons, where M is the number of items already in the tree.
OK, just for completeness... The binary tree in question is built from an array and has a leaf for every array element. It keeps them in their original index order, not value order, so it doesn't magically let you sort a list in linear time. It also needs to be balanced.
To build such a tree in linear time, you can use a simple recursive algorithm like this (using 0-based indexes):
//build a tree of elements [start, end) in array
//precondition: end > start
buildTree(int[] array, int start, int end)
{
    if (end - start > 1)
    {
        int mid = (start + end) >> 1;
        left = buildTree(array, start, mid);
        right = buildTree(array, mid, end);
        return new InternalNode(left, right);
    }
    else
    {
        return new LeafNode(array[start]);
    }
}
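The same construction in a hedged Python sketch (class names are mine); the inorder leaf sequence confirms the original index order is preserved:

```python
class Leaf:
    def __init__(self, value):
        self.value = value

class Internal:
    def __init__(self, left, right):
        self.left, self.right = left, right

def build_tree(array, start, end):
    # Builds a balanced tree over array[start:end); each element becomes
    # one leaf and each call does O(1) work, so the whole build is O(n).
    if end - start > 1:
        mid = (start + end) // 2
        return Internal(build_tree(array, start, mid),
                        build_tree(array, mid, end))
    return Leaf(array[start])

def leaves(node):
    # Inorder leaf collection: elements come back in their original order.
    if isinstance(node, Leaf):
        return [node.value]
    return leaves(node.left) + leaves(node.right)

t = build_tree([5, 1, 4, 2, 3], 0, 5)
# leaves(t) == [5, 1, 4, 2, 3]
```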
I agree that this seems impossible in general (assuming we have a general, totally ordered set S of N items). Below is an informal argument, where I essentially reduce building a BST on S to the problem of sorting S.
Informal argument. Let S be a set of N elements, and suppose you could construct a binary search tree T storing the items of S in o(N log N) time.
Now do an inorder walk of the tree and print the values as you visit them: you have just sorted the elements of S. The walk takes only O(|T|) = O(N) extra steps, where |T| is the number of nodes in the tree.
So you would have solved the general comparison-based sorting problem in o(N log N) time, which is a contradiction.
I have an idea of how it is possible.
Sort the array with radix sort, which is O(N) for fixed-width integers. Thereafter, use a recursive procedure to insert into the leaves, like:
node *insert(int *array, int size) {
    if (size <= 0)
        return NULL;
    node *rc = new node;
    int midpoint = size / 2;
    rc->data = array[midpoint];
    rc->left = insert(array, midpoint);
    rc->right = insert(array + midpoint + 1, size - midpoint - 1);
    return rc;
}
Since we never traverse the tree from the top down, but always attach nodes directly where the recursion stands, each attachment is O(1), so the whole build is O(N) after sorting.
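A Python sketch of the same midpoint construction (dict-based nodes for brevity; names are mine). Note that the list slices make this sketch O(N log N) in copying; the pointer-arithmetic C++ version above avoids that and stays O(N) after the sort:

```python
def build_bst(sorted_array):
    # The middle element becomes the root and the two halves become the
    # subtrees, so the result is a balanced BST over the sorted input.
    if not sorted_array:
        return None
    mid = len(sorted_array) // 2
    return {'data': sorted_array[mid],
            'left': build_bst(sorted_array[:mid]),
            'right': build_bst(sorted_array[mid + 1:])}

def inorder(node):
    # Inorder walk of the dict-based tree: returns values in sorted order.
    if node is None:
        return []
    return inorder(node['left']) + [node['data']] + inorder(node['right'])

tree = build_bst([1, 2, 3, 4, 5, 6, 7])
# inorder(tree) == [1, 2, 3, 4, 5, 6, 7], and the root holds 4
```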
