A* complexities - algorithm

I have implemented an A* algorithm that uses a min-priority queue. The implementation is very similar to uniform-cost search, which has time complexity O(m log n) and space complexity O(n^2), since it keeps track of the parents and the nodes traversed. Doesn't the time and space complexity of A* remain the same? In all the explanations I can find, the time and space complexity of A* is given as O(b^n).
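For reference, here is a minimal A* sketch in Python along the lines the question describes: an open list backed by a binary min-heap (`heapq`) plus dictionaries for parents and best-known costs. The graph representation and the heuristic interface are illustrative assumptions, not the asker's actual code.

```python
import heapq

def a_star(graph, h, start, goal):
    """Minimal A* sketch. graph maps node -> [(neighbor, edge_cost), ...];
    h(node) is the heuristic estimate of the remaining cost to the goal."""
    open_heap = [(h(start), 0, start)]   # entries are (f = g + h, g, node)
    g_cost = {start: 0}                  # best g value found so far
    parent = {start: None}               # for path reconstruction
    closed = set()                       # nodes already expanded

    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node in closed:
            continue                     # stale entry left by a cheaper re-push
        if node == goal:
            path = []
            while node is not None:      # walk the parent pointers back to start
                path.append(node)
                node = parent[node]
            return g, path[::-1]
        closed.add(node)
        for nbr, cost in graph[node]:
            new_g = g + cost
            if nbr not in g_cost or new_g < g_cost[nbr]:
                g_cost[nbr] = new_g
                parent[nbr] = node
                heapq.heappush(open_heap, (new_g + h(nbr), new_g, nbr))
    return None
```

With h ≡ 0 this is exactly uniform-cost search, which is why the per-operation costs look identical; the quoted exponential bounds are about how many nodes can end up in the open and closed sets, not about the cost of each individual queue operation.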

Related

Recursive and Iterative DFS Algorithm Time Complexity and Space Complexity

So I am having some problems understanding why the time complexity of a recursive DFS and an iterative DFS is the same. Perhaps someone can guide me through an easy explanation?
Thanks in advance.
A recursive implementation requires, in the worst case, a number of stack frames (invocations of subroutines that have not finished running yet) proportional to the number of vertices in the graph. This worst-case bound is reached on, e.g., a path graph if we start at one end.
An iterative implementation requires, in the worst case, a number of stack entries proportional to the number of vertices in the graph. The same inputs reach this worst-case bound as for the recursive implementation. In both cases the running time is O(|V| + |E|), because every vertex is visited once and every edge is examined a constant number of times; only the bookkeeping (call stack versus explicit stack) differs.
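A minimal sketch of both variants, assuming an adjacency-list graph given as a dict of node -> list of neighbors; both do the same total amount of work and only differ in where the pending vertices are stored:

```python
def dfs_recursive(graph, node, visited=None):
    """Recursive DFS: the call stack holds one frame per vertex on the
    current path, so it can grow to |V| frames on a path graph."""
    if visited is None:
        visited = set()
    visited.add(node)
    for nbr in graph[node]:
        if nbr not in visited:
            dfs_recursive(graph, nbr, visited)
    return visited

def dfs_iterative(graph, start):
    """Iterative DFS: an explicit stack replaces the call stack;
    unvisited neighbors are pushed and duplicates skipped on pop."""
    visited = set()
    stack = [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        for nbr in graph[node]:
            if nbr not in visited:
                stack.append(nbr)
    return visited
```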

Purely functional amortized constant time `decrease-key` for priority queues in Dijkstra?

Dijkstra's shortest-paths algorithm is O(E log V) when we re-insert nodes into the priority queue. However, if we use a constant-time decrease_key, and the Fibonacci heap is advertised for this, then the algorithm is O(V log V) because the main loop is not entered for every edge, only for every vertex. Is this possible in a purely functional setting? And how -- isn't searching for the node whose key we want to decrease already O(log n)? I have found on the internet that 2-3 finger trees can be used to implement amortized O(1) decrease_key; do Haskell's or OCaml's implementations support it?
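For comparison, the re-insertion variant mentioned at the start of the question needs no decrease_key at all: a vertex may be pushed several times and outdated entries are simply skipped when popped. A minimal imperative sketch in Python (the graph format is an illustrative assumption), just to pin down what the O(E log V) version looks like:

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra without decrease-key (lazy deletion): graph maps
    node -> [(neighbor, weight), ...] with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]
    done = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in done:
            continue                      # stale entry from an earlier, worse push
        done.add(node)
        for nbr, w in graph[node]:
            nd = d + w
            if nbr not in dist or nd < dist[nbr]:
                dist[nbr] = nd            # record the improvement
                heapq.heappush(heap, (nd, nbr))
    return dist
```

The heap may hold up to O(E) entries, which is where the E log V factor comes from; a true decrease_key keeps it at O(V) entries instead.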

A* time complexity

Wikipedia says the following on A*'s complexity:
The time complexity of A* depends on the heuristic. In the worst case,
the number of nodes expanded is exponential in the length of the
solution (the shortest path), but it is polynomial when the search
space is a tree...
And my question is: "Is A*'s time complexity exponential? Or is it not a time complexity, but memory complexity?"
If it is memory complexity, what time complexity does A* have?
In the worst case, A*'s time complexity is exponential.
But consider h(n), the estimated remaining distance, and h*(n), the exact remaining distance.
If the condition |h(n) - h*(n)| = O(log h*(n)) holds, that is, if the error of our estimate grows at most logarithmically with the true remaining distance, then A*'s time complexity will be polynomial.
Sadly, for most problems the estimation error grows at least linearly, so in practice faster alternatives are preferred, the price being that optimality is no longer guaranteed.
Since each expanded node is stored to avoid visiting the same node multiple times, the exponential growth of the number of expanded nodes implies exponential time and space complexity.
Please note that exponential space complexity necessarily implies exponential time complexity (just writing exponentially many nodes to memory already takes exponential time), but the converse is not true.
Is A*'s exponential complexity a time complexity or a memory complexity?
That extract from Wikipedia suggests that it is referring to time complexity.

What is the A* time complexity and how is it derived?

I was wondering if anyone could explain the A* time complexity.
I am using a heuristic that uses Euclidean distance for the estimate of the weight. There are no loops in the heuristic function.
So I think that the time complexity of the heuristic is O(1).
Taking this into account, what would the A* complexity be and how is that derived?
You can find your answer here:
Why is the complexity of A* exponential in memory?
The time complexity is of the same order as the memory complexity.
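To make the O(1) claim from the question concrete, here is a minimal sketch of a Euclidean-distance heuristic (the coordinate lookup table is an illustrative assumption): each call performs a fixed amount of arithmetic, so the heuristic does not change the asymptotic analysis, which is dominated by how many nodes A* expands.

```python
import math

def euclidean_heuristic(node, goal, coords):
    """Straight-line distance between node and goal; coords maps each
    node to its (x, y) position. Constant work per call."""
    x1, y1 = coords[node]
    x2, y2 = coords[goal]
    return math.hypot(x2 - x1, y2 - y1)
```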

What is the time and space complexity of a breadth first and depth first tree traversal?

Can someone explain with an example how we can calculate the time and space complexity of both these traversal methods?
Also, how does recursive solution to depth first traversal affect the time and space complexity?
BFS:
Time complexity is O(|V|), where |V| is the number of nodes. You need to traverse all nodes.
Space complexity is O(|V|) as well - since at worst case you need to hold all vertices in the queue.
DFS:
Time complexity is again O(|V|), you need to traverse all nodes.
Space complexity - depends on the implementation; a recursive implementation can have an O(h) space complexity [worst case], where h is the maximal depth of your tree.
Using an iterative solution with a stack is actually the same as BFS, just using a stack instead of a queue - so you get both O(|V|) time and space complexity.
(*) Note that the space complexity and time complexity are a bit different for a tree than for a general graph because you do not need to maintain a visited set for a tree, and |E| = O(|V|), so the |E| factor is actually redundant.
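A minimal sketch of both traversals on a binary tree, assuming a simple Node class with left/right children; as the answer notes, swapping the queue for a stack is the only structural difference:

```python
from collections import deque

class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def bfs(root):
    """Level-order traversal: the queue never holds more than one level."""
    if root is None:
        return []
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        order.append(node.value)
        for child in (node.left, node.right):
            if child is not None:
                queue.append(child)
    return order

def dfs(root):
    """Iterative depth-first traversal: the same loop with a stack."""
    if root is None:
        return []
    order, stack = [], [root]
    while stack:
        node = stack.pop()
        order.append(node.value)
        for child in (node.left, node.right):
            if child is not None:
                stack.append(child)
    return order
```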
DFS and BFS time complexity: O(n)
Because this is tree traversal, we must touch every node, making this O(n) where n is the number of nodes in the tree.
BFS space complexity: O(n)
BFS will have to store at least an entire level of the tree in the queue (sample queue implementation). With a perfect fully balanced binary tree, this would be (n/2 + 1) nodes (the very last level). Best Case (in this context), the tree is severely unbalanced and contains only 1 element at each level and the space complexity is O(1). Worst Case would be storing (n - 1) nodes with a fairly useless N-ary tree where all but the root node are located at the second level.
DFS space complexity: O(d)
Regardless of the implementation (recursive or iterative), the stack (implicit or explicit) will contain d nodes, where d is the maximum depth of the tree. With a balanced tree, this would be (log n) nodes. Worst Case for DFS will be the best case for BFS, and the Best Case for DFS will be the worst case for BFS.
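One hypothetical way to see those two numbers side by side: build a perfect binary tree (stored implicitly via heap-style indices, an assumption made only for this experiment) and measure the peak queue length during BFS against the peak recursion depth during DFS.

```python
from collections import deque

def children(i, n):
    """Children of node i in a perfect binary tree on indices 0..n-1."""
    return [c for c in (2 * i + 1, 2 * i + 2) if c < n]

def bfs_peak_queue(n):
    """Largest queue size observed during a level-order traversal."""
    queue, peak = deque([0]), 1
    while queue:
        node = queue.popleft()
        queue.extend(children(node, n))
        peak = max(peak, len(queue))
    return peak

def dfs_peak_depth(n, node=0):
    """Number of nodes on the deepest root-to-leaf path reached by DFS."""
    return 1 + max((dfs_peak_depth(n, c) for c in children(node, n)), default=0)

# For n = 2**10 - 1 = 1023 nodes the BFS queue peaks at 512 (the whole last
# level, roughly n/2), while DFS never goes deeper than 10 (about log2 n).
print(bfs_peak_queue(1023), dfs_peak_depth(1023))
```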
There are two major factors of complexity
Time Complexity
Space complexity
Time Complexity
It is the amount of time needed to generate the nodes.
In DFS the amount of time needed is proportional to the depth and the branching factor. For DFS the total amount of time needed is given by:
1 + b + b^2 + b^3 + ... + b^d ≈ b^d
Thus the time complexity is O(b^d).
Space complexity
It is the amount of space or memory required for getting a solution.
DFS stores only the current path it is pursuing, hence the space complexity is a linear function of the depth.
So the space complexity is O(d).
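For completeness, the closed form of that geometric series (standard algebra, not spelled out in the original answer) is

1 + b + b^2 + ... + b^d = (b^(d+1) - 1) / (b - 1), which is Θ(b^d) for b > 1,

so summing the nodes level by level really does give the O(b^d) bound stated above.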

Resources