In the A* path search algorithm, the general definition of a consistent heuristic is h(m) <= h(n) + d(m, n) for any edge (m, n).
Is this also true for an undirected graph? In an undirected graph (m, n) = (n, m) and d(m, n) = d(n, m), so it will also be true that h(n) <= h(m) + d(n, m). This would mean that h(n) = h(m) for all m and n, but that seems absurd.
Where am I going wrong? Maybe in an undirected graph the consistency condition is h(m) <= h(n) + d(m, n) only for m a successor of n?
From what I understand, your reasoning is correct but your maths are not.
Yes, for an undirected graph you get two inequalities:
h(m) <= h(n) + d(m, n)
h(n) <= h(m) + d(m, n)
This does not imply that h(n) = h(m). If you rearrange the d term in the latter inequality, you get
h(n) - d(m, n) <= h(m)
so you get
h(n) - d(m, n) <= h(n) + d(m, n)
so
-d(m, n) <= d(m, n).
The only way in which these inequalities force h(n) = h(m) is if d(m, n) = 0. In general, the two together only say that |h(m) - h(n)| <= d(m, n): the heuristic cannot change faster than the edge cost.
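To make this concrete, here is a minimal sketch (the three-node graph and the h values are invented for illustration, not taken from the question): both consistency inequalities hold on every undirected edge, yet h is far from constant.

    # Hypothetical example: path graph A - B - C with goal C, and h equal to
    # the true distance to C (a consistent heuristic).
    edges = {("A", "B"): 2, ("B", "C"): 3}
    h = {"A": 5, "B": 3, "C": 0}

    for (m, n), d in edges.items():
        assert h[m] <= h[n] + d       # h(m) <= h(n) + d(m, n)
        assert h[n] <= h[m] + d       # h(n) <= h(m) + d(m, n)
        assert abs(h[m] - h[n]) <= d  # all they imply: |h(m) - h(n)| <= d(m, n)

    print("consistent on every edge, yet h(A) != h(B):", h["A"], h["B"])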
The time complexity of BFS or DFS on a graph is O(V+E), because we traverse all the nodes and edges of the graph (I get that). But for a binary tree, the time complexity of BFS and DFS is O(V). Why is that?
I am assuming it is because of the following: O(V+E) = O(V + V-1) = O(2V) = O(V). Is this the correct reasoning? If not, an intuitive explanation would be much appreciated. Thanks
All trees have n - 1 edges, n being the number of nodes. The time complexity is still technically O(V + E), but that equates to O(n + (n-1)) = O(n).
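As a quick sanity check, here is a small sketch (the tree shape is arbitrary): a BFS over a tree dequeues each of the n nodes once and crosses each of the n - 1 edges once.

    from collections import deque

    # Hypothetical 6-node binary tree as an adjacency list of children.
    children = {1: [2, 3], 2: [4, 5], 3: [6], 4: [], 5: [], 6: []}

    visited_nodes, traversed_edges = 0, 0
    queue = deque([1])
    while queue:
        node = queue.popleft()
        visited_nodes += 1        # each node is dequeued exactly once
        for child in children[node]:
            traversed_edges += 1  # each edge is crossed exactly once
            queue.append(child)

    print(visited_nodes, traversed_edges)  # 6 and 5: E = V - 1, so O(V + E) = O(V)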
You can actually see it in a different way, without using graphs.
Let n be the number of nodes, and denote by f(n) the number of steps required to traverse the whole tree (the time complexity is then O(f(n))).
Consider that, during the traversal, for each node we: visit it, descend into its left subtree, descend into its right subtree, and eventually return to it.
Each of these 4 operations happens at most once per node, and we have n nodes, so we deduce that f(n) <= 4n.
Obviously, at the same time, n <= f(n)
because we need to visit each node at least once.
Therefore,
n <= f(n) <= 4n
Applying O notation to this sandwich, and remembering that O(4n) = O(n) by the properties of O (invariance under multiplication by a nonzero constant), f(n) <= 4n tells us that f(n) is O(n), while n <= f(n) tells us it is also Ω(n).
So f(n) = Θ(n), meaning that the complexity is O(n).
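As a sanity check, here is a small sketch (the tuple encoding of the tree is my own choice) that tallies exactly those 4 operations per node:

    # Each node is a (value, left, right) tuple; None is an empty subtree.
    def count_ops(node):
        if node is None:
            return 0
        _value, left, right = node
        # visit (1) + descend left (1) + descend right (1) + return (1),
        # plus whatever the two subtrees cost
        return 4 + count_ops(left) + count_ops(right)

    tree = (1, (2, None, None), (3, (4, None, None), None))  # n = 4 nodes
    n, f_n = 4, count_ops(tree)
    print(n <= f_n <= 4 * n, f_n)  # True 16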
I have read that in a dense graph the number of edges is (n^2), and I don't know how.
If I have a graph in which every node is connected to all the other nodes, then the number of edges will be ((n-1) + (n-2) + (n-3) + ... + 1). So how is the number of edges in a dense graph (n^2)?
It depends on whether your graph is directed. In an undirected dense graph, the number of edges is (n · (n − 1) / 2) (which is equal to your series). In a directed graph, the number is double that, so just (n · (n − 1)).
This is not exactly (n²), but very close to it. You can say that n² is an upper bound, so it is maybe more appropriate to say O(n²) if that makes sense in the context.
It's Big O notation; maybe what they mean is the complexity when you do a graph traversal.
In Big O notation: O(n²/2) = O(n²).
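A few throwaway lines (n = 5 is an arbitrary choice) confirm that the series and the closed form agree, and that n^2 is only an upper bound:

    n = 5
    series = sum(range(1, n))      # (n-1) + (n-2) + ... + 1
    undirected = n * (n - 1) // 2  # closed form for a complete undirected graph
    directed = n * (n - 1)         # directed: every ordered pair of nodes
    print(series, undirected, directed, n * n)  # 10 10 20 25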
A compatible (consistent) heuristic h is one that satisfies the condition below:
h(n) <= c(n,a,n') + h(n')
An admissible heuristic h is one that satisfies the condition below:
0 <= h(n) <= h*(n)
where h*(n) is the real distance from node n to the goal.
If a heuristic is compatible, how can I prove that it is admissible?
Thanks a lot.
Assume that h(n) is not admissible, so there exists some vertex n such that h(n) > h*(n).
But because of the compatibility of h(n), we know that for all n' it holds that h(n) <= c(n,a,n') + h(n').
Now combine these two predicates when n' is the goal vertex G to deduce a contradiction, thus proving the required lemma by reductio ad absurdum.
If you add an additional condition on h (namely that h(goal) = 0), you can prove it by induction over the minimum cost path from n to the goal state.
For the base case, the minimum cost path has cost 0, when n = goal. Then h(goal) = 0 = h*(goal).
For the general case, let n be a node and let n' be the next node on a minimal path from n to goal. Then h*(n) = c(n, n') + h*(n') >= c(n, n') + h(n') >= h(n) using the induction hypothesis to get the first inequality and the definition of compatibility for the second.
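The lemma is also easy to spot-check numerically (a sketch; the graph, the weights, and the h values below are all invented): compute h*(n) with Dijkstra from the goal and verify both properties.

    import heapq

    # Hypothetical undirected weighted graph with goal "G", and an h chosen
    # to be compatible (consistent) with h(G) = 0.
    graph = {
        "A": {"B": 1, "C": 4},
        "B": {"A": 1, "C": 2, "G": 6},
        "C": {"A": 4, "B": 2, "G": 3},
        "G": {"B": 6, "C": 3},
    }
    h = {"A": 3, "B": 4, "C": 2, "G": 0}

    def distances_from(source):
        # Dijkstra; since the graph is undirected, the distances from the
        # goal are the real costs h*(n) from each node n to the goal.
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue
            for v, w in graph[u].items():
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(heap, (d + w, v))
        return dist

    h_star = distances_from("G")
    for n, nbrs in graph.items():
        for n2, c in nbrs.items():
            assert h[n] <= c + h[n2]   # compatibility on every edge
    for n in graph:
        assert 0 <= h[n] <= h_star[n]  # admissibility follows
    print("compatible and admissible; h* =", h_star)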
An algorithm decomposes (divides) a problem of size n into b sub-problems, each of size n/b, where b is an integer. The cost of decomposition is n, and C(1) = 1. Show, using repeated substitution, that for all values of b >= 2, the complexity of the algorithm is O(n lg n).
This is what I use for my initial equation: C(n) = C(n/b) + n,
and after k steps of substitution I get C(n) = C(n/b^k) + n · [sum from i=0 to k-1 of (1/b)^i],
with k = log_b n.
I'm not sure I'm getting all of this right, because when I finish I don't get n lg n. Can anybody help me figure out what to do?
I think your recurrence is wrong. Since there are b separate subproblems of size n/b, there should be a coefficient of b in front of the C(n / b) term. The recurrence should be
C(1) = 1
C(n) = b C(n/b) + O(n).
Using the Master Theorem, this solves to O(n log n). Another way to see this is that after expanding the recurrence k times, we get
C(n) = b^k C(n / b^k) + kn
This terminates when k = log_b n. Plugging in that value of k and simplifying yields a value that is O(n log n).
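A quick numerical check of this (a sketch; b = 2 and the sample sizes are arbitrary choices) shows the ratio C(n) / (n log n) settling toward a constant, as an O(n log n) solution should:

    import math

    def C(n, b=2):
        # corrected recurrence: C(1) = 1, C(n) = b * C(n/b) + n
        if n <= 1:
            return 1
        return b * C(n // b, b) + n

    for k in (4, 8, 12, 16, 20):
        n = 2 ** k
        print(n, round(C(n) / (n * math.log2(n)), 3))  # ratio tends to 1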
Hope this helps!
Assume an arbitrary r and
T(n) <= cn + T(n/r) + T(3n/4).
Show that T(n) <= Dcn for some constant D.
Then, by reworking the induction proof, use the expression to argue that
T(n) <= Dcn does not hold for r = 3.
Have a look at the Akra-Bazzi theorem. This is a generalization of the Master Theorem that does not require the subproblems to have equal size.
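To sketch how the theorem applies here (the calculations for specific values of r are my own working under the standard statement of Akra-Bazzi, not part of this answer): choose p so that the branch weights sum to 1, then evaluate the integral.

    % Akra-Bazzi for T(n) = T(n/r) + T(3n/4) + cn,
    % i.e. a_1 = a_2 = 1, b_1 = 1/r, b_2 = 3/4, g(u) = cu:
    \left(\tfrac{1}{r}\right)^{p} + \left(\tfrac{3}{4}\right)^{p} = 1,
    \qquad
    T(n) = \Theta\!\left( n^{p} \left( 1 + \int_{1}^{n} \frac{c\,u}{u^{p+1}} \, du \right) \right)
    % For r > 4: 1/r + 3/4 < 1 forces p < 1, the integral is \Theta(n^{1-p}),
    %            and T(n) = \Theta(n), so a bound T(n) <= Dcn is possible.
    % For r = 3: 1/3 + 3/4 = 13/12 > 1 forces p > 1, so T(n) = \Theta(n^p)
    %            grows faster than n and no constant D gives T(n) <= Dcn.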