Linear time algorithm for longest path in tree

I have been given a question on an assignment that has got me stumped. I may just be thinking too hard about it... The question follows.
Give a linear time algorithm to determine the longest unweighted path in an acyclic undirected graph (that is, a tree).
My first intention is to go with a DFS. But it seems like a DFS would only give me the longest path from the node I start at to another vertex; however, the problem asks for the longest path in the tree... not the longest path from the node I start at. Could someone set me straight?
Thanks.

One such method is to pick any node, A, and in linear time compute distances to all other nodes. Suppose B is most distant from A. In step 2, find the node most distant from B.
Let d(P,Q) denote distance from P to Q. Note that if E is the lowest common ancestor of A, B, C, then d(A,B) = d(A,E)+d(E,B) and also note that d(E,B) ≥ d(E,C).
Edit 1: The algorithm or method – find B most distant from any A; find C most distant from B; claim that d(B,C) is maximal over all vertex pairs in the graph – seems to be sound, but the above does not prove it.
On one hand, it need not be that d(E,B) ≥ d(E,C), and on another, that alone would not be quite enough to establish d(B,C) ≥ d(F,G) where F, G are any nodes in the tree. ...
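Whatever the status of the proof, the two-pass procedure itself is straightforward to implement. Here is a minimal sketch in Python, assuming the tree is given as adjacency lists (the function names are mine):

from collections import deque

def farthest_from(adj, start):
    """One BFS from start; returns (farthest_node, distance, parent_map)."""
    dist = {start: 0}
    parent = {start: None}
    queue = deque([start])
    far = start
    while queue:
        u = queue.popleft()
        if dist[u] > dist[far]:
            far = u
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                parent[v] = u
                queue.append(v)
    return far, dist[far], parent

def longest_path(adj):
    """Two passes: B = farthest from an arbitrary node, C = farthest from B.
    Returns the vertex sequence of a longest path in the tree."""
    b, _, _ = farthest_from(adj, next(iter(adj)))
    c, _, parent = farthest_from(adj, b)
    path = [c]                       # walk the parent pointers back from C to B
    while parent[path[-1]] is not None:
        path.append(parent[path[-1]])
    return path

Each pass is a single BFS over the tree, so the whole procedure runs in linear time.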

Related

Proof of having k edge-disjoint paths there and back

I have been trying to prove this for a decent amount of time, but nothing is coming to my mind. Anybody have some tips? I'd be happy to see an explanation.
I also found this question on StackOverflow, which says:
If the direct u->v traffic doesn't knock out any links in v->u, then
the v->u problem is equivalent of finding the maximum flow from v->u
as if the u->v flow doesn't happen.
He describes how it can be solved, but still there is no answer to the question that the author asked.
So, the problem is:
Given a directed graph in which, at each vertex, the number of incoming and outgoing edges is the same, suppose there are k edge-disjoint paths from b to a in this graph.
How can I prove that it also contains k edge-disjoint paths from a to b?
Thanks in advance!
We can argue the general case where the graph is a multigraph (i.e. it can have parallel edges and loops).
Note: we follow the convention that two parallel copies of an edge count towards the in- and out-degrees twice, and a loop counts towards both the in- and out-degree once. We also assume that by paths you mean simple paths.
Using induction on the number of vertices in the graph G.
Base case: G has only vertices a and b.
Since there are k edge-disjoint paths from a to b, all of them are simply k copies of the edge a->b. Thus, for both vertices to have equal in- and out-degrees, there must be k copies of b->a, and the claim holds.
Inductive step: G has n >= 1 vertices apart from a and b.
Let u be one of these extra vertices. Let the in-degree of u, which equals its out-degree, be d. Let the d edges going into u come from vertices s1, s2, ..., sd and the d edges going out of u go to vertices t1, t2, ..., td (note that these vertices need not all be distinct). Pair these vertices up, say si with ti (1 <= i <= d). Now "short-circuit" the vertex u: rather than having the edges si -> u -> ti, just have si -> ti. Let the new graph be G'. It is easy to see that the in- and out-degrees of the remaining vertices are the same in G' as they were in G, and it is not hard to argue that G' still has k edge-disjoint paths from a to b. Since G' has one less vertex, the inductive hypothesis applies and the claim holds for G'. Finally, it is not hard to check that un-short-circuiting u keeps the claim intact for G.
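To make the short-circuit step concrete, here is a small sketch in Python (my own naming; the multigraph is represented as a plain list of directed edges, so parallel edges are just repeated pairs):

def short_circuit(edges, u):
    """Remove vertex u from a degree-balanced multigraph, pairing each edge
    into u with one edge out of u, as in the inductive step above.
    Loops at u simply disappear; all other edges are kept."""
    into = [x for (x, y) in edges if y == u and x != u]    # s_1, ..., s_d
    out_of = [y for (x, y) in edges if x == u and y != u]  # t_1, ..., t_d
    rest = [(x, y) for (x, y) in edges if x != u and y != u]
    # Balance at u means len(into) == len(out_of); the pairing is arbitrary.
    return rest + list(zip(into, out_of))

The pairing of si with ti is arbitrary, exactly as in the argument above, and removing the loops at u does not affect any other vertex's degrees.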

Find Two vertices with lowest path weight

I am trying to solve this question but got stuck.
Need some help, thanks.
Given an undirected connected graph G with non-negative edge weights.
Let A be a subset of V(G), where V(G) is the set of vertices of G.
Find a pair of vertices (a, b), both belonging to A, such that the weight of the shortest path between them in G is minimal, in O((E+V)*log(V)).
I had the idea of using Dijkstra's algorithm from each node, which gives O(V*(E+V)*log(V)), which is too much.
So I thought about connecting the vertices in A somehow, but didn't find any useful way.
I also tried changing the way Dijkstra's algorithm works, but it gets too hard to prove correct, with no improvement in the time complexity.
Note that if the optimal pair is (a, b), then from every node u in the optimal path, a and b are the closest two nodes in A.
I believe we should extend Dijkstra's algorithm in the following manners:
Start with all nodes in A, instead of a single source_node.
For each node, don't just remember the shortest_distance and the previous_node, but also the closest_source_node to remember which node in A gave the shortest distance.
Also, for each node, remember the second_shortest_distance, the second_closest_source_node, and previous_for_second_closest_source_node (shorter name suggestions are welcome). Make sure that second_closest_source_node is never the closest_source_node. Also, think carefully about how you update these variables: the optimal path for a node can become part of the second-best path for its neighbour.
Visit the entire graph; don't just stop at the first node whose closest_source and second_closest_source are found.
Once the entire graph is covered, search for the node whose shortest_distance + second_shortest_distance is smallest.
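A sketch of the multi-source idea in Python follows. It uses slightly simpler bookkeeping than described above (one owning source per node plus a final scan over the edges, rather than a second-best source per node), but it is the same single multi-source Dijkstra and stays within O((E+V)*log(V)); the names are mine, and vertices are assumed hashable and comparable:

import heapq

def closest_pair_in_A(adj, A):
    """adj: dict mapping node -> list of (neighbour, weight); A: the special
    vertex set. Returns (distance, a, b) for a closest pair a, b in A,
    or None if no such pair is connected."""
    dist, owner = {}, {}
    pq = [(0, a, a) for a in A]            # (distance, node, owning source)
    heapq.heapify(pq)
    while pq:                              # grow all |A| search trees at once
        d, u, src = heapq.heappop(pq)
        if u in dist:
            continue
        dist[u], owner[u] = d, src
        for v, w in adj[u]:
            if v not in dist:
                heapq.heappush(pq, (d + w, v, src))
    best = None
    for u in adj:                          # best edge joining two differently owned nodes
        if u not in dist:
            continue
        for v, w in adj[u]:
            if v in dist and owner[u] != owner[v]:
                cand = dist[u] + w + dist[v]
                if best is None or cand < best[0]:
                    best = (cand, owner[u], owner[v])
    return best

The final edge scan works because, along a shortest path between the optimal pair, there must be some edge whose two endpoints are owned by different sources, and the recorded distances of those endpoints can only underestimate their distances to that pair.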

Designing an Algorithm to find the length of a simple cycle in a d-regular graph

I understand the question in general but don't know how to design and analyze the algorithm in the question. I was thinking of applying some sort of graph search algorithm like depth-first / breadth-first search.
UPDATE: This is what I have tried: starting from any node of the graph (call it N), visit each of that node's d neighbors. Then, from the last neighbor of N that we visited (call it L), visit any neighbor of L that is not N?
Others have already hinted at a possible solution in the comments; let's elaborate. When d<=1, the solutions are immediate (and depend on your exact definition of cycle), so I'll assume d>1.
One such algorithm would be:
1. Build a path starting in any vertex V. Until the path has length d, don't allow vertices you've already visited.
2. Once the path is d vertices long, keep adding vertices to the path, but now only allow vertices different from the last d vertices of the path.
3. When you add a vertex that's already been used in the path, stop. You create the resulting cycle from a segment of the path starting and ending in that vertex.
In both (1) and (2), the existence of such a vertex is guaranteed by the fact that G is d-regular. When searching for the vertex to add, we only exclude the last d vertices, namely the last vertex (U) and its d-1 predecessors. U has d neighbors, so at least one of them has to be available.
The algorithm will stop, because of the condition (3) and the fact that G is finite.
It makes sense to prefer already visited vertices in (2), but it doesn't change the worst-case complexity.
This gives us a worst-case complexity of O(n*d), since we may have to visit every vertex once and check all of its edges.
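Here is a sketch of that procedure in Python (assuming a simple d-regular graph with d > 1 given as adjacency lists; the names are mine):

def cycle_in_d_regular(adj, d):
    """Grow a path, never revisiting the last d vertices; as soon as a
    neighbour that is already on the path (outside the last d) appears,
    close the cycle there. Returns the vertices of a cycle of length >= d+1."""
    start = next(iter(adj))
    path = [start]
    on_path = {start}
    while True:
        u = path[-1]
        forbidden = set(path[-d:])       # rules (1)/(2): skip the last d vertices
        # Rule (3): an allowed neighbour already on the path closes a cycle.
        closing = [v for v in adj[u] if v in on_path and v not in forbidden]
        if closing:
            v = closing[0]
            return path[path.index(v):]  # the segment from v's occurrence to u
        # Otherwise extend; d-regularity guarantees an allowed neighbour exists,
        # since 'forbidden' contains at most d-1 of u's neighbours.
        v = next(x for x in adj[u] if x not in forbidden)
        path.append(v)
        on_path.add(v)

Since the closing vertex is never among the last d vertices of the path, the returned cycle has at least d + 1 vertices.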

Minimizing the number of connected-checks in finding a shortest path in an implicit graph

I'm quite surprised I couldn't find anything on this anywhere; it seems to be a problem that should be quite well known:
Consider the Euclidean shortest path problem, in two dimensions. Given a set of obstacle polygons P and two points a and b, we want to find the shortest path from a to b not intersecting (the interior of) any p in P.
To solve this, one can create the visibility graph for this problem: the graph whose nodes are the vertices of the elements of P, and where two nodes are connected if the straight line between them does not intersect any element of P. The weight of any such edge is simply the Euclidean distance between the two points. One can then determine the shortest path from a to b in this graph, say with A*.
However, this is not a good approach. Creating the visibility graph in advance requires checking, for every pair of vertices from any two polygons, whether they are connected, and this check has higher complexity than determining the distance between two nodes. So working with a modified version of A* that "does everything it can before checking if two nodes are actually connected" actually speeds things up.
Still, A* and all other shortest path algorithms always start with an explicitly given graph for which adjacent vertices can be traversed cheaply. So my question is: is there a good (optimal?) algorithm for finding a shortest path between two nodes a and b in an "implicit graph" that minimizes checking whether two nodes are connected?
Edit:
To clarify what I mean, this is an example of what I'm looking for:
Let V be a set, and a, b elements of V. Suppose w: V x V -> D is a weighting function (into some linearly ordered set D) and c: V x V -> {true, false} returns true iff two elements of V are considered to be connected. Then the following algorithm finds the shortest path from a to b in V, i.e., returns a list [x_i | i < n] such that x_0 = a, x_{n-1} = b, and c(x_i, x_{i+1}) = true for all i < n - 1.
Let (V, E) be the complete graph with vertex set V.
do
    compute the shortest path from a to b in (V, E) under w and put it in P = [p_0, ..., p_{n-1}]
    if P is empty (there is no path left), return NoShortestPath
    let all_good = true
    for i = 0 ... n - 2 do
        if c(p_i, p_{i+1}) == false, remove the edge (p_i, p_{i+1}) from E, set all_good = false and exit the for loop
while all_good == false
For computing the shortest paths in the loop, one could use A* if an appropriate heuristic exists. Obviously this algorithm produces a shortest path from a to b.
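For illustration, here is roughly what that loop looks like as runnable Python (a sketch only: w and c stand for the weight and connectivity oracles defined above, vertices are assumed hashable and comparable, and the inner search is a plain Dijkstra over the surviving edges of the complete graph):

import heapq

def repair_loop_shortest_path(V, a, b, w, c):
    """Treat (V, E) as complete, repeatedly find a shortest path under w,
    call c only on the edges of that path, and delete any edge c rejects.
    Returns the list of vertices of the path, or None if none survives."""
    removed = set()                               # edges that c has rejected

    def dijkstra():
        dist, parent, done = {a: 0}, {a: None}, set()
        pq = [(0, a)]
        while pq:
            d, u = heapq.heappop(pq)
            if u in done:
                continue
            done.add(u)
            if u == b:
                break
            for v in V:                           # complete graph: every other vertex is a neighbour
                if v == u or (u, v) in removed:
                    continue
                nd = d + w(u, v)
                if v not in dist or nd < dist[v]:
                    dist[v], parent[v] = nd, u
                    heapq.heappush(pq, (nd, v))
        if b not in parent:
            return None
        path, x = [], b
        while x is not None:
            path.append(x)
            x = parent[x]
        return path[::-1]

    while True:
        path = dijkstra()
        if path is None:
            return None
        bad = next(((u, v) for u, v in zip(path, path[1:]) if not c(u, v)), None)
        if bad is None:
            return path                           # every edge on the path is real
        removed.add(bad)
        removed.add((bad[1], bad[0]))             # the graph is undirected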
Also, I suppose this algorithm is in some sense optimal in calling c as rarely as possible: for the shortest path it finds, it must have ruled out all shorter paths that the weight function w would have allowed for.
But surely there is a better way?
Edit 2:
So I found a solution that works relatively well for what I'm trying to do: using A*, when relaxing a node, instead of going through the neighbors and adding them to / updating them in the priority queue, I put all vertices into the priority queue, marked as hypothetical, together with hypothetical f and g values and the hypothetical parent. Then, when picking the next element from the priority queue, I check whether the node's connection to its parent is actually given. If so, the node is processed as normal; if not, it is discarded.
This greatly reduces the number of connectivity checks and improves performance for me a lot. But I'm sure there's still a more elegant way, in particular one where the "hypothetical new path" doesn't just extend by length one (parents are always actual, not hypothetical).
A* and Dijkstra's algorithm do not need an explicit graph to work; they actually only need:
A source vertex s
A function next:V->2^V such that next(v)={u | there is an edge from v to u }
A function isGoal:V->{0,1} such that isGoal(v) = 1 iff v is a target node.
A weight function w:E->R such that w(u,v)= cost to move from u to v
And, of course, in addition A* is going to need a heuristic function h:V->R such that h(v) approximates the cost from v to a goal.
With these functions, you can generate only the portion of the graph that is needed to find the shortest path, on the fly.
In fact, the A* algorithm is often used on infinite graphs (or huge graphs that do not fit in any existing storage) in artificial intelligence problems using this approach.
The idea is that in A* you only ever look at the edges leaving a given node (all (u,v) in E for some given u). You don't need the entire edge set E in order to do this; you can just use your next(u) function instead.
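A minimal sketch of that interface in Python (the naming is mine; Dijkstra is shown for brevity, and A* would only add h(v) to the priority key):

import heapq

def implicit_dijkstra(source, next_fn, is_goal, w):
    """Dijkstra on an implicit graph: the only access to the graph is
    next_fn(v) -> iterable of successors and w(u, v) -> edge cost."""
    dist = {source: 0}
    parent = {source: None}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        if is_goal(u):
            path, x = [], u
            while x is not None:          # rebuild the path from the parents
                path.append(x)
                x = parent[x]
            return d, path[::-1]
        for v in next_fn(u):              # edges are generated on the fly
            nd = d + w(u, v)
            if nd < dist.get(v, float("inf")):
                dist[v], parent[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return None

Only the part of the graph actually reached by the search is ever materialised; nothing else about the edge set E is needed.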

Minimum sum weight of connecting 3 vertices in an undirected, weighted graph, with only positive edge weights

I'm looking for pointers as to where one could start looking for a solution to this problem.
After googling for some time, the only problem I have found which is similar to mine is the minimum spanning tree. The difference is that I am not looking for a tree that spans all the vertices in a graph, but rather one that spans 3 given vertices.
I am not looking for a complete program, but a pointer in the general direction of the answer.
Another idea I had was to run 3 searches with the Dijkstra's algorithm. The idea was to somehow find the best path by combining the different shortest paths. I do not know how this would be done.
Here is a graphical example of the type of graph I am talking about:
So the task is to find a way to compute the minimum sum weight of connecting any 3 vertices in this kind of graph.
EDIT :
I solved the problem by running 3 searches with Dijkstra's algorithm. Then I found the vertex which had the minimum sum weight connecting the 3 vertices by adding together all unique edges. Thanks for all the help :)
I'm pretty sure you can do this with Dijkstra's algorithm; the only trick is that you don't know what order to visit the nodes in, so you'd have to try all 6 orderings.
So if you've got nodes A, B, and C, for the first ordering A, B, C, you'd run Dijkstra's between A and B, between B and C, and between C and A. Then you'd do the next ordering A, C, B. And keep going from there.
Algorithm 1
I think that your idea of using Dijkstra is good.
One way in which you could make this work is by trying every vertex x as a start point and computing the smallest value of the sum w(x,a)+w(x,b)+w(x,c), where a, b, c are the 3 vertices you wish to connect and w(u,v) is the shortest-path distance computed with Dijkstra.
I believe this smallest sum will be the minimum sum weight to connect the 3 vertices.
Unfortunately, this means running Dijkstra 3n times.
Algorithm 2
A better approach is to run Dijkstra from each of your nodes to be connected, and store the distances to every node in your graph. (So wa(x) is the shortest distance from x to a, etc.)
You can then iterate over every vertex x as before and compute the smallest value of the sum wa(x)+wb(x)+wc(x).
This is equivalent to algorithm 1, but n times faster as Dijkstra is only run 3 times.
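A short sketch of Algorithm 2 (assuming an adjacency-list representation; the helper dijkstra returns a dictionary of shortest distances from its source):

import heapq

def dijkstra(adj, source):
    """Standard Dijkstra; adj maps node -> list of (neighbour, weight).
    Returns a dict of shortest distances from source."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                      # stale entry
        for v, w in adj[u]:
            nd = d + w
            if v not in dist or nd < dist[v]:
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def min_connect_three(adj, a, b, c):
    """Run Dijkstra from each of a, b, c and pick the meeting vertex x
    minimising wa(x) + wb(x) + wc(x)."""
    da, db, dc = dijkstra(adj, a), dijkstra(adj, b), dijkstra(adj, c)
    return min(da[x] + db[x] + dc[x]
               for x in adj if x in da and x in db and x in dc)

The union of the three shortest paths from a, b and c to the minimising x connects the three vertices with that minimum total weight.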
With the restrictions that the weights are all positive and the graph is undirected, you can solve the problem using Dijkstra's algorithm, as suggested. Let us say that the nodes in question are A, B, and C, all in some graph G.
Run Dijkstra's on G from:
A -> B
B -> C
C -> A
These form the edges of a triangle connecting the three vertices.
We can do this because of the condition that the graph is undirected which implies that the shortest path from A -> B is the same as the one from B -> A.
Because the weights are all positive, the shortest path connecting A, B, and C will contain precisely two edges. (Assuming you are happy ignoring the possible alternate solution of a cycle arising from three 0 weight paths in the "triangle").
So how do we pick which two edges? Any two of them will connect all three vertices, so we can eliminate any one of the three; we will eliminate the longest one.
So this algorithm will do it in the same time complexity as Dijkstra's algorithm.
This looks like a special case of the minimum Steiner tree problem.
