How to find the minimum set of vertices in a Directed Graph such that all other vertices can be reached - algorithm

Given a directed graph, I need to find the minimum set of vertices from which all other vertices can be reached.
So the result of the function should be the smallest set of vertices from which all other vertices can be reached by following the directed edges.
The largest result possible would be if there were no edges, so all nodes would be returned.
If there are cycles in the graph, for each cycle, one node is selected. It does not matter which one, but it should be consistent if the algorithm is run again.
I am not sure whether there is an existing algorithm for this. If so, does it have a name? I have done some research, and the closest thing seems to be finding a mother vertex.
If it is that algorithm, could the actual algorithm be elaborated? The answer given in that link is kind of vague.
Given that I have to implement this in JavaScript, the preference would be a .js library or JavaScript example code.

From my understanding, this is just finding the strongly connected components in a graph. Kosaraju's algorithm is one of the neatest approaches to do this. It uses two depth-first searches, as opposed to some later algorithms that use just one, but I like it the most for its simple concept.
Edit: Just to expand on that, the minimum set of vertices is found as was suggested in the comments to this post:
1. Find the strongly connected components of the graph - reduce each component to a single vertex.
2. The remaining graph is a DAG (or set of DAGs if there were disconnected components), the root(s) of which form the required set of vertices.
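Since the question asks for JavaScript, here is a rough sketch of those two steps (my own illustration, not code from the linked answer). It uses Kosaraju's two depth-first passes to label the strongly connected components, then returns one representative vertex for every component that has no incoming edges from other components (the roots of the condensation).

    // Sketch of the SCC-condensation approach in plain JavaScript.
    // Assumes the graph is an adjacency list: graph[u] is an array of
    // vertices reachable from u by one edge.
    function minimumStartingVertices(graph) {
      const n = graph.length;

      // Kosaraju's algorithm, first pass: record vertices in finish order.
      const order = [];
      const visited = new Array(n).fill(false);
      function dfs1(u) {
        visited[u] = true;
        for (const v of graph[u]) if (!visited[v]) dfs1(v);
        order.push(u);
      }
      for (let u = 0; u < n; u++) if (!visited[u]) dfs1(u);

      // Second pass on the reversed graph, in reverse finish order.
      const reversed = Array.from({ length: n }, () => []);
      for (let u = 0; u < n; u++) for (const v of graph[u]) reversed[v].push(u);

      const comp = new Array(n).fill(-1);   // comp[u] = SCC index of u
      let componentCount = 0;
      function dfs2(u, c) {
        comp[u] = c;
        for (const v of reversed[u]) if (comp[v] === -1) dfs2(v, c);
      }
      for (let i = n - 1; i >= 0; i--) {
        const u = order[i];
        if (comp[u] === -1) dfs2(u, componentCount++);
      }

      // Condensation: an SCC is a root if no edge enters it from another SCC.
      const hasIncoming = new Array(componentCount).fill(false);
      for (let u = 0; u < n; u++)
        for (const v of graph[u])
          if (comp[u] !== comp[v]) hasIncoming[comp[v]] = true;

      // One representative vertex per root SCC. Taking the smallest vertex id
      // keeps the choice deterministic across runs, as the question requires.
      const representative = new Array(componentCount).fill(Infinity);
      for (let u = 0; u < n; u++)
        representative[comp[u]] = Math.min(representative[comp[u]], u);

      const result = [];
      for (let c = 0; c < componentCount; c++)
        if (!hasIncoming[c]) result.push(representative[c]);
      return result;
    }

    // Example: 0 -> 1 -> 2 -> 0 is a cycle, 3 is isolated; the answer is
    // vertex 3 plus one vertex from the cycle, here [3, 0].
    console.log(minimumStartingVertices([[1], [2], [0], []]));

For very large graphs the recursive depth-first searches should be replaced with an explicit stack to avoid exceeding the call-stack limit.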

[EDIT #2: As Jason Orendorff mentions in a comment, finding the feedback vertex set is overkill and will produce a vertex set larger than necessary in general. kyun's answer is (or will be, when he/she adds in the important info in the comments) the right way to do it.]
[EDIT: I had the two steps round the wrong way... Now we should guarantee minimality.]
Call all of the vertices with in-degree zero Z. No vertex in Z can be reached by any other vertex, so it must be included in the final set.
Using a depth-first (or breadth-first) traversal, trace out all the vertices reachable from each vertex in Z and delete them -- these are the vertices already "covered" by Z.
Every vertex in the remaining graph now has in-degree at least one (any vertex with an edge from a deleted vertex would itself have been deleted), so every remaining vertex lies on or is reachable from a directed cycle. Find a feedback vertex set F, which gives you a smallest-possible set of vertices whose removal would break every cycle in the graph. Unfortunately, as that Wikipedia link shows, this problem is NP-hard for directed graphs.
The set of vertices you're looking for is Z+F.
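For what it's worth, a quick JavaScript sketch of steps 1 and 2 (my own illustration; the feedback-vertex-set step is the NP-hard part and is not shown):

    // Sketch of steps 1-2 only: collect the in-degree-zero vertices Z and
    // delete everything reachable from them. Whatever remains still needs
    // the (NP-hard) feedback vertex set step described above.
    function coverFromSources(graph) {
      const n = graph.length;
      const indegree = new Array(n).fill(0);
      for (let u = 0; u < n; u++) for (const v of graph[u]) indegree[v]++;

      const Z = [];
      for (let u = 0; u < n; u++) if (indegree[u] === 0) Z.push(u);

      // Breadth-first traversal from every vertex in Z.
      const covered = new Array(n).fill(false);
      const queue = [...Z];
      for (const u of Z) covered[u] = true;
      while (queue.length > 0) {
        const u = queue.shift();
        for (const v of graph[u]) {
          if (!covered[v]) { covered[v] = true; queue.push(v); }
        }
      }

      const remaining = [];
      for (let u = 0; u < n; u++) if (!covered[u]) remaining.push(u);
      return { Z, remaining };   // `remaining` is what the feedback vertex set must handle
    }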

Related

Finding a Minimal set of vertices that satisfy the given constraints

Note: no need for formal proof or anything, just the general idea of the algorithm and I will go deeper myself.
Given a directed graph G(V,E), I want to find the smallest set of vertices T such that for each vertex t in T, no edge in {(t,v) | v outside T} exists, in O(V+E) time.
In other words, a vertex t is allowed to receive edges from vertices outside T, but not to send edges to them.
(You can think of it as phone calls: I am allowed to be called from outside, and it's free, but I am not allowed to call out from my side.)
This problem seems very close to finding all strongly connected components (SCCs) of a directed graph, whose time complexity is O(V+E), and I am thinking of building a new graph and running that algorithm, but I am not totally sure about it.
The main idea is to contract each strongly connected component (SCC) of G into a single vertex, while keeping a score of how many vertices were contracted to create each vertex of the contracted graph (the condensation of G). The resulting graph is a directed acyclic graph. The answer is the component whose condensation vertex has out-degree zero and the lowest score.
The answer is necessarily a union of strongly connected components because of the restriction on edges, and you can prove that only a single SCC is involved because of the minimality requirement.
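A sketch of that idea in JavaScript (my own illustration, using Kosaraju's algorithm for the SCC step): label the components, count how many vertices each one contains, and return the smallest component whose condensation vertex has out-degree zero.

    // Sketch: find the smallest SCC with no outgoing edges.
    // graph[u] is the list of vertices v with an edge u -> v.
    function smallestClosedComponent(graph) {
      const n = graph.length;

      // Kosaraju: finish order on the original graph...
      const order = [], seen = new Array(n).fill(false);
      const dfs1 = (u) => {
        seen[u] = true;
        for (const v of graph[u]) if (!seen[v]) dfs1(v);
        order.push(u);
      };
      for (let u = 0; u < n; u++) if (!seen[u]) dfs1(u);

      // ...then label components on the reversed graph in reverse finish order.
      const rev = Array.from({ length: n }, () => []);
      for (let u = 0; u < n; u++) for (const v of graph[u]) rev[v].push(u);
      const comp = new Array(n).fill(-1);
      let count = 0;
      const dfs2 = (u, c) => {
        comp[u] = c;
        for (const v of rev[u]) if (comp[v] === -1) dfs2(v, c);
      };
      for (let i = n - 1; i >= 0; i--) if (comp[order[i]] === -1) dfs2(order[i], count++);

      // Score (size) and out-degree of each condensation vertex.
      const size = new Array(count).fill(0);
      const hasOutgoing = new Array(count).fill(false);
      for (let u = 0; u < n; u++) {
        size[comp[u]]++;
        for (const v of graph[u]) if (comp[u] !== comp[v]) hasOutgoing[comp[u]] = true;
      }

      // T is the smallest SCC whose condensation vertex has out-degree zero.
      let best = -1;
      for (let c = 0; c < count; c++)
        if (!hasOutgoing[c] && (best === -1 || size[c] < size[best])) best = c;

      const T = [];
      for (let u = 0; u < n; u++) if (comp[u] === best) T.push(u);
      return T;
    }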

Will a standard Kruskal-like approach for MST work if some edges are fixed?

The problem: you need to find the minimum spanning tree of a graph (i.e. a set S of edges in said graph such that the edges in S together with the respective vertices form a tree; additionally, from all such sets, the sum of the cost of all edges in S has to be minimal). But there's a catch. You are given an initial set of fixed edges K such that K must be included in S.
In other words, find some MST of a graph with a starting set of fixed edges included.
My approach: standard Kruskal's algorithm, but before anything else join all vertices as indicated by the set of fixed edges. That is, if K = {{1,2}, {4,5}}, I apply Kruskal's algorithm, but instead of having each node in its own individual set initially, nodes 1 and 2 start in the same set and nodes 4 and 5 start in the same set.
The question: does this work? Is there a proof that this always yields the correct result? If not, could anyone provide a counter-example?
P.S. the problem only inquires finding ONE MST. Not interested in all of them.
Yes, it will work as long as your initial set of edges doesn't form a cycle.
Keep in mind that the resulting tree might not be minimal in weight since the edges you fixed might not be part of any MST in the graph. But you will get the lightest spanning tree which satisfies the constraint that those fixed edges are part of the tree.
How to implement it:
To implement this, you can simply change the edge weights of the edges you need to fix. Just pick the lowest edge weight appearing in your graph, say min_w, subtract 1 from it, and assign this new weight, i.e. (min_w - 1), to the edges you need to fix. Then run Kruskal on this graph.
Why it works:
Clearly Kruskal will pick all the edges you need (since these are the lightest now) before picking any other edge in the graph. When Kruskal finishes the resulting set of edges is an MST in G' (the graph where you changed some weights). Note that since you only changed the values of your fixed set of edges, the algorithm would never have made a different choice on the other edges (the ones which aren't part of your fixed set). If you think of the edges Kruskal considers, as a sorted list of edges, then changing the values of the edges you need to fix moves these edges to the front of the list, but it doesn't change the order of the other edges in the list with respect to each other.
Note: As you may notice, giving the lightest weight to your edges is basically the same thing as you suggest. But I think it is a bit easier to reason about why it works. Go with whatever you prefer.
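For reference, a JavaScript sketch of the pre-union variant from the question (my own illustration, not the asker's code): union the fixed edges first, then run ordinary Kruskal over the edges sorted by weight.

    // Sketch of Kruskal with a fixed edge set, using the pre-union approach
    // from the question. Edges are [u, v, weight] triples over n vertices.
    function mstWithFixedEdges(n, edges, fixedEdges) {
      // Union-find with path compression.
      const parent = Array.from({ length: n }, (_, i) => i);
      const find = (x) => (parent[x] === x ? x : (parent[x] = find(parent[x])));
      const union = (a, b) => {
        const ra = find(a), rb = find(b);
        if (ra === rb) return false;
        parent[ra] = rb;
        return true;
      };

      const tree = [];
      // Commit the fixed edges first; they must not contain a cycle.
      for (const [u, v, w] of fixedEdges) {
        if (!union(u, v)) throw new Error("fixed edges contain a cycle");
        tree.push([u, v, w]);
      }

      // Standard Kruskal over the remaining edges, lightest first.
      // If the fixed edges also appear in `edges`, the cycle check skips them.
      const sorted = [...edges].sort((a, b) => a[2] - b[2]);
      for (const [u, v, w] of sorted) {
        if (union(u, v)) tree.push([u, v, w]);
      }
      return tree;   // a lightest spanning tree containing all fixed edges
    }

    // Example: edge (2,3) is forced into the tree even though it is heavy.
    // The true MST has weight 6, but the lightest tree containing (2,3) has weight 13.
    console.log(mstWithFixedEdges(4,
      [[0, 1, 1], [1, 2, 2], [2, 3, 10], [0, 3, 3]],
      [[2, 3, 10]]));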
I wouldn't recommend Prim, since this algorithm expands the spanning tree gradually from the current connected component (in the beginning one usually starts with a single node). The case where you join larger components (because your fixed edges might not all be in a single component) would need to be handled separately - it might not be hard, but you would have to take care of it. OTOH with Kruskal you don't have to adapt anything, but simply manipulate your graph a bit before running the regular algorithm.
If I understood the question properly, Prim's algorithm would be more suitable for this, as it is possible to initialize the connected components to be exactly the edges which are required to occur in the resulting spanning tree (plus the remaining isolated nodes). The desired edges are not permitted to contain a cycle, otherwise there is no spanning tree including them.
That being said, apparently Kruskal's algorithm can also be used, as it is explicitly stated that it can be used to find an edge that connects two forests in a cost-minimal way.
Roughly speaking, as the forests of a given graph form a Matroid, the greedy approach yields the desired result (namely a weight-minimal tree) regardless of the independent set you start with.

Minimum vertex cover

I am trying to get a vertex cover for an "almost" tree with 50,000 vertices. The graph is generated as a tree with random edges added in making it "almost" a tree.
I used the approximation method where you marry two vertices, add them to the cover and remove them from the graph, then move on to another set of vertices. After that I tried to reduce the number of vertices by removing the vertices that have all of their neighbors inside the vertex cover.
My question is how would I make the vertex cover even smaller? I'm trying to go as low as I can.
Here's an idea, but I have no idea if it is an improvement in practice:
From https://en.wikipedia.org/wiki/Biconnected_component "Any connected graph decomposes into a tree of biconnected components called the block-cut tree of the graph." Furthermore, you can compute such a decomposition in linear time.
I suggest that when you marry and remove two vertices you do this only for two vertices within the same biconnected component. When you have run out of vertices to merge you will have a set of trees not connected with each other. The vertex cover problem on trees is tractable via dynamic programming: for each node compute the cost of the best answer if that node is added to the cover and if that node is not added to the cover. You can compute the answers for a node given the best answers for its children.
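The tree dynamic program mentioned above can be sketched like this in JavaScript (my own illustration): for each node keep two values, the best cover size of its subtree if the node is taken and if it is not.

    // Sketch of minimum vertex cover on a tree by dynamic programming.
    // children[u] lists the children of u in a rooted tree.
    function treeVertexCover(children, root) {
      // Returns [costIfTaken, costIfNotTaken] for the subtree rooted at u.
      function solve(u) {
        let taken = 1;      // u is in the cover: each child may or may not be
        let notTaken = 0;   // u is not in the cover: every child must be
        for (const c of children[u]) {
          const [cTaken, cNotTaken] = solve(c);
          taken += Math.min(cTaken, cNotTaken);
          notTaken += cTaken;
        }
        return [taken, notTaken];
      }
      const [taken, notTaken] = solve(root);
      return Math.min(taken, notTaken);
    }

    // Example: the path 0 - 1 - 2 rooted at 0 needs only vertex 1 in the cover.
    console.log(treeVertexCover([[1], [2], []], 0));   // 1

For a 50,000-node tree that is close to a path, the recursion should be converted to an explicit stack to avoid overflowing the call stack.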
Another way - for all I know better - would be to compute the minimum spanning tree of the graph and to use dynamic programming to compute the best vertex cover for that tree, neglecting the links outside the tree, remove the covered links from the graph, and then continue by marrying vertices as before.
I think I prefer the minimum spanning tree one. In producing the minimum spanning tree you are deleting a small number of links. A tree with N nodes has N-1 links, so even if you don't get back the original tree, you get back one with just as many links. A vertex cover for the full graph is also a vertex cover for the minimum spanning tree, so if the correct answer for the full graph has V vertices, there is an answer for the minimum spanning tree with at most V vertices. If there were k random edges added to the tree, there are k edges (not necessarily the same) that need to be added to turn the minimum spanning tree into the full graph. You can certainly make sure these new edges are covered with at most k vertices. So if the optimum answer has V vertices, you will obtain an answer with at most V+k vertices.
Here's an attempt at an exact answer which is tractable when only a small number of links are added, or when they don't change the inter-node distances very much.
Find a minimum spanning tree, and divide edges into "tree edges" and "added edges", where the tree edges form a minimum spanning tree, and the added edges were not chosen for this. They may not be the edges actually added during construction but that doesn't matter. All trees on N nodes have N-1 edges so we have the same number of added edges as were used during creation, even if not the same edges.
Now pretend you can peek at the answer in the back of the book just enough to see, for one vertex from each added edge, whether that vertex was part of the best vertex cover. If it was, you can remove that vertex and its links from the problem. If not, the other vertex must be so you can remove it and its links from the problem.
You now have to find a minimum vertex cover for a tree or a number of disconnected trees, and we know how to do this - see my other answer for a bit more handwaving.
If you can't peek at the back of the book for an answer, and there are k added edges, try all 2^k possible answers that might have been in the back of the book and find the best. If you are lucky then added link A is in a different subtree from added link B. In that case you can confine the two calculations needed for the two possibilities for added link A (or B) to the dynamic programming calculations for the relevant subtree so you have only doubled the work instead of quadrupled it. In general, if your k added edges are in k different subtrees that don't interfere with each other, the cost is multiplied by 2 instead of 2^k.
Minimum vertex cover is an NP-hard problem, which means that you cannot expect to solve it exactly in a reasonable time even for something like 100 vertices (not to mention 50k).
For a tree there is a polynomial time greedy algorithm which is based on DFS, but the fact that you have "random edges added" screws everything up and makes this algorithm useless.
Wikipedia has an article about the approximation algorithm; it claims that it reaches factor 2 and that no better algorithm is known, which makes it quite unlikely that you will find one.

How do I explore a directed graph (DAG) by visiting the minimum number of starting vertices?

Given a DAG (possibly not connected, i.e. consisting of several connected components), the goal is to find the minimum number of starting vertices required to visit in order to fully explore the graph.
One method I thought of was to generate all permutations of the given vertices and run a topological sort in that order. The one with the minimum backtracks would be the answer.
Is there an efficient algorithm to perform the above task?
This is a famous problem called minimum path cover; it's a pity that the wiki says nothing about it, but you can search for it on Google.
As mentioned, the minimum path cover problem is NP-hard in general graphs. But in a DAG, it can be solved with matching.
Method:
Split each vertex u into two vertices, u1 and u2. For every edge (u -> v) in the original graph, add the edge (u1 -> v2) to the new graph. Then run any bipartite matching algorithm you like. The result is n - (maximum matching), where n is the total number of vertices in the original graph.
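A JavaScript sketch of that construction (my own illustration), using the simple augmenting-path (Kuhn's) matching algorithm. Since every edge of the bipartite graph goes from an "out" copy on the left to an "in" copy on the right, the original adjacency list can serve directly as the bipartite graph.

    // Sketch: minimum path cover of a DAG = n - maximum matching in the
    // bipartite graph that splits each vertex into an "out" copy (left)
    // and an "in" copy (right). graph[u] lists the v with an edge u -> v.
    function minimumPathCover(graph) {
      const n = graph.length;
      const matchRight = new Array(n).fill(-1);   // matchRight[v] = left vertex matched to v

      // Kuhn's augmenting path search from left vertex u.
      function tryAugment(u, visited) {
        for (const v of graph[u]) {
          if (visited[v]) continue;
          visited[v] = true;
          if (matchRight[v] === -1 || tryAugment(matchRight[v], visited)) {
            matchRight[v] = u;
            return true;
          }
        }
        return false;
      }

      let matching = 0;
      for (let u = 0; u < n; u++) {
        if (tryAugment(u, new Array(n).fill(false))) matching++;
      }
      return n - matching;   // minimum number of vertex-disjoint paths covering the DAG
    }

    // Example: the DAG 0 -> 1 -> 2, 3 -> 2 needs two paths, (0,1,2) and (3).
    console.log(minimumPathCover([[1], [2], [], [2]]));   // 2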

Minimal path - all edges at least once

I have a directed graph with a lot of cycles, probably strongly connected, and I need to get a minimal cycle from it. That is, I need the shortest closed walk in the graph that covers every edge at least once.
I have been searching for some algorithm or some theoretical background, but the only thing I have found is the Chinese postman algorithm. But that solution is not for directed graphs.
Can anybody help me? Thanks
Edit>> All edges of that graph have the same cost - for instance 1
Take a look at this paper - Directed Chinese Postman Problem. That is the correct problem classification though (assuming there are no more restrictions).
If you're just reading into theory, take a good read at this page, which is from The Algorithm Design Manual.
Key quote (the second half for the directed version):
The optimal postman tour can be constructed by adding the appropriate edges to the graph G so as to make it Eulerian. Specifically, we find the shortest path between each pair of odd-degree vertices in G. Adding a path between two odd-degree vertices in G turns both of them to even-degree, thus moving us closer to an Eulerian graph. Finding the best set of shortest paths to add to G reduces to identifying a minimum-weight perfect matching in a graph on the odd-degree vertices, where the weight of edge (i,j) is the length of the shortest path from i to j. For directed graphs, this can be solved using bipartite matching, where the vertices are partitioned depending on whether they have more ingoing or outgoing edges. Once the graph is Eulerian, the actual cycle can be extracted in linear time using the procedure described above.
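Once the graph has been made Eulerian (every vertex with in-degree equal to its out-degree), the tour itself can be extracted in linear time; one standard way is Hierholzer's algorithm, sketched below in JavaScript (my own illustration, not necessarily the exact procedure the quote refers to).

    // Sketch of Hierholzer's algorithm for a directed Euler circuit.
    // Assumes the graph is connected and every vertex already has
    // in-degree equal to out-degree. graph[u] lists the heads of edges out of u.
    function eulerCircuit(graph, start = 0) {
      const next = graph.map(() => 0);   // next unused edge index per vertex
      const stack = [start];
      const circuit = [];
      while (stack.length > 0) {
        const u = stack[stack.length - 1];
        if (next[u] < graph[u].length) {
          // Follow an unused edge, deferring u until its edges are exhausted.
          stack.push(graph[u][next[u]++]);
        } else {
          circuit.push(stack.pop());
        }
      }
      return circuit.reverse();   // sequence of vertices; each edge used exactly once
    }

    // Example: two directed triangles sharing vertex 0.
    // Edges: 0->1, 1->2, 2->0, 0->3, 3->4, 4->0
    console.log(eulerCircuit([[1, 3], [2], [0], [4], [0]]));   // [0, 1, 2, 0, 3, 4, 0]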
I doubt that it's optimal, but you could do a queue based search assuming the graph is guaranteed to have a cycle. Each queue entry would contain a list of nodes representing paths. When you take an element off the queue, add all possible next steps to the queue, ensuring you are not re-visiting nodes. If the last node is the same as the first node, you've found the minimum cycle.
What you are looking for is called an "Eulerian path". You can Google it to find enough info; the basics are here.
As for an algorithm, there is one called Fleury's algorithm; Google it or take a look here.
I think it might be worthwhile simply to write down which vertices have odd degree and then find which pairing of them leads to the least amount of extra time (if the weights are times or distances); the total length will then be every edge weight plus the extra. For example, if the odd-degree vertices are A, B, C, D, try AB & CD, then AC & BD, and so on. (I'm not sure if this is a specifically named method; it just worked for me.)
Edit: just realised this mostly only works for undirected graphs.
The special case in which the network consists entirely of directed edges can be solved in polynomial time. I think the original paper is Matching, Euler tours and the Chinese postman (1973) - a clear description of the algorithm for the directed graph problem begins on page 115 (page 28 of the pdf):
When all of the edges of a connected graph are directed and the graph is symmetric, there is a particularly simple and attractive algorithm for specifying an Euler tour...
The algorithm to find an Euler tour in a directed, symmetric, connected graph G is to first find a spanning arborescence of G. Then, at any node n, except the root r of the arborescence, specify any order for the edges directed away from n so long as the edge of the arborescence is last in the ordering. For the root r, specify any order at all for the edges directed away from r.
This algorithm was used by van Aardenne-Ehrenfest and de Bruijn to enumerate all Euler tours in a certain directed graph [1].
