Quadratic-time vertex cover verification - algorithm

Suppose you are given an undirected graph G with n vertices and m
edges represented by an n x n adjacency matrix A, and you are also
given a subset of vertices S (represented by an array of size m).
How can you check whether S is a vertex cover of G with quadratic
time and space complexity?
By the definition of a vertex cover, I know that every edge must be incident to at least one vertex contained in S.
I can easily come up with a cubic algorithm: iterate over the adjacency matrix; each 1 represents an edge (u, v). Check whether u or v is in S (checking membership means scanning the array S, which costs linear time for each of the O(n^2) matrix cells, hence the cubic bound). If some edge has neither endpoint in S, the answer is no. If we get to the end of the adjacency matrix, the answer is yes.
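In rough Python, assuming A is an n x n 0/1 adjacency matrix (a list of lists) and S is a plain list of vertex indices, that cubic idea looks something like this sketch:

def is_vertex_cover_cubic(A, S):
    n = len(A)
    for u in range(n):
        for v in range(n):
            # the 'not in S' membership tests scan the array, which is the extra linear factor
            if A[u][v] == 1 and u not in S and v not in S:
                return False
    return True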
But how can I do this in O(n^2) time? The only real observation I've made so far is that we can skip a row of the adjacency matrix if the vertex corresponding to that row is already in S, but this has not helped me very much.
Can someone please help me (or point me in the correct direction)?
Thanks

Construct an array T containing all of the vertices NOT in S (this takes O(n) time and space using a boolean membership table for S). S is a vertex cover exactly when no edge joins two vertices of T.
And then:
for i in T:
    for j in T:
        if A[i][j] == 1:
            return False
return True
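Putting it together, a minimal runnable sketch (the function name is mine, and S is assumed to be a plain list of vertex indices):

def is_vertex_cover(A, S):
    n = len(A)
    in_cover = [False] * n            # O(n) membership table for S
    for v in S:
        in_cover[v] = True
    T = [v for v in range(n) if not in_cover[v]]   # vertices outside S
    # An uncovered edge is exactly an edge with both endpoints in T,
    # so at most n^2 cells of A are inspected: O(n^2) time, O(n) extra space.
    for i in T:
        for j in T:
            if A[i][j] == 1:
                return False
    return True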

Related

Very hard and elegant question on shortest path

Given a weighted, connected, directed graph G = (V, E) with n vertices and m edges, and given a pre-calculated shortest-path distance matrix S, where S is n x n and S(i, j) denotes the weight of the shortest path from vertex i to vertex j.
We know that the weight of just one edge (u, v) is changed (increased or decreased).
For two specific vertices s and t, we want to update the shortest-path length between them.
This can be done in O(1).
How is this possible? What is the trick behind this answer?
You certainly can for decreases. I assume S always refers to the old distances. Let l be the new weight of the edge (u, v). Check whether
S(s, u) + l + S(v, t) < S(s, t)
If yes, the left-hand side is the new optimal distance between s and t; otherwise the old value S(s, t) is still optimal (any path avoiding (u, v) is unaffected, and any path using (u, v) costs at least S(s, u) + l + S(v, t)).
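As a tiny sketch (assuming S is the old all-pairs distance matrix, indexable as S[i][j], and l is the new, decreased weight of the directed edge (u, v); the function name is mine):

def updated_distance(S, s, t, u, v, l):
    # The decreased weight can only help by routing through u -> v once.
    return min(S[s][t], S[s][u] + l + S[v][t])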
Increases are impossible. Consider the following graph (edges in red have zero weight):
Suppose m is the minimum-weight edge here, except for (u, v), which used to be even lighter. Now we update (u, v) to some weight l > m. To find the new optimal length we must find m.
Suppose we could handle this in O(1) time. Then we could find the minimum of any array in O(1) time by feeding it into this algorithm after adding (u, v) with weight -BIGNUMBER and then 'updating' it to BIGNUMBER (we can construct the distance matrix lazily because all distances are either 0, infinity, or just the edge weights). That is clearly impossible, so we cannot solve this problem in O(1) either.

Given an adjacency list for multigraph, compute adjacency list for equivalent (simple) undirected graph in O(|V|+|E|) time

We are given the adjacency list for a multigraph, G = (V, E) and need to find an O(V + E) algorithm to compute the adjacency list of an equivalent (simple) undirected graph.
I found the following solution in another post (it was part of the question section hence my repost):
"[H]aving an array of size |V| so as to mark the vertices that have been encountered at least once in adj[u], and thus preventing duplicates. The array is reset before traversing each adj[u]."
Forgive my ignorance, but I'm not sure how this is O(|V| + |E|). What is the cost of resetting a length |V| array |V| times?
Thank you.
You don't need to actually reset the array.
Say the array stores ints: a vertex w is marked iff mark[w] == v, where v is the index (or id) of the vertex whose adjacency list is currently being traversed.
When you move on to the next vertex, v changes, so every entry of the array automatically counts as unmarked again without any values having to be rewritten.
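A minimal sketch of the trick (assuming the multigraph's adjacency list adj[u] is a list of neighbour indices, possibly with repeats and self-loops; the function name is mine):

def to_simple_undirected(adj):
    n = len(adj)
    mark = [-1] * n                       # mark[w] == u  <=>  w already added for u
    simple = [[] for _ in range(n)]
    for u in range(n):
        for w in adj[u]:
            if w != u and mark[w] != u:   # skip self-loops and duplicate edges
                mark[w] = u               # the "reset" happens implicitly when u changes
                simple[u].append(w)
    return simple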

What is the order of efficiency of an Adjacency list

Is it O(n+m) or O(nm)? Would constructing it be O(nm), and searching, adding, or deleting a value be O(n+m)? Is there anything else that would be important to consider?
Also, converting a matrix into a list takes O(n²), and turning a list into a matrix is only O(nm), correct?
The cost to build an adjacency list is O(m) from scratch (or O(n + m) counting the initialization of the n empty lists, because we can add each edge in O(1)) and O(n²) from an adjacency matrix (because we have to check every cell of the matrix).
Adding an edge u-v is O(1), because we can append an entry for v to the end of the adjacency list of vertex u.
Removing an edge u-v takes O(n) operations, because we have to scan the adjacency list of vertex u to find the entry for v before we can remove it.
Finding out whether there is an edge u-v also takes O(n) steps, because we must scan the adjacency list of vertex u and check whether there is an entry for v.
Removal and search can be improved to O(log n), or O(1) on average, by using a BST or hashing instead of a linked list to store the adjacencies, but most graph algorithms need to scan the whole adjacency list of a vertex rather than probe individual entries, so linked lists usually work well.
We can convert an adjacency list to an adjacency matrix in O(m), assuming the matrix is already filled with zeroes. All we have to do is scan the adjacency list of every vertex, and for each edge U-V with weight W set matrix[U][V] = W (or matrix[U][V] = 1 if the graph is unweighted). Since we look at each edge exactly once (or twice if the graph is undirected), the complexity is O(m).
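As a small sketch of that conversion (assuming adj[u] holds (v, w) pairs for a weighted directed graph on vertices 0 .. n-1; the helper name is mine):

def list_to_matrix(adj):
    n = len(adj)
    matrix = [[0] * n for _ in range(n)]   # the zero-filled matrix (O(n^2) just to allocate)
    for u in range(n):
        for v, w in adj[u]:
            matrix[u][v] = w               # each edge is written exactly once: O(m)
    return matrix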
I think that when you convert the list to the matrix you go:
for each vertex `O(n)`
for each neighbour `O(n)`
and that's why it's also O(n^2).
If m > n, one vertex still cannot have all m neighbours (it has at most n-1), and that's why you avoid an O(n*m) (potentially O(n^3)) bound.
Example:
a: b, c, d
b: a, c, d
c: a, b, d
d: a, b, c
complete graph: O(n^2) list size. Although n = 4 and m = 6, the total size is bounded by 4x4, not 4x6.
(m = (4 * (4-1))/2 = 6 = O(n^2) -- the complete-graph formula)

Given an undirected graph G = (V, E), determine whether G is a complete graph

I'm pretty sure this problem is in P rather than NP-complete, but I'm having difficulty coming up with a polynomially bounded algorithm to solve it.
You can:
check that the number of edges in the graph is n(n-1)/2;
check that each vertex is connected to exactly n-1 distinct vertices.
This will run in O(V²), which is polynomial.
Hope it helped.
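A short sketch of those two checks (assuming a simple undirected graph given as an adjacency list on vertices 0 .. n-1; the function name is mine):

def is_complete(adj):
    n = len(adj)
    # n(n-1)/2 undirected edges means n(n-1) adjacency-list entries in total.
    if sum(len(nbrs) for nbrs in adj) != n * (n - 1):
        return False
    # Every vertex must see exactly n-1 distinct vertices other than itself.
    return all(len(set(nbrs) - {u}) == n - 1 for u, nbrs in enumerate(adj))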
Here's an O(|E|) algorithm that also has a small constant.
It's trivial to enumerate every edge in a complete graph. So all you need to do is scan your edge list and verify that every such edge exists.
For each edge (i, j), let f(i, j) = i*|V| + j. Assuming vertices are numbered 0 to |V|-1.
Let bitvec be a bit vector of length |V|², initialized to 0.
For each edge (i, j), set bitvec[f(i, j)] = 1 (for an undirected graph, also set bitvec[f(j, i)] = 1).
G is a complete graph if and only if every off-diagonal element of bitvec is 1.
This algorithm not only touches each edge only once, but it's also completely vectorizable if you have a scatter instruction. That also means it's trivial to parallelize.
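A rough sketch of the bit-vector idea (assuming an undirected simple graph given as an edge list over vertices 0 .. n-1; the early exit and the helper name are my additions):

def is_complete_bitvec(n, edges):
    if len(edges) < n * (n - 1) // 2:      # too few edges: give up early, which also
        return False                       # keeps the final scan within O(|E|)
    bitvec = bytearray(n * n)              # f(i, j) = i*n + j
    for i, j in edges:
        bitvec[i * n + j] = 1
        bitvec[j * n + i] = 1              # mark both orientations of the edge
    # Every off-diagonal cell must now be marked.
    return all(bitvec[i * n + j]
               for i in range(n) for j in range(n) if i != j)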
Here is an O(E) algorithm:
Scanning the graph takes O(E), which is just the time needed to read the input.
While scanning, record each vertex p's degree, incrementing it only if the neighbour is not p itself (a self-loop) and is not a vertex q for which the edge between p and q has already been counted (a parallel edge); both checks can be done in O(1), e.g. with hashing or the mark-array trick described earlier.
Check whether every vertex's degree is |V|-1; this step is O(V). If yes, the graph is complete.
The total is O(E) (plus the O(V) final pass).
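A loose sketch of this pass (assuming a multigraph given as an edge list over vertices 0 .. n-1; the seen set stands in for the O(1) duplicate-edge check, e.g. a hash lookup, and the function name is mine):

def is_complete_by_degree(n, edges):
    degree = [0] * n
    seen = set()
    for p, q in edges:
        if p == q:                          # ignore self-loops
            continue
        key = (min(p, q), max(p, q))
        if key in seen:                     # ignore parallel edges
            continue
        seen.add(key)
        degree[p] += 1
        degree[q] += 1
    return all(d == n - 1 for d in degree)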
For a given graph G = (V, E), check, for each pair u, v in V, whether the edge (u, v) is in E.
The total number of (u, v) pairs is |V|(|V|-1)/2. As a result, with a time complexity of O(|V|^2), you can check whether a graph is complete or not.

How can I get the antichain elements in SPOJ-DIVREL?

Problem: http://www.spoj.com/problems/DIVREL
In the question, we just need to find, from the given set of elements, the maximum number of elements none of which is a multiple of another (no pair of the form a divides b). If we make an edge from each element to each of its multiples and construct a graph, it will be a DAG.
Now the question reduces to finding the minimum number of chains that cover all the vertices, which by Dilworth's theorem equals the maximum antichain cardinality, since divisibility is a partial order.
The minimum number of chains can be found using bipartite matching (how: it is a minimum path cover), but now I am unable to find the antichain elements themselves.
To compute the antichain you can:
Compute the maximum bipartite matching (e.g. with a maximum flow algorithm) on a new bipartite graph D which has an edge from LHS a to RHS b if and only if a divides b.
Use the matching to compute a minimum vertex cover (e.g. with the algorithm described in the proof of König's theorem).
The antichain is given by all elements neither of whose copies (LHS or RHS) is in the vertex cover.
There cannot be an edge between two such elements, as otherwise we would have found an edge not covered by the vertex cover, a contradiction.
The algorithm to find the minimum vertex cover is (from the proof of König's theorem referenced above):
Let S0 consist of all vertices unmatched by M.
For integer j ≥ 0, let S(2j+1) be the set of all vertices v such that v is adjacent via some edge in E \ M to a vertex in S(2j) and v has not been included in any previously-defined set Sk, where k < 2j+1. If there is no such vertex, but there remain vertices not included in any previously-defined set Sk, arbitrarily choose one of these and let S(2j+1) consist of that single vertex.
For integer j ≥ 1, let S(2j) be the set of all vertices u such that u is adjacent via some edge in M to a vertex in S(2j−1). Note that for each v in S(2j−1) there is a vertex u to which it is matched, since otherwise v would have been in S0. Therefore M sets up a one-to-one correspondence between the vertices of S(2j−1) and the vertices of S(2j).
The union of the odd-indexed subsets is the vertex cover.
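A rough end-to-end sketch of the whole procedure (not an actual SPOJ submission; the simple augmenting-path matching, the function name, and the assumption that the input values are distinct positive integers are all mine):

def max_antichain(values):
    n = len(values)
    # adj[a] = RHS vertices b such that values[a] properly divides values[b]
    adj = [[b for b in range(n)
            if a != b and values[b] % values[a] == 0]
           for a in range(n)]

    match_l = [-1] * n   # LHS vertex -> matched RHS vertex
    match_r = [-1] * n   # RHS vertex -> matched LHS vertex

    def augment(a, seen):
        # Try to find an augmenting path starting at LHS vertex a.
        for b in adj[a]:
            if not seen[b]:
                seen[b] = True
                if match_r[b] == -1 or augment(match_r[b], seen):
                    match_l[a], match_r[b] = b, a
                    return True
        return False

    for a in range(n):
        augment(a, [False] * n)

    # König's construction: alternating search from unmatched LHS vertices.
    visited_l = [match_l[a] == -1 for a in range(n)]
    visited_r = [False] * n
    stack = [a for a in range(n) if visited_l[a]]
    while stack:
        a = stack.pop()
        for b in adj[a]:                 # edges LHS -> RHS
            if not visited_r[b]:
                visited_r[b] = True
                a2 = match_r[b]          # matched edge RHS -> LHS
                if a2 != -1 and not visited_l[a2]:
                    visited_l[a2] = True
                    stack.append(a2)

    # Cover = (unvisited LHS) + (visited RHS); the antichain consists of the
    # elements with neither copy in the cover.
    return [values[x] for x in range(n) if visited_l[x] and not visited_r[x]]

print(max_antichain([2, 3, 4, 8, 12]))   # prints [8, 12], a maximum antichain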
