Express Knowledge base in CNF and resolution in prolog - prolog

I have a knowledge base in CNF and I'd like to use Prolog to perform resolution, but
I can't wrap my head around how to formulate the problem in Prolog.
I have
KB = { P v Q, Q => (R ^ S), (P v R) => U }
that i put in CNF as:
KB = { P v Q, not(Q) v R, not(Q) v S, not(P) v U, not(R) v U }
and I'd like to prove that KB entails U (KB |= U).
I can prove it manually by refutation, but I'd like to know how this can be done using Prolog.
Thanks

The answer is: it is impossible.
not(Q) v R, not(Q) v S, not(P) v U, not(R) v U
can be expressed by r :- q., s :- q., u :- p. and u :- r.
Posing the query ?- u. amounts to asking for a proof by refutation.
However,
P v Q
cannot be expressed as a fact/rule in a prolog program. Prolog is restricted to Horn clauses, i.e. disjunctions of literals that contain at most one positive literal.
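That said, nothing stops you from writing a small general resolution prover yourself and feeding it the clauses. Here is a minimal Python sketch (the integer literal encoding and the `resolve`/`refutes` names are my own, not from the question):

```python
from itertools import combinations

def resolve(c1, c2):
    """All resolvents of two clauses (clauses are frozensets of int literals)."""
    return [frozenset((c1 - {lit}) | (c2 - {-lit}))
            for lit in c1 if -lit in c2]

def refutes(clauses):
    """Saturate under resolution; True iff the empty clause is derivable."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:          # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:         # fixpoint reached, no refutation
            return False
        clauses |= new

# Literals: P=1, Q=2, R=3, S=4, U=5; a negative number is the negation.
KB = [frozenset(c) for c in ([1, 2], [-2, 3], [-2, 4], [-1, 5], [-3, 5])]
# To show KB |= U, add the negated goal ~U and look for the empty clause.
print(refutes(KB + [frozenset([-5])]))  # True
```

Deriving the empty clause shows KB ∪ {¬U} is unsatisfiable, which is exactly the refutation proof of KB |= U.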

Related

proving that a language is part of a grammar and vice versa

So here is a grammar R and a language L; I want to prove that R generates L.
R = {S → abS | ε}, L = {(ab)^n | n ≥ 0}
so I thought I would prove that L(G) ⊆ L and L(G) ⊇ L are right.
For L(G) ⊆ L: I show by induction on the number i of derivation steps that after every derivation step u → w, through which w results from u according to the rules of R, w = v1v2 or w = v1v2S with |v2| = |v1| and v1 ∈ {a}* and v2 ∈ {b}*.
And for the induction basis: at i = 0, w is ε, and at i = 1, w ∈ {ε, abS}.
Is that right so far?
So here is a grammar R and a language L; I want to prove that R generates L.
Probably what you want to do is show that the language L(R) of some grammar R is the same as some other language L specified another way (in your case, set-builder notation with a regular expression).
so I thought I would prove that L(G) ⊆ L and L(G) ⊇ L are right.
Given the above assumption, you are correct in thinking this is the right way to proceed with the proof.
For L(G) ⊆ L: I show by induction on the number i of derivation steps that after every derivation step u → w, through which w results from u according to the rules of R, w = v1v2 or w = v1v2S with |v2| = |v1| and v1 ∈ {a}* and v2 ∈ {b}*. And for the induction basis: at i = 0, w is ε, and at i = 1, w ∈ {ε, abS}.
This is hard for me to follow. That's not to say it's wrong. Let me write it down in my own words and perhaps you or others can judge whether we are saying the same thing.
We want to show that L(R) is a subset of L. That is, any string generated by the grammar R is contained in the language L. We can prove this by mathematical induction on the number of steps in the derivation of strings generated by the grammar.

We start with the base case of one derivation step: S -> e produces the empty word, which is a string in the language L by choosing n = 0.

Now that we have established the base case, we can state the induction hypothesis: assume that for all strings derived from the grammar in a number of steps up to and including k, those strings are also in L.

Now we must prove the induction step: that any string derived in k+1 steps from the grammar is also in L. Let w be any string derived from the grammar in k+1 steps. From the grammar it is clear that the derivation of w must be S -> abS -> ababS -> ... -> abab...abS -> abab...abe = abab...ab. But this derivation is the same as the derivation of a string from the grammar in k steps, except that there was one extra application of S -> abS before the application of S -> e. By the induction hypothesis we know that the string w' derived in k steps is of the form (ab)^m for some m at least zero, and adding an extra application of S -> abS to the derivation adds ab. Because (ab)^m(ab) = (ab)^(m+1) we can choose n = m+1. So, all strings derived from the grammar in k+1 steps are also in the language, as required.
To prove that all strings in the language can be derived in the grammar, consider the following construction: to derive the string (ab)^n in the grammar, apply the production S -> abS a number of times equal to n, and the production S -> e exactly once. The first step gives an intermediate form (ab)^nS and the second step gives a closed form string (ab)^n.
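Both directions can also be sanity-checked mechanically. A small Python sketch (the `derive` helper is my own, not part of the proof) applies S -> abS exactly n times, then S -> ε, and compares the result against the language {(ab)^n | n ≥ 0}:

```python
import re

def derive(n):
    """Apply S -> abS exactly n times, then S -> epsilon: the construction above."""
    s = "S"
    for _ in range(n):
        s = s.replace("S", "abS")   # one application of S -> abS
    return s.replace("S", "")       # final application of S -> epsilon

# L(R) subset of L: every derived string matches (ab)* ...
assert all(re.fullmatch(r"(ab)*", derive(n)) for n in range(10))
# ... and L subset of L(R): every word (ab)^n is produced by the construction.
assert all(derive(n) == "ab" * n for n in range(10))
```

This is no substitute for the induction proof, but it catches mistakes in the construction quickly.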

Proof by contradiction

I'm trying to do a proof by contradiction, but don't quite understand how to write it down formally or how to come to an answer in this case. I'm doing a conditional statement.
The problem I'm trying to solve is "Given the premises h ^ ~r and (h^n) --> r, show that you can conclude ~n using proof by contradiction."
I've taken the negation of both h ^ ~r and (h^n) --> r, but I'm unsure how to use these two to prove ~n
so far I've written:
(i.)~((h^n) --> r)
(ii.)~(h ^ ~r)
therefore, ~n
The hardest part is that this isn't a concrete statement whose negation I can picture, so a step-by-step walkthrough of one of these proofs would be really useful, thanks!
Suppose
~(((h ^ ~r) ^ ((h^n) --> r)) --> ~n)
Then,
~(~((h ^ ~r) ^ ((h^n) --> r)) v ~n)
=> ~((~(h ^ ~r) v ~((h^n) --> r)) v ~n)
=> ~(((~h v r) v ~(~(h^n) v r)) v ~n)
=> ~(((~h v r) v ((h^n) ^ ~r)) v ~n)
=> ~(((~h v r) v (h ^ n ^ ~r)) v ~n)
=> ~(((~h v r v h) ^ (~h v r v n) ^ (~h v r v ~r)) v ~n)
=> ~(((true) ^ (~h v r v n) ^ (true)) v ~n)
=> ~((~h v r v n) v ~n)
=> ~(~h v r v n v ~n)
=> ~((~h v r) v (n v ~n))
=> ~((~h v r) v (true))
=> ~(true)
=> false //contradiction
Therefore,
((h ^ ~r) ^ ((h^n) --> r)) --> ~n
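Since the derivation claims this formula is a tautology, you can double-check it by brute force over all 2^3 assignments. A short Python sketch (the variable and helper names are mine):

```python
from itertools import product

def implies(p, q):
    """Material implication: p --> q is (~p v q)."""
    return (not p) or q

# Check ((h ^ ~r) ^ ((h ^ n) --> r)) --> ~n over all 8 assignments.
tautology = all(
    implies((h and not r) and implies(h and n, r), not n)
    for h, n, r in product([True, False], repeat=3)
)
print(tautology)  # True
```

Exhaustive checking only works for propositional formulas with few variables, but here it confirms the algebraic derivation.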
Let's define:
p1 := h ^ ~r, p2 := (h ^ n) -> r and q := ~n
we want to prove that p1 ^ p2 -> q.
Assume by contradiction that q=false. Then n=true. There are two cases r=true and r=false.
Case r=true
Then p1 cannot be true because ~r=false. Contradiction.
Case r=false
From p2 we deduce that (h ^ n) must be false. And given that we have assumed n=true, it must be h=false, in contradiction with p1.
Direct proof
From p1 we get h=true and r=false. Now from p2 we deduce (h ^ n) = false. And since h=true, it must be n=false, or ~n=true.
I think the OP is probably asking about, or misinterpreting, the structure of a proof by contradiction rather than requesting a detailed proof for the specific example.
The structure goes like this ...
We've been told to assume a set of things A1, A2, ... An
Let's also assume the negation of what we eventually hope to prove, i.e. ~C
Do some logic that ends with any contradiction, by which we mean any statement of the form X & ~X
Now we ponder what that means. Since a contradiction can never be true, there must be something wrong with at least one of our n+1 assumptions. Could be that several or all of the assumptions are false. But if any n of the assumptions are true then the remaining one cannot be true. We cannot tell at this stage which one is the problem.
In this case we have been told ahead of time to accept A1, A2, ... An, and on that basis we can select the assumption of ~C as the one to be rejected.
As a final step we conclude that if A1, A2, ... An are true then C must be true.

Proof by resolution - Artificial Intelligence

I'm working with an exercise where I need to show that KB |= ~D.
And I know that the Knowledge Base is:
- (B v ¬C) => ¬A
- (¬A v D) => B
- A ∧ C
After converting to CNF:
A ∧ C ∧ (¬A v ¬B) ∧ (¬A v C) ∧ (A v B) ∧ (B v ¬D)
So now I have converted to CNF but from there, I don't know how to go any further. Would appreciate any help. Thanks!
The general resolution rule is that, for any two clauses
(that is, disjunctions of literals)
P_1 v ... v P_n
and
Q_1 v ... v Q_m
in your CNF such that there are i and j with P_i and Q_j being the negation of each other,
you can add a new clause
P_1 v ... v P_{i-1} v P_{i+1} ... v P_n v Q_1 v ... v Q_{j-1} v Q_{j+1} ... v Q_m
This is just a rigorous way to say that you can form a new clause by joining two of them, minus a literal with opposite "signs" in each.
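The rule is easy to mechanize. In this sketch (the encoding and the `resolve` name are my own), literals are strings like "A" or "~B" and clauses are frozensets of literals:

```python
def resolve(c1, c2):
    """All resolvents of two clauses; each clause is a frozenset of literals."""
    neg = lambda lit: lit[1:] if lit.startswith("~") else "~" + lit
    return [(c1 - {lit}) | (c2 - {neg(lit)})
            for lit in c1 if neg(lit) in c2]

# First example: resolving (A v ~B) with (B v ~C) on B/~B gives A v ~C.
print(resolve(frozenset({"A", "~B"}), frozenset({"B", "~C"}))[0] == {"A", "~C"})  # True
# Second example: resolving A with (~A v ~C) gives the unit clause ~C.
print(resolve(frozenset({"A"}), frozenset({"~A", "~C"}))[0] == {"~C"})  # True
```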
For example
(A v ¬B)∧(B v ¬C)
is equivalent to
(A v ¬B)∧(B v ¬C)∧(A v ¬C),
by joining the two clauses while removing the opposites B and ¬B, obtaining A v ¬C.
Another example is
A∧(¬A v ¬C)
which is equivalent to
A∧(¬A v ¬C) ∧ ¬C.
since A counts as a clause with a single literal (A itself). So the two clauses are joined, while A and ¬A are removed, yielding a new clause ¬C.
Applying this to your problem, we can resolve A and ¬A v ¬B, obtaining ¬B.
We then resolve this new clause ¬B with B v ¬D, obtaining ¬D.
Because the CNF is a conjunction, the fact that it holds means that every clause in it holds. That is to say, the CNF implies all of its clauses. Since ¬D is one of its clauses, ¬D is implied by the CNF. Since the CNF is equivalent to the original KB, the KB implies ¬D.
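As an independent sanity check (by truth table rather than resolution; the function names are mine), you can confirm that every model of the original KB falsifies D:

```python
from itertools import product

def implies(p, q):
    return (not p) or q

def kb(a, b, c, d):
    """The original KB: (B v ~C) => ~A, (~A v D) => B, A ^ C."""
    return (implies(b or not c, not a)
            and implies(not a or d, b)
            and (a and c))

# KB |= ~D: in every assignment satisfying the KB, D is false.
entails_not_d = all(not d
                    for a, b, c, d in product([True, False], repeat=4)
                    if kb(a, b, c, d))
print(entails_not_d)  # True
```

In fact the KB has exactly one model here (A=C=true, B=D=false), which makes the entailment easy to see.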

return minimum or less than a particular number - prolog

The question requires us to write Prolog code such that, if there is no element in the list
L that is less than A, the predicate is satisfied when M is A; otherwise it is satisfied
when M is the minimum of the list L.
minLessThan([], A, A).
minLessThan([H|T], A, M) :-
    H >= A,
    M is A,
    minLessThan(T, A, M).
minLessThan([H|T], A, M) :-
    H < A,
    M is H,
    minLessThan(T, A, M).
Now, my result is valid for the first part of the sentence; however, it keeps returning false when M should be the minimum of the list L. I assume the problem occurs when the list L is empty, where it returns A. Any way to solve this?
As this is homework I am only going to give you some advice:
For what you are trying to do, you should not instantiate M in the second and third clause until the recursive step.
That is, you don't have to put the M is A and M is H because M will get instantiated in the base case (the first clause).
Also, the recursive call in the third clause should use H instead of A as the second argument, because you have found an item less than the current minimum.
Note that the second parameter of minLessThan can be read as "the current minimum found", so you should call the procedure giving some initial value, otherwise instantiation errors will occur.
minLessThan([], A, A).
minLessThan([H|T], A, M) :-
    H >= A,
    minLessThan(T, A, M).
minLessThan([H|T], A, M) :-
    H < A,
    minLessThan(T, H, M).
Here is my edited answer; it works fine now.
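For comparison, the same "running minimum" idea reads naturally as a loop in an imperative language. A Python sketch (the function name is mine, not part of the assignment):

```python
def min_less_than(lst, a):
    """Return min(lst) if some element is below a, else a -- mirroring the
    corrected minLessThan/3, where the second argument is the running minimum."""
    m = a
    for h in lst:
        if h < m:      # like the H < A clause: H becomes the new minimum
            m = h
    return m

print(min_less_than([5, 3, 8], 4))   # 3: the list minimum is below 4
print(min_less_than([5, 6, 8], 4))   # 4: no element is below 4
```

The key point carries over: the accumulator is only updated, never forced early, and its final value is read out at the end (the base case in the Prolog version).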

Proving the Associativity of OR

I need help proving the following:
(a ∨ b) ∨ c = a ∨ (b ∨ c)
I don't want the answer... just a hint that will help me understand the process of proving this.
Thank you.
Why not just prove it by checking all possible values of a, b, and c in {True, False}? There are only 2^3 = 8 different cases.
Here's a start, for a=T, b=F, c=T
(a v b) v c = a ∨ (b ∨ c)
(T v F) v T = T v (F v T)
T v T = T v T
T = T
(However, this isn't really a programming question...)
What is your axiom set?
Not knowing the set, you could build a truth table.
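A truth table is also quick to generate by machine. This Python sketch (variable names mine) enumerates all eight cases and checks that both sides agree:

```python
from itertools import product

# Exhaustive truth table: 2**3 = 8 rows, one per assignment of a, b, c.
ok = all(((a or b) or c) == (a or (b or c))
         for a, b, c in product([False, True], repeat=3))
print(ok)  # True
```

If your axiom set is an algebraic one (e.g. Boolean algebra axioms), you would instead derive the identity symbolically; the truth table only works because propositional logic is decidable by case enumeration.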
