Can you do without conditional proof if you have indirect proof?

Did post #19 suffice to answer your original question?

Yes it did–thanks!

-KR

You don’t even need all of those. Literally all you need for prop logic is De Morgan’s law and double negation, i.e. (A ^ B) = --(A ^ B) = -(-A v -B), and the rest can follow from resolution:

A v -B
-A v C
Therefore, B v C

All prop logic statements can be converted into a conjunction of “or” statements (i.e., clauses), and all proofs can be done by resolution.

(Technically this means all you need for logic is an or operator and a not operator, but and is so common we’ll be generous and grant it and DeMorgan’s at minimum)

Forgot to mention:

Proofs by resolution are proofs by contradiction. You assume the opposite is true and derive the empty clause.

I think you mean

A v B
-A v C
Therefore, B v C

And you’ll also need a rule saying “If you can derive falsehood (i.e., the 0-ary disjunction) from the added assumption of A, then you can conclude -A”; otherwise, you’d never be able to get off the ground. And, for Frylock’s purposes, with “->” in the language, you’ll also need a rule defining “->” in terms of &, v, and - (“A -> B = -A v B”, say). But, yes, that’d be enough.
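For concreteness, here’s that corrected resolution step as a tiny Python sketch (the clause representation as frozensets of literal strings, with a leading '-' for negation, and the helper names are just my illustrative choices):

```python
# A sketch of one resolution step on clauses represented as
# frozensets of literal strings, with negation as a leading '-'.

def negate(lit):
    """Flip the sign of a literal: 'A' <-> '-A'."""
    return lit[1:] if lit.startswith('-') else '-' + lit

def resolve(c1, c2, lit):
    """Resolve c1 (containing lit) against c2 (containing -lit)."""
    assert lit in c1 and negate(lit) in c2
    return (c1 - {lit}) | (c2 - {negate(lit)})

# From A v B and -A v C, derive B v C:
print(sorted(resolve(frozenset({'A', 'B'}), frozenset({'-A', 'C'}), 'A')))
# -> ['B', 'C']
```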

On edit: Whoops, should have refreshed. You noted the proof by contradiction bit already.

Oops

But I was just mentioning all you need for prop logic; obviously you can define “->” if you really want (just like you can define xor if you really want, even though a lot of people don’t use it). I was just pointing out that you don’t HAVE to.

Sure, sure. I mean, there’s nothing special about v, &, and - either; you don’t have to have them. Your language could be just the 0-ary connectives true and false, and the 3-ary connective (a ? b : c) [i.e., *if a, then b, else c*], and your rules could be

true ? b : c = b
false ? b : c = c
f(x) = x ? f(true) : f(false)

with a derivation of a statement being a derivation of its equality to true.

Indeed, as I see it, this is the soul of Boolean algebra. Then, of course, you could define all the other connectives out of these (-a = a ? false : true, a v b = a ? true : (b ? true : false), a -> b = a ? b : true, and so on…).
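If it helps, here’s that presentation as a small Python sketch, treating the 3-ary connective as an ordinary function on booleans (the function names are just my own illustrative choices):

```python
# A sketch of the (a ? b : c) presentation of Boolean algebra.

def ite(a, b, c):
    """The 3-ary connective (a ? b : c): if a, then b, else c."""
    return b if a else c

# Defining the other connectives out of ite, per the post above:
def not_(a):       return ite(a, False, True)               # -a
def or_(a, b):     return ite(a, True, ite(b, True, False)) # a v b
def implies(a, b): return ite(a, b, True)                   # a -> b

# Quick truth-table check over all inputs:
for a in (False, True):
    for b in (False, True):
        assert or_(a, b) == (a or b)
        assert implies(a, b) == ((not a) or b)
```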

Whoops, come to think of it, you didn’t say enough: you need a distributivity rule to be able to rewrite things into conjunctive normal form.

That is, your rules will not be enough to derive a contradiction from

A v (B & C), -((A v B) & (A v C))

Proof: All your rules would still be valid if we interpreted negation standardly, but re-interpreted “A v B” as meaning “true, no matter what A and B are”, and “A & B” as meaning “false, no matter what A and B are”. However, under this re-interpretation, one cannot obtain a contradiction from the above two propositions, as both will amount to tautologies.
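That re-interpretation argument is easy to machine-check; here’s a quick Python sketch, reading “X v Y” as constantly true, “X & Y” as constantly false, and keeping negation standard:

```python
# Check the re-interpretation: both propositions come out true
# for every assignment, so no contradiction can be derived.

def v(x, y): return True        # re-interpreted disjunction
def conj(x, y): return False    # re-interpreted conjunction
def neg(x): return not x        # negation kept standard

for A in (False, True):
    for B in (False, True):
        for C in (False, True):
            p1 = v(A, conj(B, C))             # A v (B & C)
            p2 = neg(conj(v(A, B), v(A, C)))  # -((A v B) & (A v C))
            assert p1 and p2                  # both are tautologies
```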

So all spelled out, the resolution rules are like so:

The primitive operations are negation (-) and disjunction (v)
Disjunctions form an abelian monoid; you can re-order and re-associate them at will. [Alternatively, you could reword the resolution rule to be agnostic to such issues]
Negation is involutive; you may replace --X by X or vice versa anywhere at will.
Disjunction distributes over its De Morgan dual [the operation defined by A & B = -(-A v -B)]; you may replace X v (Y & Z) by (X v Y) & (X v Z) anywhere at will.
From A v B and -A v C, you may derive B v C
If adding the assumption A lets you derive the empty disjunction, you may derive -A
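Tying those rules together, here’s a sketch in Python of the whole refutation procedure. It assumes the normal-form steps (re-ordering, re-association, double negation, distribution, and splitting conjunctions into separate clauses) have already been applied, so the input arrives as a set of clauses:

```python
# A sketch of resolution refutation over a set of clauses
# (frozensets of literal strings, '-' marking negation).

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith('-') else '-' + lit

def refutable(clauses):
    """Return True iff the empty disjunction is derivable by resolution."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for lit in c1:
                if negate(lit) in c2:
                    resolvent = (c1 - {lit}) | (c2 - {negate(lit)})
                    if not resolvent:
                        return True   # empty disjunction: contradiction
                    new.add(resolvent)
        if new <= clauses:
            return False  # saturated without finding a contradiction
        clauses |= new

# To prove P, refute -P. E.g. to prove A v -A, refute its negation,
# whose clauses are {A} and {-A}:
print(refutable([{'A'}, {'-A'}]))  # -> True
```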

It’s alright, but it doesn’t seem so much more minimalistic to me than anything else.

Er, and also, toss “From A & B, you may derive A” on top of all that.

If we’re talking about minimal sets of operators, you just need the Sheffer stroke. But what I was working on was something intended (at least originally) to be more lay-intuitive.

I’m actually really surprised to hear you say that anything is the soul of anything. :wink:

Can you say something, or point me to something, explaining in what sense false is the 0-ary case of v, and true the 0-ary case of &?

Well, try looking at it this way:

(false v A) is logically equivalent to A - it evaluates to true if A is true, and to false otherwise. In the same way, (false v A) v B is equivalent to A v B.

(true & A) is logically equivalent to A, in the same way.

Therefore, if you were to ‘run the v program’ without supplying any inputs at all, it would default to false - it returns true iff at least one of the inputs is true. Similarly, if you had an & program that could take arbitrarily many inputs, the easiest way to describe its behavior is that it returns false iff at least one of the inputs is false. Saying that it ‘returns true iff all the inputs are true’ is logically equivalent, but when you put it in terms of ‘at least one’, it’s easier to see the logically consistent behavior for 0 inputs. So without any input, the & program will return true.

Does that make any sense?
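For what it’s worth, this convention is built right into programming languages; in Python, for instance, the variadic ‘or program’ and ‘and program’ are the built-ins any and all, and they behave exactly as described on zero inputs:

```python
# Python's variadic disjunction and conjunction on empty input:
print(any([]))  # False: no input is true, so the empty "or" is false
print(all([]))  # True: no input is false, so the empty "and" is true
```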

It does make a kind of sense, but feels like fudging when you start talking about how to make it “easier to see” something or the “easiest way to describe” behavior.

Okay, then, consider this…

A: All the barbers in this town are Chinese.

B: You’re lying. I’ve met everybody in town, and I didn’t see any Chinese barbers.

A: Did you see any barbers at all?

B: Well, no, I guess not. But if there aren’t any barbers, then you can’t say anything about them.

A: No, it’s true. All the barbers of this town are Chinese, and what’s more, they’re all missing a finger.

Do you agree with B, or can you see it from A’s (slightly whimsical) point of view? Speaking logically, ‘all of group A can be described by B’ is true if group A is empty - just like the empty set is a subset of every other set.

Sure. I can point you to your own attempt to explain it, slightly generalized.

Think of A & B & C & D & … as meaning “All of the conjuncts are true”. If there are no conjuncts, this is vacuously true.

Think of A v B v C v D v … as meaning “There exists a disjunct which is true”. If there are no disjuncts, this is vacuously false.

Also: Think of True as positive integers, and False as 0. Then & is multiplication, and v is addition. Empty multiplication is 1, empty addition is 0.

In general, whenever one has a monoid (an associative operation with an identity element), one thinks of the identity element as the 0-ary case of the monoid operation. This lets one cleanly unify many analyses. [And also means monoids can alternatively be characterized by the description in the second link above.]
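The arithmetic version of the analogy can be checked the same way; in Python, an empty sum and an empty product default to the respective identity elements:

```python
import math

print(sum([]))        # 0, the identity for +  (the "0-ary v")
print(math.prod([]))  # 1, the identity for *  (the "0-ary &")
```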

We can also just look at the 0-ary case of the proof rules you gave:

We can derive A_1 & … & A_n from derivations of each of A_1, …, A_n. In the 0-ary case, there are no prerequisites; thus, we can derive the 0-ary conjunction for free. This makes the 0-ary conjunction as good as True.

We can derive C from A_1 v … v A_n, along with the further requirements of each of A_1 -> C, …, A_n -> C. In the 0-ary case, there are no further requirements; thus, we can derive arbitrary C from the 0-ary disjunction. This makes the 0-ary disjunction as good as False.

Don’t worry. If anyone pressed me on it, I’d quickly concede that Boolean algebra is a multifarious subject which can be fruitfully understood from many disparate perspectives and via many disparate mathematical analogies, and also, that there’s no such thing as souls. :slight_smile:

If I may continue my pedantry, the (2-ary) Sheffer stroke isn’t quite sufficient to axiomatize Boolean algebra. You can’t build any of the 0-ary operations from it (sure, you can get 1-ary operations which ignore their input, but that’s not the same thing; the empty set can be given a Sheffer stroke structure, but the empty set is not a Boolean algebra). Easily patched, of course; a proper minimal set of operators would be the 2-ary Sheffer stroke along with the 0-ary operation False (that is, the negations of the 2-ary and 0-ary ANDs).
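Here’s that patched minimal set as a Python sketch, building everything from the 2-ary Sheffer stroke (NAND) and the 0-ary operation False (the helper names are mine):

```python
# A sketch: 2-ary NAND plus 0-ary False generate the rest.

def nand(a, b): return not (a and b)  # the Sheffer stroke
FALSE = False                         # the 0-ary operation

def not_(a):    return nand(a, a)
TRUE = not_(FALSE)                    # the other 0-ary op, now definable
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

# Truth-table check:
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```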

I have no recollection of writing that at all. But now that I’ve read it, and the rest of your post, I can see how the analogy works, and can understand the justification in terms of unified analyses.

I will (apparently) not remember any of this four years hence, I’m afraid. :frowning: