[FOM] EFQ and Tennant's consistency--reply to Arnon Avron

Tennant, Neil tennant.9 at osu.edu
Wed Sep 9 22:03:39 EDT 2015


Arnon Avron writes

"In his postings Neil maintains that ~p & p |- q is [an] invalid inference
that mathematicians never use, while ~p |- p->q is valid."

I don't think I've actually said that mathematicians 'never use' EFQ or any of its obvious equivalents; this would be rash, because many of them (Harvey, for example) are steeped in an orthodox logical training, are aware of what the orthodox logical systems allow and prohibit, and make their mathematical moves accordingly. What I am maintaining is that all those apparently essential appeals to a *rule* of EFQ are very easily accommodated by a relevantist who refuses to take on EFQ as a general rule in his logical system.

When one inspects the fine grain of mathematical reasoning more closely, one finds that the apparent appeals to EFQ arise in two contexts: wanting to infer P->Q after deriving # from P; and wanting, in a proof by cases where one case-proof ends with #, to bring down as the main conclusion the conclusion of the other case-proof. If you have EFQ, as an orthodox logician does, you simply apply it to "top up" the subordinate conclusion to be what it needs to be in order for the conventionally formulated rule of conditional proof (->I), or the conventionally formulated rule of proof by cases (vE), to be applicable.
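
To make these two contexts vivid, here is a minimal sketch in Lean 4. (The example is my own illustration, not anything from the thread; Lean counts as 'orthodox' in the relevant sense, since EFQ is built in as False.elim, and the hypothesis names are mine.)

    -- Context 1: inferring P -> Q after deriving absurdity from P.
    -- The EFQ step (False.elim) "tops up" the dead end to Q so that
    -- the conventional ->I can close the derivation.
    example (P Q : Prop) (hnp : ¬P) : P → Q :=
      fun hp => False.elim (hnp hp)

    -- Context 2: proof by cases (vE) where one case-proof ends in
    -- absurdity. EFQ "tops up" the dead case to R, matching the
    -- conclusion of the live case.
    example (P Q R : Prop) (h : P ∨ Q) (hnp : ¬P) (hq : Q → R) : R :=
      h.elim (fun hp => False.elim (hnp hp)) hq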

The alternative that the Core logician commends is that one make do without EFQ as an explicit rule, and simply re-formulate ->I and vE so as to permit one to infer P->Q directly upon deriving # from P, and to infer the desired case-proof conclusion directly upon having closed off the other case-proof with # as its conclusion.
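
Schematically, in sequent style (this LaTeX rendering is my own gloss, not an official formulation; # marks absurdity, and X, Y, Z collect side premises):

    \[
    (\to I)\colon\ \frac{X, P \vdash \#}{X \vdash P \to Q}
    \qquad
    (\vee E)\colon\ \frac{X \vdash P \vee Q \quad Y, P \vdash \# \quad Z, Q \vdash R}{X, Y, Z \vdash R}
    \]

No explicit step of EFQ appears anywhere: the dead case is simply discharged, and the desired conclusion is brought down directly.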

Arnon writes further

"The latter [i.e., ~p |- p->q] is justified by [Neil] (in the classical case, I guess)
by the truth table of the material implication (and that of negation
is needed too, right?)."

No. First, it is well known that for the intuitionist/constructivist the truth tables are acceptable, row by row, reading from left to right (that is, from the status of the constituent propositions to the status of the compound in question). The Core logician, who is a constructivist seeking to ensure relevance of the premises of proofs to their conclusions, need not 'think classically' at all in accepting the truth tables. Secondly, that part of the modified rule of ->I in Core Logic that permits one to conclude P->Q upon deriving # from P makes no appeal at all to the truth table for any connective other than ->.

Arnon asks rhetorically

"... how exactly is the validity of ~p |- p->q forced by the truth-tables of -> and ~?"

But rather than waiting for the Core logician's answer, he rushes to provide a conjectured one himself:

"Answer: it is because according to these truth table every model of ~p is also
a model of p->q. In other words: because the set of models of ~p is a subset of the
set of models of p->q."

For the record: I simply don't see the justification as proceeding along such lines at all. Arnon's talk of models is his, not mine. For me, the justification is carried out entirely within an inferentialist framework. I look at the four rows of the truth table for ->, and note that in the two rows where the antecedent is false, the whole conditional is true. So the inference to P->Q from possession of a refutation of P is immediate *and relevant*. One does not even need a step of negation introduction (to get ~P) before proceeding immediately to the conclusion P->Q. Nor need one entertain any qualm over the allegation that one might be making a classical semantic assumption to the effect that the consequent has to be either true or false! Even if the consequent lacks a truth value, the falsity of the antecedent should suffice for the truth of the conditional, since it suffices both when the consequent is true and when the consequent is false.
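
For reference, the four rows in question:

    P   Q   P->Q
    -------------
    T   T    T
    T   F    F
    F   T    T
    F   F    T

The last two rows do all the work: a false antecedent settles the conditional as true whichever value (if any) the consequent takes.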

Arnon continues

"But by the same token, the truth tables of & and ~ force  the set of models
of ~p & p to be  a subset of the set of models of q (remember
that Neil did accept that the empty set is a subset of any other set,
and he even proved this in "core logic"). So according to his
own arguments, Neil should accept the validity of ~p & p |- q!"

Again, no. The talk of models is Arnon's alone, and is being foisted on the Core logician for no good reason. The Core logician might in due course take an interest in the preservation of truth from premises to conclusions across all models; but he will also insist on regarding as needing-to-be-proved only those valid arguments whose premises are genuinely relevant to their conclusions, which is always the case in mathematical reasoning.

For me, models earn their keep by serving as counterexamples to (classically) invalid arguments. X:A is invalid, by definition, just in case some model makes every member of X true and makes A false. So no invalid argument has an unsatisfiable set of premises; whence even a complete proof system fails to furnish a proof of the inconsistency of the premise-set of any invalid argument. When we have a sound and complete proof system, we know that the validity of an argument can be shown by means of a checkable proof, and that the invalidity of an argument consists in its having a counterexample. Hence every argument has either a proof or a counterexample (completeness), and no argument has both a proof and a counterexample (soundness). But all this is not enough to ensure that valid arguments do not involve spectacularly counterintuitive irrelevancies. For A,~A:B is valid, and is spectacularly irrelevant.
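
In symbols (my rendering of the definition just stated):

    \[
    X : A \ \text{is invalid} \iff \exists M \bigl( M \models B \ \text{for every}\ B \in X \ \text{and}\ M \not\models A \bigr)
    \]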

Mathematical reasoning, by contrast, is always valid AND relevant. Hence we are not setting the bar high enough, for a logical system that is to serve for the regimentation of mathematical reasoning, if we require only validity of the arguments that the system proves. We need more. We need to know also that the system proves only those valid arguments whose premises are relevant to their conclusions.

That is why I commend Classical Core Logic as the system of choice for the regimentation of informal but expert classical mathematical reasoning. I commend it also for the fields of both automated and interactive theorem-proving, because of the powerful `relevance filter' that it affords. The formalization of any good mathematical argument (i.e., the sequent X:A that is required to be proved) should get through the filter.

Neil Tennant