[FOM] The Role of Formalization

Neil Tennant neilt at mercutio.cohums.ohio-state.edu
Mon Aug 2 15:46:42 EDT 2004


On Mon, 2 Aug 2004 henriknordmark at mac.com wrote:

> Recently, there has been a fair amount of interesting discussion about 
> the scope of mathematics and about mathematical rigor. This reminded me 
> of an anecdote about a student of axiomatic set theory who insisted on 
> writing all of his proofs for homework exercises in an extremely 
> rigorous fashion. All of his proofs were symbolic, and with the 
> appropriate software these proofs could probably have been checked by a 
> computer. The grader was a bit confused as to what to do with these 
> proofs, since they were correct and very rigorous but not really 
> what was expected from him. After a while, the professor and the grader 
> recognized that it was in some sense admirable that he wrote 
> his proofs so formally, but requested that he start writing his proofs 
> at a higher level of abstraction, which was more appropriate for 
> human-readability.

Perhaps readability by human beings is best ensured not so much by
"writing ... proofs at a higher level of abstraction", as by endeavouring
always to use such formalisms as permit one to give perfectly formal
proofs that are more nearly *homologous* to the patterns of informal
reasoning that human mathematicians typically employ. 

The student who was being "too rigorous" or "too formal" might have been
judged that way only because the formal system within which he was
constructing his proofs was itself wanting, in not being "natural" enough.
No one would wish to read proofs even of simple mathematical results if
the formalism were, say, based on a Hilbert system for logic. Matters are
greatly improved if one moves to a Gentzen--Prawitz-style system of
natural deduction. But even that won't be enough, say, for set theory, if
one then grafts on the axioms for ZF as long formulae using only epsilon
as an extra primitive.
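
To make the contrast concrete, here is a standard illustration (my own
sketch, not from Tennant's post): the pairing axiom stated as a single
epsilon-formula, versus introduction and elimination rules that govern a
primitive pair-term {a,b} directly.

```latex
% Pairing, Bourbaki-style: one long formula with epsilon as sole primitive.
\forall a\,\forall b\,\exists c\,\forall x\,
  \bigl(x \in c \leftrightarrow (x = a \lor x = b)\bigr)

% Versus natural-deduction-style rules for a primitive term \{a,b\}:
% introduction rules
\frac{t = a}{t \in \{a,b\}}
\qquad
\frac{t = b}{t \in \{a,b\}}
% elimination rule: from t \in \{a,b\}, argue by the cases t = a and t = b
\frac{t \in \{a,b\}
      \qquad \begin{matrix}[t = a]\\ \vdots\\ \varphi\end{matrix}
      \qquad \begin{matrix}[t = b]\\ \vdots\\ \varphi\end{matrix}}
     {\varphi}
```

A proof built from the rule-form steps directly between claims about
membership in {a,b}, whereas a proof from the axiom-form must first
instantiate quantifiers and dismantle the biconditional before any
mathematically contentful step can occur.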

IMHO the formalization of mathematics, if it is to enable
machine-checking as well as readability-by-human-beings, is going to need
much more careful design of its basic rules of inference. We need to deal
not only with connectives, quantifiers and the identity sign, but also 
directly with the various mathematical primitives. By "mathematical
primitives" here I mean not just the epsilon in a Bourbaki-style
axiomatization of a given branch of mathematics in ZFC. I mean, rather,
the notions that are treated as linguistically primitive in the usual
course of ordinary mathematical talk. Only occasionally does one 
explicitly relate such primitives to other ones via definitions. These
definitions of mathematical notions need to be framed by means of
primitive rules of inference, in a well-founded fashion, so that one's
reasoning can remain, as much as possible, at the higher levels of
abstraction, without using the rules to "unpack" any definitions.
Readability by human beings will then be a byproduct of ensuring that one
does not expose more logical structure than is needed in order to carry
out a proof (the maxim of shallow analysis). Moreover, by keeping all
premises and rules *relevant* to the train of reasoning, one ensures that
one uses as few premises as possible in the antecedent of the sequent
proved (the maxim of narrow analysis).
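
As a small worked example of reasoning that stays at the higher level of
abstraction (my own sketch under the standard rules for the subset sign,
not from the post): frame the definition of a \subseteq b as a pair of
primitive rules, and then transitivity of \subseteq is provable without
ever unpacking \subseteq into its epsilon-definition.

```latex
% Primitive rules framing the definition of \subseteq:
%   intro: from a derivation of x \in b from the assumption x \in a
%          (x not free in any other assumption), infer a \subseteq b;
%   elim:  from a \subseteq b and t \in a, infer t \in b.

% Transitivity of \subseteq, using only these rules:
\dfrac{\dfrac{\dfrac{a \subseteq b \qquad [x \in a]^{1}}
                    {x \in b}
              \qquad b \subseteq c}
             {x \in c}}
      {a \subseteq c}\;(1)
```

Each step is an application of \subseteq-elimination, and the final step
discharges the assumption x \in a by \subseteq-introduction; no formula
longer than the conclusion itself ever appears, which is just what the
maxim of shallow analysis asks for.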

Neil Tennant
