[FOM] Proof Identity

Jon Cohen Jonathan.Cohen at anu.edu.au
Wed Dec 7 18:56:45 EST 2005


Hi,

On Wed, 2005-12-07 at 07:26, Alessio Guglielmi wrote:
> About Prawitz's normalisation: for sure, a lot of interesting research has come out 
> of it. However, we all know that it doesn't work well for classical 
> logic. Proof theorists instinctively associate cut-free proofs *in 
> some way* to proof identity. 

I think that part of the problem here lies in extending Prawitz's claim
to the sequent calculus in the first place. Indeed, Prawitz made the
claim for ND only. In the same volume in which Prawitz made his claim
(Proceedings of the Second Scandinavian Logic Symposium), Kreisel (page
113) argues that care must be taken in carrying the claim over to the
sequent calculus:

"...the normalization procedure for natural deduction does not
correspond to a particularly natural cut elimination procedure.
Consequently we shall have to look at the usual formulations of a
calculus of sequents and vary the \emph{order} in which different cuts
are considered, and see if different orders lead to noncongruent
cut-free derivations..."

A more modern way to address these concerns is to enrich the categorical
semantics in such a way that it captures differences in cut reductions
(for instance, Fuhrmann and Pym's Classical Categories
http://www.cs.bath.ac.uk/~pym/semclasspro.html). Another possibility is
to enrich over a rewriting system, which leads to the notion of
sesquicategories, discussed by Street in his article in Volume 1 of the
Handbook of Algebra.

Now, one may reasonably wonder whether Prawitz's claim gives the whole
story even for ND. This is certainly not obvious, since it amounts to
saying that beta-reduction is the best way of capturing equivalence of
lambda terms, and it is not obvious that this is fine-grained enough.
Similar concerns are raised by van Benthem in the context of categorial
grammars
(http://www.illc.uva.nl/Publications/ResearchReports/PP-2003-20.text.pdf
note 1).
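To make the Curry-Howard reading of Prawitz's claim concrete, here is a
minimal sketch (mine, not from the literature) of identifying proofs by
beta-normal form: the direct ND derivation of A -> A and one with an
introduction/elimination detour become lambda terms with the same normal
form, so Prawitz's criterion counts them as the same proof.

```python
# Lambda terms as tuples: ('var', name) | ('lam', name, body) | ('app', f, a)

def subst(t, x, s):
    """Naive substitution t[x := s]; assumes binders avoid the free vars of s."""
    kind = t[0]
    if kind == 'var':
        return s if t[1] == x else t
    if kind == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, s))
    return ('app', subst(t[1], x, s), subst(t[2], x, s))

def normalize(t):
    """Contract beta-redexes, head first, until none remain."""
    kind = t[0]
    if kind == 'var':
        return t
    if kind == 'lam':
        return ('lam', t[1], normalize(t[2]))
    f = normalize(t[1])
    if f[0] == 'lam':
        return normalize(subst(f[2], f[1], t[2]))
    return ('app', f, normalize(t[2]))

# Two syntactically distinct derivations of A -> A:
direct = ('lam', 'x', ('var', 'x'))                                  # \x. x
detour = ('lam', 'x', ('app', ('lam', 'y', ('var', 'y')), ('var', 'x')))  # \x. (\y. y) x
```

Here normalize(detour) and normalize(direct) coincide, which is exactly
the identification the claim makes; the question in the post is whether
this quotient is too coarse.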

> especially because deep 
> inference gets rid of all the bureaucracy associated to rule 
> permutations. 

Does ND not do this?

> A question that 
> should be asked, then, is: how can I design inference rules such that 
> I capture a given notion of proof identity? 

It is inescapable that proof theory and category theory are intimately
linked: one may always study the category freely generated by a
deductive system. Now, various deductive rules give rise to various
structures on the resultant category. The question of whether the
inference rules are natural then becomes the question of whether they
create something categorically natural. Moreover, one wants to know that
the various structures imposed by the inference rules (tensor product,
function space, etc.) play nicely with each other. In other words, one
wants a coherence theorem. Armed with a coherence theorem, one then has
a natural notion of proof identity. So, my interpretation of your
question is: "How should one design coherence conditions?" In general,
this is a deep problem in category theory. If one is to pursue a deep
inference proof theory as distinct from categorical proof theory, then
the relationship with categorical proof theory should be explicated and
the points raised by Hughes
(http://citeseer.ist.psu.edu/hughes04deep.html) ought to be addressed.
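As a toy illustration (hypothetical rule names, a sketch only): in the
freely generated category, objects are formulas, morphisms are proofs
built by composing inference rules, and two proofs are identical exactly
when they are the same composite; coherence conditions then impose
further equations on this free structure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Arrow:
    src: str      # premise formula
    tgt: str      # conclusion formula
    steps: tuple  # sequence of rule names making up the proof

def identity(formula):
    """The empty proof of formula from itself."""
    return Arrow(formula, formula, ())

def compose(g, f):
    """g after f: run proof f, then proof g (defined when the types match)."""
    if f.tgt != g.src:
        raise TypeError("proofs do not compose")
    return Arrow(f.src, g.tgt, f.steps + g.steps)

# Generating arrows of a toy deductive system (hypothetical rules r1, r2):
r1 = Arrow('A', 'B', ('r1',))
r2 = Arrow('B', 'C', ('r2',))

proof = compose(r2, r1)  # a proof of C from A, recorded as ('r1', 'r2')
```

The unit and associativity laws hold automatically (paths concatenate),
and in the free category no further identifications are made; choosing
which extra equations to impose is precisely the coherence question.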

best,
Jon
-- 
http://rsise.anu.edu.au/~jon


