FOM: Proposal: Unified Foundations

Vaughan R. Pratt pratt at cs.Stanford.EDU
Wed Oct 29 15:33:02 EST 1997


(Sorry about the turgid prose in the following, it's a bad habit of
mine.  I cleaned up some of it but it's still bad in spots.  -vp)

I'd like to propose the following as a suitable program of research for
anyone interested in making the sort of sense of structuralism that
Harvey has been quite rightly demanding.  The general idea is to unify
the propositional and constructive views on foundations as a single
coherent framework.

The rationale for this program is as follows.

First, with suitable care this program has the potential to inherit
essentially intact the entire tradition and technology of two important
tracks in foundations.  The first is the conservative track defending
FOL, more than adequately represented on FOM.  The second is the
constructive track originating with Brouwer (on whose life btw Dirk van
Dalen will speak at LICS this year).  More specifically, it focuses on
the path through Gentzen culminating in Girard's linear logic (LL).
This path, though alive and well in some CS and logic circles,
especially in Europe, seems almost completely unrepresented here.

The program proposed here is in a position to make full and effective
use both of the "king" (Elvis?), as Harvey calls first-order logic, and
of the emerging picture of the structure of linear logic, admittedly
not yet even royalty in the more conservative sectors of logic but
already quite detailed after only a decade.

Second, this program sets more specific and ambitious goals than
proposals to develop structuralism along tracks other than the
Brouwer-Gentzen-Girard (BGG) track (the BeeGee's? no, the Beatles).
The future I see for those programs is gradual alignment with both FOL
and LL as they develop.  (I'd be very happy to be proved wrong by the
emergence of a visibly better track, now *that* would be exciting new
logic.)

Third, even if you don't agree that this is the best way of going about
formalizing structuralism, it provides a reference against which to
measure the breadth of scope and technical depth of the goals and
progress of competing approaches.  The basis for this claim is the
combined authority of both tracks, the prospective unification of which
is hard to imagine beating as a benchmark for goals in FOM.

Fourth, the goals of this program are both realistic and very
worthwhile.  I would not have said so two years ago, but since then I
have seen enough new results and built up enough perspective on the
overall picture to feel justifiably confident of this position.

I think it's good to have a role model when one exists.  An excellent
role model I see for this program, at more than one level, is physics.
Physics took light to consist of waves in the 19th century, but in the
20th century saw that this model is isomorphic (more precisely, dual)
to what had been considered a competing but failed model, light as
particles.  Each view is mathematically equivalent to the other, but
one or the other may be psychologically clearer depending on the
circumstances.  The most primitive duality containing the seeds of this
correspondence is that between elements and characters of finite
abelian groups, with the next two small steps in the elaboration being
locally compact abelian groups and then Hopf algebras.
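
(For the record, the finite case is standard Pontryagin duality and
involves nothing specific to the present proposal.  In LaTeX notation:

    \widehat{G} = \mathrm{Hom}(G, \mathbb{C}^{\times}), \qquad
    G \xrightarrow{\;\sim\;} \widehat{\widehat{G}}, \quad
    g \mapsto (\chi \mapsto \chi(g)),

naturally in finite abelian G.  For example G = Z/n has characters
\chi_k(m) = e^{2\pi i km/n}, whence \widehat{G} \cong Z/n, though not
naturally so.)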

(The "will" in the next bit is not intended as a prediction, only as a
literary device for my aspirations.)

With respect to this particular role model for foundations, and no
other, I see mathematics as following in physics' footsteps a century
behind.  In the 20th century we seem to have committed to a view of
mathematics as the sum of its theorems.  In the 21st century we will
see that this view is dual to that of mathematics as the fabric of its
proofs.  Circumstances will determine which of these mathematically
equivalent views gives the psychologically clearer picture for any given
situation.  The most primitive mathematical duality containing the
seeds of this correspondence is that between vertices (as propositions)
and edges (as proofs) in (not graphs but) finite chains with bottom,
with the next two small steps being topological chains and then Stone
duality.  The most primitive commonsense counterpart of this duality is
that of being and doing.
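
(A standard finite instance pinning this down, though not necessarily
the exact formulation I would ultimately use, is Birkhoff's
representation of finite distributive lattices:

    D \;\cong\; \mathcal{O}(J(D)),

where J(D) is the poset of join-irreducibles of D and \mathcal{O}(P)
the lattice of down-sets of P.  For the chain with bottom
0 < 1 < ... < n the join-irreducibles are exactly the n elements that
cover something, one per covering edge, so the chain of n+1 vertices
and the chain of n edges each determine the other.  Stone duality is
the topological extension of this picture.)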

As a double instance of ontogeny recapitulating phylogeny (both the
finite chains and the sum-fabric distinction), mathematics itself from
a sufficient distance will be seen as having the form of an edge
connecting two vertices.  At one end sit the sets, at the other the
(complete atomic) Boolean algebras.  Along the interior of the edge are
arrayed "the rest of mathematics".  The objects of all toposes,
including directed graphs, simplicial sets, and more generally
presheaves, as well as such non-topos objects as posets, topological
spaces, metric spaces, and small categories themselves, all sit near
the set end.  Distributive and algebraic lattices, frames, fields, and
the objects of all dual toposes take up their positions near the
Boolean end.  Chains, abelian groups, R-modules, vector spaces, Banach
spaces, Hilbert spaces, and the objects of all other abelian
categories, sit at or near the middle.  (Through an obscure artifact of
how I assign positions to finite objects, a conspicuously large gap
appears immediately on either side of the middle.)

First-order logic, with Goedel's completeness proof intact, will be
absorbed into a larger completeness proof for the whole picture, in
which both the discreteness of sets and the coherence of structure will
participate equally.  The general pattern of development will be along
the lines of the full completeness proofs emerging in the last four
years for fragments of linear logic as models of fragments of
mathematics, with both fragments continuing to expand until the picture
envisaged by this program is complete.  My current research is focused
on this expansion, with that eventual result, a complete reconciliation
of Elvis and the Beatles, as the goal.

-------------------

I'm not sure whether the following adds to or subtracts from the
program as conceived above.  Let me include it anyway as an attachment
and see what sort of reaction it elicits.  If largely negative I'll
infer that it would be better kept to myself for the time being.

Cartesian dualism, Descartes' partitioning of the universe into mind
and body, supplies another perspective on this program.

The initial focus of FOL is on the endpoints of the mathematical
universe, with sets of individuals at the set end and Boolean algebras
of predicates at the Boolean end, correlating respectively with body
and mind.  FOL then moves inwards to reach the rest of mathematics via
nonlogical axioms.
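
(By way of the textbook example of the latter: group theory is FOL
over the signature (\cdot, e, {}^{-1}) plus the nonlogical axioms

    \forall x\,y\,z.\; (x \cdot y) \cdot z = x \cdot (y \cdot z), \qquad
    \forall x.\; x \cdot e = x = e \cdot x, \qquad
    \forall x.\; x \cdot x^{-1} = e,

whose models are exactly the groups.)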

The initial focus of linear logic is the center, with "perp" realizing
reflection about the middle and with growth outward from the middle.
The role of the exponentials !A and ?A is to define the ends of the
universe, with objects of the form !A being setlike inasmuch as they
constitute a cartesian closed category (not necessarily a topos).  As
such they should create tension stretching the models of LL out to
actual sets and Boolean algebras.  However, Girard's own models of LL,
in particular Hilbert spaces realizing his geometry of interaction
program and more recently complete Banach spaces, are perfect examples
where !A and ?A are not creating enough tension to expand out from the
middle.  I'm a bit vague on what's going on here, but I suspect this
incomplete stretching stems from Girard's reluctance to make the
exponentials idempotent, which they aren't at the middle but are at the
ends of the full mathematical universe.
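
(For reference, the identities in play here are standard for LL:

    A^{\perp\perp} = A, \qquad (!A)^{\perp} = \,?(A^{\perp}),

with ! a comonad having dereliction \varepsilon_A : !A \multimap A and
digging \delta_A : !A \multimap !!A.  Idempotence in the above sense
would strengthen \delta to an isomorphism !!A \cong !A.)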

I see the corresponding dichotomy in physics, common to both classical
and quantum mechanics, as putting the individuals of first-order logic
in correspondence with points of space-time and its predicates with
values of momentum-energy.

(To ramble even further off the track here, for those expecting entropy
rather than energy as the natural correlate of
predicates/mind/information in the above, the discrepancy boils down to
a matter of statistical mechanics.  The equation dQ = T dS, when
divided by dt to relate watts (energy flux dQ/dt) to baud (negative
information flux dS/dt), quantifies the situation of someone having to
talk louder to be heard over the prevailing noise level, the
temperature T.  The basis for the formula, namely the energy partition
function, is the same statistical phenomenon that makes the
distribution of maximal cliques in a 1000-vertex random graph peak
sharply at 15, the "temperature" of the universe of graphs of that
size.  This statistical connection is transparent to computer
programmers, who measure mental flux in units of baud rather than
watts, by virtue of bits being implemented with thousands of atoms, a
situation expected to change qualitatively around 2020.)
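
(A quick sanity check of the clique figure, as a minimal sketch in
Python: this assumes the standard random graph model G(n, 1/2) and
Matula's asymptotic formula for its clique number, which is sharply
concentrated around the value computed below.

    from math import e, log2

    def clique_number_estimate(n):
        """Matula's estimate of the clique number of the random graph
        G(n, 1/2): 2 lg n - 2 lg lg n + 2 lg(e/2) + 1."""
        return 2*log2(n) - 2*log2(log2(n)) + 2*log2(e/2) + 1

    print(clique_number_estimate(1000))   # prints ~15.18

Nothing beyond textbook random graph theory is involved.)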

Russell's theory of neutral monism resolves the Hobbes-Berkeley choice
problem, mind or body but not both, in essentially the same way that
physics resolves the wave-or-particle-but-not-both choice, namely by
declaring them equivalent, but without any technical depth to speak of,
let alone a formal mathematical basis.  In
http://boole.stanford.edu/pub/ratmech.ps.gz (TAPSOFT'95) I outline a
way to make mathematical sense of neutral monism based on the duality
of Chu spaces.  The trick, which I carry out in the paper, is to derive
each of body-body interaction and mind-mind interaction from mind-body
interaction, rather than the much harder (though not intrinsically
impossible) converse direction attempted, and generally despaired of,
by Descartes and his critics.
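
(For readers who have not met them, the machinery is small: a Chu
space over a set K is a triple (A, r, X) with r : A x X -> K, and a
morphism (f, g) : (A, r, X) -> (B, s, Y) consists of functions
f : A -> B and g : Y -> X satisfying the adjointness condition

    s(f(a), y) \;=\; r(a, g(y)),

with duality (A, r, X)^{\perp} = (X, \breve{r}, A) where
\breve{r}(x, a) = r(a, x), an involution.  The mind-body reading of A,
X, and r is as in the paper; the definitions themselves are the
standard ones.)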

Vaughan Pratt


