the physicalization of metamathematics

Vaughan Pratt pratt at cs.stanford.edu
Mon Aug 1 02:41:08 EDT 2022


Some 20 years before Stephen Wolfram's book *A New Kind of Science*
appeared in 2002, Stephen called me at Stanford from LA to let me know that
very simple cellular automata could exhibit very complex behavior.  As this
was not news to me, I was at a loss as to how to respond, but was happy to
discuss his insight with him.

Forty years later Stephen is still on that kick.  In response to Monroe
Eskew's reasonable question last week, "Does it make testable
predictions?", Stephen offered what I would call an enigmatic response:
"Rather amazingly, yes."

In those forty years I've had time to mull over this question of whether
any combinatorial insights can contribute to the foundations of physics,
and if so what.

I have a short list of criteria for what might count as a "testable
prediction".  At the top of my list is:

1.  Planck's constant h.  Does your combinatorial theory predict its value?

Second on my list is:

2.  Boltzmann's constant k.  Same question.

The remaining predictions have to do with the concept of a particle as an
excited state of Fock space, which can be either a boson or a fermion, or
some superposition thereof (many years ago I attended a physics talk at
Stanford by a Yale graduate student whose thesis was on that third
possibility).

What I'd like to see is all of these notions flowing from the idea of the
Big Bang as simply a discrete "topological" space of cardinality 2^2^2^3 =
115,792,089,237,316,195,423,570,985,008,687,907,853,269,984,665,640,564,039,457,584,007,913,129,639,936.
Stephen wants physics to flow from something simple, and 2^2^2^3 is surely
simple, yet large enough to account for the size of what the Big Bang
produced.
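For what it's worth, the arithmetic is easy to check mechanically; a one-line
sketch in Python, where ** associates to the right just as the tower 2^2^2^3
is meant to:

```python
# Exponentiation is right-associative, so 2**2**2**3 == 2**(2**(2**3)) == 2**256.
n = 2**2**2**3
assert n == 2**256
print(len(str(n)))  # 78 decimal digits, matching the number quoted above
```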

I put "topological" in quotes to indicate that the customary requirement of
closure of the open sets under arbitrary union and finite intersection did
not apply in this idea.  Instead this is a large Chu space, which simply
drops those closure constraints while leaving the definition of a
continuous function unchanged.  Time emerges as a consequence of there being more
transformations (continuous functions) from a more discrete space to a less
discrete one than vice versa.  Planck's constant emerges as the reciprocal
of the product of the number of points and number of open sets aka states
of any given space, suitably scaled, obtained from its role in Heisenberg's
uncertainty principle.
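The counting claim can be illustrated on tiny examples.  A minimal sketch,
under the assumed representation of a finite Chu space over {0,1} as a 0-1
matrix (rows = points, columns = states), with the standard Chu-transform
condition r_A(a, g(y)) = r_B(f(a), y); the space names and the helper
`chu_transforms` are mine, for illustration only:

```python
from itertools import product

# Two-point discrete space: states are {}, {0}, {1}, {0,1}.
Discrete = [(0, 1, 0, 1),
            (0, 0, 1, 1)]
# Two-point indiscrete space: states are {} and {0,1} only.
Indiscrete = [(0, 1),
              (0, 1)]

def chu_transforms(A, B):
    """Brute-force enumeration of Chu transforms from A to B.

    A transform is a pair (f, g): f maps points of A forward to points
    of B, g maps states of B back to states of A, and adjointness
    A[a][g[y]] == B[f[a]][y] must hold for every point a and state y."""
    na, sa = len(A), len(A[0])
    nb, sb = len(B), len(B[0])
    return [(f, g)
            for f in product(range(nb), repeat=na)
            for g in product(range(sa), repeat=sb)
            if all(A[a][g[y]] == B[f[a]][y]
                   for a in range(na) for y in range(sb))]

print(len(chu_transforms(Discrete, Indiscrete)))  # 4
print(len(chu_transforms(Indiscrete, Discrete)))  # 2
```

On this pair there are 4 transforms from the discrete space to the
indiscrete one but only 2 the other way, in line with the asymmetry
appealed to above.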

What I don't have is an account of Boltzmann's constant k, in part for lack
of a concept of temperature.  In particular I have no idea how many
millions of degrees this model of the Big Bang started with.

Where my account differs from Stephen's proposed connections between math
and physics is that, rather than seeking to identify physical concepts in
combinatorial notions, it aims to steer the combinatorics of finite but
very large Chu spaces towards fundamental physics.

In my account, the vast majority of my Big Bang will surely have wandered
off into universes totally unlike those of our experience, having in common
only Planck's constant and perhaps also Boltzmann's constant.  What's
special about our universe is that it has particles as excited states of
Fock space, whose origin from the Big Bang needs to be retraced.

Our observable universe is special in that it is a far lower dimensional
temporal descendant of the Big Bang than the vast majority of its other
descendants.

Vaughan Pratt