the physicalization of metamathematics

José Manuel Rodríguez Caballero josephcmac at gmail.com
Fri Aug 5 10:32:20 EDT 2022


Vaughan Pratt asked:

> I have a short list of criteria for what might count as a "testable
> prediction".  At the top of my list is:
> 1.  Planck's constant h.  Does your combinatorial theory predict its value?
> Second on my list is:
> 2.  Boltzmann's constant k.  Same question.


I will answer as an external affiliate of the Wolfram Physics Project,
based on my own research, but my answer may differ from what S. Wolfram has
in mind, as there are various approaches within the group. So I speak only
for myself.

My starting point is to clarify that the goal of the Wolfram Physics
Project is to find a fundamental theory of physics. Hence, we are only
required to explain the fundamental constants (not other constants, which
may be emergent or mere artifacts of the choice of units). According to
John Baez,

https://math.ucr.edu/home/baez/constants.html

the fundamental constants are these 26 numbers:

   - the mass of the up quark
   - the mass of the down quark
   - the mass of the charmed quark
   - the mass of the strange quark
   - the mass of the top quark
   - the mass of the bottom quark
   - 4 numbers for the Kobayashi-Maskawa matrix

   - the mass of the electron
   - the mass of the electron neutrino
   - the mass of the muon
   - the mass of the mu neutrino
   - the mass of the tau
   - the mass of the tau neutrino
   - 4 numbers for the Pontecorvo-Maki-Nakagawa-Sakata matrix

   - the mass of the Higgs boson
   - the expectation value of the Higgs field

   - the U(1) coupling constant
   - the SU(2) coupling constant
   - the strong coupling constant

   - the cosmological constant


From the point of view of fundamental physics, the answer to this question
is trivial: both Planck's constant and Boltzmann's constant are equal to 1
because they are used as units of measure in this field. In the same
spirit, the speed of light in vacuum, c, is equal to 1 in fundamental physics.
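
As a rough illustration of what "used as units of measure" means (a
back-of-the-envelope sketch in Python, mine and not a result of the
project): once we set hbar = c = k_B = 1, the SI values of these constants
act only as conversion factors, e.g. between temperatures and energies, or
between masses and energies.

    # Sketch: hbar, c and k_B as mere conversion factors (natural units).
    # SI values (CODATA); the point is that no physics is contained in them.
    hbar = 1.054571817e-34   # J*s
    c = 2.99792458e8         # m/s
    k_B = 1.380649e-23       # J/K
    eV = 1.602176634e-19     # J per electronvolt

    # Room temperature quoted as an energy, k_B*T, in electronvolts:
    print(k_B * 300.0 / eV)            # ~0.026 eV

    # The electron mass quoted as an energy, m*c^2, in MeV:
    m_e = 9.1093837015e-31             # kg
    print(m_e * c**2 / (eV * 1e6))     # ~0.511 MeV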

There is a computational interpretation of the U(1) coupling constant
(listed above), which is in the spirit of the Wolfram Physics Project but
was developed independently of it by S. Lloyd. We can find it on page 17 of
his preprint:

https://arxiv.org/pdf/quant-ph/9908043.pdf

Notice that S. Lloyd expressed his computational interpretation in terms of
the fine structure constant which, according to J. Baez's essay cited
above, is the same quantity:

> The electromagnetic coupling constant [aka the U(1) coupling constant] is
> just another name for the fine structure constant.
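
As a reminder of why this particular constant survives any choice of units
(my own numerical check, not something taken from Lloyd's preprint): the
fine structure constant is the dimensionless combination
alpha = e^2 / (4*pi*epsilon_0*hbar*c), approximately 1/137.

    # Sketch: the fine structure constant is dimensionless, so its value
    # does not depend on the system of units (unlike h, k or c).
    import math

    e = 1.602176634e-19          # elementary charge, C
    epsilon_0 = 8.8541878128e-12 # vacuum permittivity, F/m
    hbar = 1.054571817e-34       # reduced Planck constant, J*s
    c = 2.99792458e8             # speed of light, m/s

    alpha = e**2 / (4 * math.pi * epsilon_0 * hbar * c)
    print(alpha, 1 / alpha)      # ~0.0072974, ~137.036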


My research at the Wolfram Physics Project was in the direction of the
computational interpretation of the cosmological constant (listed above).
My idea was to try to link it to the growth of the descriptive complexity
of the universe as a function of time, which I predicted to be logarithmic
for the whole universe (including all the many worlds of quantum
mechanics). Nevertheless, each copy of us in a given world (in the
quantum-mechanical sense) will see this function as linear rather than
logarithmic. This is for the same reason that, generically, a complete
binary tree of height n can be described using about log n bits (it is
enough to write down the number n), whereas each individual branch needs n
bits to be described (of course, in some cases there can be compression of
information). Among all the fundamental constants, I think this one is the
closest to the physicalization of mathematics and the easiest to adapt to
mathematics: a cosmological constant for the expansion of mathematical
space regarded as a deductive system. My hypothesis is that an expanding
space is linked to an increase in the descriptive complexity, as a function
of time, of the system that induces this space. I would like to clarify
that this is not a result, but just an idea to motivate research in that
direction.
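
To make the counting argument above concrete (a minimal sketch, under the
assumption that the tree is complete, i.e. every node branches): the whole
tree is determined by the single integer n, which takes about log2(n) bits
to write down, while one particular branch is a sequence of n left/right
choices, hence about n bits.

    # Sketch: descriptive complexity of a complete binary tree of height n
    # versus the descriptive complexity of a single (generic) branch.
    import math, random

    def bits_for_whole_tree(n):
        # the complete tree is reconstructed from the integer n alone
        return max(1, math.ceil(math.log2(n + 1)))

    def bits_for_one_branch(n):
        # one left/right choice per level; generically incompressible
        return n

    n = 10**6
    branch = "".join(random.choice("LR") for _ in range(n))  # a generic branch
    print(bits_for_whole_tree(n))   # about 20 bits
    print(bits_for_one_branch(n))   # 1000000 bits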

Finally, I would like to conclude with a testable prediction of Wolfram's
framework:

Wolfram, Stephen. "Undecidability and intractability in theoretical
physics." Physical Review Letters 54.8 (1985): 735.

which, according to I. Pitowsky, is one version of the physical
Church-Turing thesis:

Pitowsky, Itamar. "The physical Church thesis and physical computational
complexity." Iyyun: The Jerusalem Philosophical Quarterly (1990): 81-99.

The other version is due to D. Deutsch:

Deutsch, David. "Quantum theory, the Church–Turing principle and the
universal quantum computer." Proceedings of the Royal Society of London.
A. Mathematical and Physical Sciences 400.1818 (1985): 97-117.

The prediction is that everything in the universe (including all the many
worlds of quantum mechanics) is computational in the sense of computers
based on classical physics. Some people argue that the existence of quantum
algorithms that outperform the classical ones on a given task refutes this
thesis, but this is wrong: quantum mechanics can be simulated using a
classical computer, and people inside the simulation may experience it as
real. The point of the Wolfram model is to explain how an observer inside a
computer simulation perceives the algorithm. This is why it is natural to
consider the question of the mathematical observer: an intelligent entity
embedded in the development of mathematics as a computational process. By
observer, I mean a machine with memory, as defined by H. Everett III:

Everett, Hugh. "The theory of the universal wave function." The many-worlds
interpretation of quantum mechanics. Princeton University Press, 2015.
1-140.
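
To illustrate the sentence "quantum mechanics can be simulated using a
classical computer" (a minimal state-vector sketch of the standard
simulation technique, not code from the Wolfram Physics Project): the
classical cost grows exponentially with the number of qubits, but nothing
non-computable is involved.

    # Sketch: classical (state-vector) simulation of a tiny quantum circuit:
    # Hadamard on the first qubit, then CNOT, producing a Bell state.
    # Memory grows as 2^(number of qubits), yet it is an ordinary classical
    # computation.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
    state = np.kron(H, I) @ state                  # superpose the first qubit
    state = CNOT @ state                           # entangle the two qubits

    print(np.abs(state) ** 2)                      # Born rule: [0.5, 0, 0, 0.5]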

Kind regards,
Jose M.

Some essays that I wrote about the Wolfram model (these are essays, not
scientific articles, and they may have typos):

https://arxiv.org/pdf/2108.08300.pdf

https://arxiv.org/pdf/2108.03751.pdf

https://arxiv.org/pdf/2006.01135.pdf