monads and Turing machines

José Manuel Rodríguez Caballero josephcmac at gmail.com
Tue Feb 28 02:56:01 EST 2023


 Denis Hamilton asked:

> > The numbers 1/3 and 1/5 are defined as cluster points. Indeed, the closed
> > interval of real numbers between 0 and 1 can be defined as the set of
> > cluster points associated with this computation.
>
> I don't understand how that addresses my concern that the rationals are
> denumerable and for the interval [0, 1] there are binary schemes that
> capture all of them exactly.  It does not bear on ideas about convergence
> at all, a matter that does attend attempts to represent irrational and
> transcendental numbers by rational approximations taken to some finite
> degree of proximity.
>

Yes, the rational numbers between 0 and 1 can be enumerated in many ways; I
don't deny that. I am only pointing out that they can also be defined as
limits of sequences of rationals whose denominators are powers of 2. My
choice of rationals of this particular type is motivated solely by the
simplicity of their representation in memory.
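
For instance (a minimal Haskell sketch of my own, not part of the original
argument), the truncations of the binary expansion of a number x in [0, 1]
form a sequence of rationals with power-of-2 denominators converging to x:

    import Data.Ratio ((%))

    -- After n halvings, x in [0, 1] is bracketed below by k/2^n with
    -- k = floor(x * 2^n), so x is a cluster point of a sequence of
    -- rationals whose denominators are powers of 2.  (A Double is itself
    -- only a finite binary approximation, which is enough here.)
    dyadicApprox :: Double -> Int -> Rational
    dyadicApprox x n = floor (x * 2 ^ n) % (2 ^ n)

    main :: IO ()
    main = mapM_ (print . dyadicApprox (1 / 3)) [1 .. 10]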

 Denis Hamilton asked:

> > Q3:
> > > One can defend the thesis that this computation defines the set of
> > > functions of the closed interval between 0 and 1 to itself, which has
> > > a cardinality strictly greater than the cardinality of the set of real
> > > numbers.
>
> It is not clear to me that this claim is defensible.  First, denumerability
> has not been escaped, so it is unclear how a non-denumerable set of
> functions have been represented.
>

The non-denumerable set of functions from the closed interval of real
numbers between 0 and 1 to itself is not represented in the memory of the
Turing machine; it is defined in a non-computational way. That is, any
non-computable function f corresponds to a non-constructive choice of a
sequence of amoebas for the argument x and another non-constructive choice
of a sequence of amoebas inside those amoebas for the image f(x).
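
As a type-level sketch (my own Haskell illustration of this nested-choice
picture; none of the names below come from the discussion), a point of
[0, 1] can be viewed as an infinite stream of left/right choices of
sub-intervals, and a function on [0, 1] as a map between such streams; only
countably many of these maps are denoted by programs, so a non-computable f
corresponds to choices that no program produces:

    -- A point of [0, 1] as an infinite stream of nested binary choices:
    -- L selects the left half of the current interval, R the right half.
    data Choice = L | R deriving Show

    data Point = Point Choice Point   -- an infinite stream of choices

    -- A function [0, 1] -> [0, 1] corresponds to a map between choice
    -- streams; only countably many such maps are given by programs.
    type Map01 = Point -> Point

    -- A computable example: x |-> 1 - x, which mirrors every choice.
    mirror :: Map01
    mirror (Point L rest) = Point R (mirror rest)
    mirror (Point R rest) = Point L (mirror rest)

    -- 1/3 = 0.010101... in binary, as a choice stream.
    oneThird :: Point
    oneThird = Point L (Point R oneThird)

    takeChoices :: Int -> Point -> [Choice]
    takeChoices 0 _ = []
    takeChoices k (Point c rest) = c : takeChoices (k - 1) rest

    main :: IO ()
    main = print (takeChoices 8 (mirror oneThird))  -- R,L,R,L,... ~ 2/3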

We may say that this game of non-constructive processes inside a
computation is useless, because humans can only deal with constructive
entities; that would be Errett Bishop's argument. Nevertheless, in
statistics we work within a non-constructive framework of many possible
worlds, and our goal is to find in which of all the possible worlds the
dataset is most likely to have been generated. That is, statistics is not
about how the universe works; that is the job of fundamental physics.
Statistics is about the observer's position in the multiverse of possible
realities, which need not be physical realities, as in the case of economic
and social realities. Using Hugh Everett's metaphor, the goal of a
statistician, Alice, is to decide which amoeba she most likely lives in.
Assuming an eternal future, the set of Alice's possible histories is
uncountable, even if at any given time there is only a finite set of
amoebas.

Ronald Fisher's insistence on randomization in experimental design can be
read as saying that constructive selections are the enemy of statistical
inference [2]. Non-constructive mathematics seems to be the natural
foundation of statistics. I am not sure that formal statistics and its
intuition can be developed coherently in an entirely constructive
framework that avoids uncountable sets and non-computational processes.
Because of its social importance, any practical foundation of mathematics
should allow the development of statistics. Furthermore, the foundations
of statistics cannot be reduced to the foundations of probability theory:
some logic of inference and an ontology involving possible worlds are also
required.
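
As a toy contrast (my own sketch, not Fisher's actual procedure and not
taken from [2]), a "constructive" assignment rule such as alternating
treatment and control can line up with hidden structure in the order of the
experimental units, whereas a randomized assignment cannot be predicted
from that order:

    -- Two ways of assigning 12 experimental units to treatment (True) or
    -- control (False): a deterministic, "constructive" rule versus a
    -- randomized one driven by a small pseudo-random generator.
    units :: [Int]
    units = [1 .. 12]

    -- constructive rule: alternate arms in order of arrival
    alternating :: [Bool]
    alternating = cycle [True, False]

    -- a minimal linear congruential generator, for self-containment
    lcg :: Integer -> [Integer]
    lcg = tail . iterate (\s -> (1103515245 * s + 12345) `mod` 2147483648)

    -- randomized rule: one (effectively unpredictable) coin flip per unit
    randomized :: Integer -> [Bool]
    randomized seed = map (> 1073741824) (lcg seed)

    main :: IO ()
    main = do
      print (zip units alternating)
      print (zip units (randomized 2023))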

 Denis Hamilton asked:

> I am reminded that not even the set of all functions over N onto N is
> countable, and, if one accepts the set-theoretic notion of that set, the
> consequence is that there are non-computable functions.  So we are already
> constrained under a ceiling of denumerability (as to computation) and we
> are not going to get any higher cardinality of computable functions by
> entertaining approximations of functions over R onto R.
>
> So to suddenly consider functions, f, on the closed interval [0,1] of real
> numbers and the cardinality of those seems to be unapproachable, if we're
> talking about Turing Amoebas.
>
> I don't see how to make any leap to the contrary.
>

As explained in the previous answer, the function f is not represented in
the Turing machine's memory. It is expressed as a non-constructive sequence
of choices made inside the computation over an infinite time.

Zhang Yinsheng wrote:

> That is, randomness is non-Turing computable.


Yes, but randomness can exist inside computational processes. Just write
the numbers 1, 2, 3, ... in binary code. This can be implemented on a
Turing machine, yet most of the resulting bit strings will be random
sequences of bits (apart from their leading bit, which is always 1). The
question that I am addressing is how an observer, inside a computation,
can perceive randomness.
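
As a minimal illustration (a Haskell sketch of my own; the names are not
from any of the cited works), the enumeration itself is a trivial program,
even though, by a counting argument, most of the strings it produces are
incompressible:

    -- Enumerate 1, 2, 3, ... as binary strings (most significant bit
    -- first).  The generating process is trivially computable, yet most
    -- n-bit strings admit no description much shorter than n bits.
    toBinary :: Integer -> [Int]
    toBinary 0 = []
    toBinary n = toBinary (n `div` 2) ++ [fromIntegral (n `mod` 2)]

    main :: IO ()
    main = mapM_ (print . toBinary) [1 .. 16]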

The idea of an observer inside a computation is borrowed from Hugh Everett
[3]; the idea of perception inside a computation was studied by S. Wolfram
in Chapter 10 of A New Kind of Science [1]. It plays an important role in
Wolfram's framework for studying the physical Church-Turing thesis, i.e.,
the computational ontology of fundamental physics, as opposed to Leibniz's
non-computational ontology, in which:

(i) nature does not jump, and

(ii) a set can be contained in a proper part of itself (as in the case of
Leibniz's monads).

Well, we can reconcile Leibniz and the physical Church-Turing thesis if we
assume that the universe described by Leibniz is implemented in a Turing
machine using lazy evaluation [4]. That is how I imagine the Turing
machines being evaluated inside the amoebas. If we insist that the
evaluations are not lazy, then such a reconciliation is not possible.
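
As a minimal sketch of what I have in mind (in Haskell, chosen only because
it is lazy by default; this is my own illustration, not the exact-real
framework of [4]), an infinite, self-referential structure can be defined
as a whole while only the finite part that an observer demands is ever
computed:

    -- Under lazy (call-by-need) evaluation a structure can be contained in
    -- a proper part of itself, echoing (ii) above: the tail of `ones` is
    -- `ones` again.
    ones :: [Integer]
    ones = 1 : ones

    -- An infinite history defined by a rule; nothing is evaluated until an
    -- observer asks for it.
    naturals :: [Integer]
    naturals = 0 : map (+ 1) naturals

    main :: IO ()
    main = do
      print (take 5 ones)       -- [1,1,1,1,1]
      print (take 10 naturals)  -- [0,1,...,9]; only a finite prefix is forced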

Kind regards,
Jose M.

[1] Wolfram, Stephen. A New Kind of Science. Champaign: Wolfram Media, 2002.
Chapter 10:
https://www.wolframscience.com/nks/chap-10--processes-of-perception-and-analysis/

[2] Hall, N. S. "R. A. Fisher and His Advocacy of Randomization." Journal of
the History of Biology 40.2 (2007): 295-325. doi:10.1007/s10739-006-9119-z.

[3] Everett, Hugh. The Everett Interpretation of Quantum Mechanics:
Collected Works 1955-1980 with Commentary. Princeton University Press,
2012.

[4] Edalat, Abbas, Peter John Potts, and Philipp Sünderhauf. "Lazy
computation with exact real numbers." Proceedings of the third ACM SIGPLAN
international conference on Functional programming. 1998.