[FOM] Unreasonable effectiveness
Antonino Drago
drago at unina.it
Sat Nov 2 17:20:35 EDT 2013
It seems to me that Chow's premise (i.e. the existence, throughout the
history of science, of a persistent matching of mathematics with natural
science) is ill-stated.
Wigner supported the idea of the unreasonable effectiveness of mathematics
in theoretical physics by referring to the mathematics of differential
equations.
First. No mathematical theory whatsoever is suitable for every natural
science. Over the centuries physics developed differential equations, but
(pace Rashevsky) in the last century theoretical biology started out with
entirely different kinds of mathematics.
Second. Wigner's observation was anticipated by M. Born, who was surprised
that thermodynamics resisted formulation by means of differential
equations. He asked his friend C. Carathéodory to provide a suitable
system. The latter suggested a new formulation of thermodynamics, but 1) it
starts from abstract, non-operative axioms, as Carathéodory himself
recognised; 2) its differential equation is rather the resolution of the
problem of the exact differential (hence an equation of first order in
time, like Fourier's equation of heat transport, and very different from
the reversible equations of mechanics).
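In symbols (a standard modern rendering, not Carathéodory's own
notation): his axiom of adiabatic inaccessibility implies that the heat
form \delta Q admits an integrating factor 1/T, so that
   dS = \delta Q / T
is an exact differential; whereas Fourier's equation of heat transport,
   \partial u / \partial t = \kappa \nabla^2 u,
is of first order in time and hence not invariant under t -> -t.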
Third. Born posed the problem just before theoretical physics abruptly
changed its mathematics from the continuous to the discrete (see the
introduction of Einstein's 1905 paper on quanta), the latter having been
the mathematics of chemistry for two centuries.
Fourth. The mathematics of physics changed abruptly again in the late
1950s, after the discovery of parity non-conservation by Yang and Lee;
rather than differential equations, symmetries became the basic
mathematical technique (e.g. for elementary particles).
Fifth. The effectiveness of the mathematics of symmetries is likewise not
universal. There is no formulation of quantum mechanics based entirely on
symmetries (although Weyl attempted one from the very beginnings of the
theory). Reflecting on all past physical theories, A.O. Barut wrote a fine
essay on the complementarity of these two mathematical techniques in
theoretical physics.
In conclusion, in order to develop a scientific theory of nature, one has
to try to apply, or even to invent (e.g. infinitesimal analysis for
Newtonian mechanics), one among several kinds of mathematical theory; no
one of them persists except over certain periods of time (thereby
contributing to certain paradigms). Hence there is no constant historical
phenomenon to be parametrized.
Best greetings
Antonino Drago
----- Original Message -----
From: "Timothy Y. Chow" <tchow at alum.mit.edu>
To: <fom at cs.nyu.edu>
Sent: Saturday, November 02, 2013 4:40 PM
Subject: [FOM] Unreasonable effectiveness
> In 1960, Wigner argued for the unreasonable effectiveness of mathematics
> in the natural sciences, and his thesis has been enthusiastically accepted
> by many others.
>
> Occasionally, someone will express a contrarian view. The two main
> contrarian arguments I am aware of are:
>
> 1. The effectiveness of mathematics is about what one would expect at
> random, but humans have a notorious tendency to pick patterns out of
> random data and insist on an "explanation" for them when no such
> explanation exists.
>
> 2. The effectiveness of mathematics is higher than one would expect from a
> completely random process, but there is a form of natural selection going
> on. Ideas are generated randomly, and ineffective ideas are silently
> weeded out, leaving only the most effective ideas as survivors. The
> combination of random generation and natural selection suffices to explain
> the observed effectiveness of mathematics.
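>
> (For what it is worth, the selection effect in argument 2 is easy to
> illustrate by simulation. A toy sketch in Python; the normal score
> distribution and the top-1% cutoff are arbitrary illustrative
> assumptions:
>
>   import random
>
>   random.seed(0)
>   # "random generation": effectiveness scores of 100,000 candidate ideas
>   ideas = [random.gauss(0.0, 1.0) for _ in range(100_000)]
>   # "natural selection": only the top 1% survive into the record
>   survivors = sorted(ideas)[-1_000:]
>   print(sum(ideas) / len(ideas))          # ~0.0, mean of all ideas
>   print(sum(survivors) / len(survivors))  # ~2.7, mean of the survivors
>
> The observed corpus contains only the survivors, whose apparent
> effectiveness far exceeds that of the generating process.)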
>
> Unfortunately, the application of mathematics to the natural sciences is
> such a complex and poorly understood process that I see no way of modeling
> it in a way that would allow us to investigate the above controversy in a
> quantitative manner. I am wondering, however, if recent progress in
> computerized formal proofs might enable one to investigate the analogous
> question of the (alleged) "unreasonable effectiveness of mathematics in
> mathematics."
>
> I am not sure exactly how this might go, but here is a vague outline.
> Theorems are built on lemmas. We want to construct some kind of model of
> the probability that Lemma X will be "useful" for proving Theorem Y. This
> model would be time-dependent; that is, at any given time t, we would have
> a probabilistic model, trained on the corpus of mathematics known up to
> time t, that could be used to predict future uses of lemmas in theorems.
> This model would represent "reasonable effectiveness." Then the thesis of
> "unreasonable effectiveness" would be that this model really does evolve
> noticeably over time---that the model at time t systematically
> underestimates uses of Lemma X in Theorem Y at times t' > t.
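>
> (A minimal sketch of how such a time-dependent model might be set up,
> in Python, assuming a corpus of dated (lemma, theorem) citation
> records; the Laplace-smoothed frequency model is an illustrative
> stand-in, not a concrete proposal:
>
>   from collections import Counter
>   import math
>
>   def train(corpus, t):
>       """Frequency model of (lemma, theorem) uses, trained only on
>       records dated <= t."""
>       past = [(lem, thm) for (lem, thm, date) in corpus if date <= t]
>       counts, total = Counter(past), len(past)
>       vocab = len(counts) + 1  # crude smoothing reserve for unseen pairs
>       def prob(lem, thm):
>           # Not normalized over an open vocabulary; adequate for
>           # comparing surprise scores across training times t.
>           return (counts[(lem, thm)] + 1) / (total + vocab)
>       return prob
>
>   def mean_surprise(corpus, t, horizon):
>       """Average -log P that the time-t model assigns to the lemma
>       uses actually recorded in (t, t + horizon]."""
>       prob = train(corpus, t)
>       future = [(l, m) for (l, m, d) in corpus if t < d <= t + horizon]
>       return (sum(-math.log(prob(l, m)) for l, m in future)
>               / len(future)) if future else 0.0
>
> "Unreasonable effectiveness" in the above sense would then show up as
> mean_surprise staying systematically high as the horizon grows, i.e.
> the time-t model persistently underestimating later uses.)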
>
> I am wondering if anyone else has thought along these lines. Also I am
> wondering if there is any plausible way of using the growing body of
> computerized proofs to make the above outline more precise. There is of
> course the problem that the "ontogeny" of computerized proofs does not
> exactly recapitulate the "phylogeny" of how the theorems were arrived at
> historically, but nevertheless maybe something can still be done.
>
> Tim
> _______________________________________________
> FOM mailing list
> FOM at cs.nyu.edu
> http://www.cs.nyu.edu/mailman/listinfo/fom
>