Very weak metatheories

dennis.hamilton at acm.org dennis.hamilton at acm.org
Sat Jul 30 16:24:27 EDT 2022


I don’t think it is appropriate to apply “consistency [strength]” to the operation of the computer hardware by which a seemingly mathematical structure is manifest.
 
There are two things with respect to the software.  The first concerns soundness and simple correctness of the function being simulated (whether a proof checker or anything else), including the correctness of the representation/simulation.  The second concerns the dependability of the software-hardware combination (and how we might discern when, and in what fashion, it is not dependable).
 
Consideration of the physical processor becomes intensely empirical.  The amazing achievement is the extreme dependability of computer processors and memories, given the extent of their capacities and operations.
 
For example, we enjoy the conceit that there are 0s and 1s in there.  That’s not the case.  However, what we identify as manifest 0s and 1s preserves that interpretation/representation with such great fidelity that we are excused from having to look deeper into the electronic means by which it is accomplished.  A company such as Intel can explain instruction processing and operations in those terms in its documentation for software developers.  In the early days of digital computing we had no such luxury: continued reliable operation of a processor was measured in minutes, not years, exposing the fragility of the electronic (or electro-mechanical) substrate.
 
If you want to know what the producers of all that silicon have to deal with (along with model-checking of the complexities required for high performance), consider the problems of radio-frequency emanations, capacitive effects, induction, interference and coupling, and heat transfer among those extremely small traces and transistors.  Delivery of reliable power and cooling is a non-trivial consideration.  Even though there is a kind of logical purpose, the physics is very far removed from it when you tunnel in deep enough.
 
I daresay we can’t address that level with the sort of presumed precision and certainty that we embrace in our consideration of mathematical entities.  It might be better at checking our models and proofs than we are, though.
 
A simple way of putting it might be to say that what is down there does not look like what its operation evokes for us.  Aside: that happens with algorithms too.  A good algorithm is unlikely to resemble the problem it is being used to solve (see the sketch below).  Sometimes algorithms are rejected for exactly that reason: those who want to rely on an algorithm are unequipped to trust the superior, simpler one.
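 
To make the aside concrete, here is a minimal sketch in Python (my own illustration with invented names, not anything from the original discussion).  The first function mirrors the problem statement for greatest common divisors; Euclid's algorithm looks nothing like that problem, yet is simpler and dramatically faster.

    # The version that "looks like the problem": try every candidate
    # common divisor of a and b and keep the largest one found.
    def gcd_by_definition(a: int, b: int) -> int:
        best = 1
        for d in range(1, min(a, b) + 1):
            if a % d == 0 and b % d == 0:
                best = d
        return best

    # Euclid's algorithm: repeated remainders, with no visible trace
    # of "common divisor" in the code, yet simpler and far faster.
    def gcd_euclid(a: int, b: int) -> int:
        while b:
            a, b = b, a % b
        return a

    # Both agree; only one resembles the problem.
    assert gcd_by_definition(48, 180) == gcd_euclid(48, 180) == 12

Trusting the second requires a small argument about remainders rather than a resemblance to the problem, which is exactly the kind of trust some are unequipped to extend.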
 
-- Dennis
 
PS: “As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality.”  Albert Einstein, 1921.
 
 
From: Pedro Sánchez Terraf
Sent: Friday, July 29, 2022 13:58
Subject: Re: Very weak metatheories
 
[orcmid] [ … ]
But actually, that "proof" is ultimately running on the computer hardware, which is a physical system. The question is: what is the consistency strength of that physical system?
I'll try to make sense of the question, or at least of one interpretation of it (but I'd be interested in any other). The design of the components and circuitry of the computer responds to some mathematical model (laws, equations, etc.) of the underlying physics. Therefore, I should consider my meta-meta-theory to be the one supporting this model. Hence, if the strength needed to support that model is supposedly stronger than PA, there is no point in lowering the intermediate meta level (that is, the logic of the proof-assisting software).
Thanks in advance for any new pointers (if there are any) on the discussion of this topic.
Best regards,
[orcmid] [ … ] 