[FOM] 576: Game Correction/Simplicity Theory 1
hendrik at topoi.pooq.com
Thu Jan 29 14:21:47 EST 2015
On Tue, Jan 27, 2015 at 10:39:49AM -0500, Harvey Friedman wrote:
> SIMPLICITY THEORY 1
> COMPLEXITY DEFINITION. Let M be a relational structure. We use and,
> or, not, if then, iff, forall, therexists, =, together with the
> relation, constant, and function symbols interpreted by M. The
> complexity of a formula in L(M) is the total number of relation,
> constant, and function symbols appearing in the formula.
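This symbol-counting measure is easy to make concrete. A minimal sketch, with formulas encoded as nested tuples (the encoding and the variable-naming convention are my own assumptions, not from the post):

```python
# Sketch of the quoted complexity measure: count every occurrence of a
# relation, constant, or function symbol; connectives, quantifiers, =,
# and variables are free.
LOGICAL = {"and", "or", "not", "implies", "iff", "forall", "thereexists", "="}

def is_variable(symbol):
    # Assumption: variables are named x1, x2, ... (not part of the post).
    return symbol.startswith("x") and symbol[1:].isdigit()

def complexity(formula):
    """Total occurrences of relation, constant, and function symbols.

    A formula is either a string (variable or constant symbol) or a
    tuple whose head is a connective, quantifier, relation, or function
    symbol and whose tail holds the subterms/subformulas.
    """
    if isinstance(formula, str):
        return 0 if is_variable(formula) or formula in LOGICAL else 1
    head, *rest = formula
    head_cost = 0 if head in LOGICAL else 1
    return head_cost + sum(complexity(part) for part in rest)

# dot(x1, x2) = dot(x2, x1): two occurrences of the function symbol "dot"
commute = ("=", ("dot", "x1", "x2"), ("dot", "x2", "x1"))
print(complexity(commute))  # -> 2

# forall x1: dot(x1, e) = x1: one "dot" plus the constant symbol "e"
identity = ("forall", "x1", ("=", ("dot", "x1", "e"), "x1"))
print(complexity(identity))  # -> 2
```

Note that under this measure the two formulas tie at complexity 2, even though the second uses a quantifier and a constant; only non-logical symbols are charged.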
I was playing with the complexity of definitions of specific integers
some decades ago. I used more or less the same definition of
complexity you did, but with Church lambda expressions instead of
your relational notation. Because you use existential quantifiers,
you'll have the same potential difficulty I did: quantifiers can be
nested deeply, leading to more and more distinct bound variables. It
becomes possible to pack a *large* amount of information into a
variable, so each use of a variable can contribute a large amount of
information (and intuitive complexity) to the definition.
I decided to measure the complexity of variables by the logarithm of
the number of variables there were to choose from. Perhaps requiring
variables to be words made of letters from a fixed alphabet is a more
intuitive (and computationally simpler) way to accomplish the same objective.
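Both variable-cost measures can be sketched in a few lines. The functions below are my simplified reading of the two ideas above (charging log2 of the number of distinct variables per occurrence, versus charging the length of variable names drawn from a fixed alphabet), not code from the original discussion:

```python
import math

def variable_cost(occurrences):
    """Charge each variable occurrence log2(n) 'bits', where n is the
    number of distinct variables there were to choose from (here taken
    to be the distinct variables appearing in the formula)."""
    n = len(set(occurrences))
    return len(occurrences) * math.log2(n) if n > 1 else 0.0

def word_cost(occurrences):
    """Alternative: name variables as words over a fixed alphabet and
    charge each occurrence the length of its name."""
    return sum(len(name) for name in occurrences)

# A single variable used twice is free under the log measure...
print(variable_cost(["x", "x"]))            # -> 0.0
# ...but four occurrences of two distinct variables cost 4 * log2(2) = 4.0.
print(variable_cost(["x", "y", "x", "y"]))  # -> 4.0
# The word measure grows as names get longer: x, y, xx, xy -> 1+1+2+2 = 6.
print(word_cost(["x", "y", "xx", "xy"]))    # -> 6
```

Either way, a definition can no longer smuggle in unbounded information through a large stock of free-to-use variable names.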
> Often there are a few closely related formulations that are each
> interesting. E.g., for groups, there is (G,dot), (G,dot,1), (G,dot,-),
> (G,dot,-,1). We learn that adding structure like this is a good idea,
> partly because it reduces complexity.
> Or we may simply
> want to restrict the kind of definitions being considered such as
> "purely universal definitions". Of course, with that approach, some of
> the notions don't even have purely universal definitions, something
> that is often easy to prove.
For some concepts, adding structure like that can eliminate the need
for existential quantifiers.
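A standard illustration (my example, not spelled out in the thread): in (G,dot,1) the existence of inverses needs an existential quantifier, but adding the inverse function to the signature makes the same condition purely universal:

```latex
% In (G, \cdot, 1): inverses require an existential quantifier.
\forall x \, \exists y \; (x \cdot y = 1 \;\wedge\; y \cdot x = 1)
% In (G, \cdot, {}^{-1}, 1): the same condition is quantifier-free
% under the universal closure, i.e. purely universal.
\forall x \; (x \cdot x^{-1} = 1 \;\wedge\; x^{-1} \cdot x = 1)
```

Since universal sentences are preserved under substructures, this also shows why some signatures admit no purely universal axiomatization at all: a subsemigroup of a group need not be a group, so in (G,dot) alone the group axioms cannot be purely universal.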