FOM: Precision and Turing-computability

Vasco Brattka Vasco.Brattka at FernUni-Hagen.de
Fri May 24 09:56:40 EDT 2002


Dear All,

To underline Martin Davis' point of view, I would like to mention
that by Kolmogorov's Superposition Theorem, *any continuous* real 
function (on a compact domain) can be represented by a feedforward 
neural network (if arbitrary weights are allowed).
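For reference, the classical statement (a standard formulation, not quoted from the original post): every continuous f : [0,1]^n -> R admits a representation

```latex
f(x_1,\dots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
```

where the inner functions \phi_{q,p} are continuous, one-variable, and independent of f, and only the continuous outer functions \Phi_q depend on f. Read as a network: the inner sums form a hidden layer, the outer sum the output layer.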

Effective versions of this theorem show that exactly the class of
computable real number functions can be represented by feedforward
neural networks with computable weights and computable activation
functions.
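To make the direction "computable weights + computable activation => computable function" concrete, here is a minimal sketch (my own illustration, not the effective construction behind the theorem): with rational weights and the ReLU activation, a one-hidden-layer net can be evaluated in exact rational arithmetic, so the represented real function is trivially computable.

```python
from fractions import Fraction as F

def relu(x):
    # ReLU maps rationals to rationals, so the whole
    # evaluation stays within exact rational arithmetic.
    return max(F(0), x)

def feedforward(x, hidden, output):
    """Evaluate a one-hidden-layer net at a rational point x.

    hidden: list of (weight, bias) pairs for the hidden units,
    output: list of output weights, one per hidden unit.
    All parameters are rational, so the result is an exactly
    computed rational value.
    """
    h = [relu(w * x + b) for (w, b) in hidden]
    return sum(c * v for c, v in zip(output, h))

# A tiny net computing |x| = relu(x) + relu(-x) exactly.
net_hidden = [(F(1), F(0)), (F(-1), F(0))]
net_output = [F(1), F(1)]
print(feedforward(F(-3, 2), net_hidden, net_output))  # prints 3/2
```

The same evaluation scheme extends to computable real weights and computable activations by working with rational approximations to any desired precision.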

Using non-computable parameters, non-computable basic functions,
or non-effective closure schemes can, of course, lead to non-computable
functions. In Cris Moore's papers you can find several remarks emphasising 
that such modifications increase the degree of "unphysicalness".
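One way to see how a single non-computable parameter smuggles in non-computability (a sketch of my own, with a computable stand-in set, since a genuinely non-computable set cannot be written down): pack the membership function of any set of naturals into the binary digits of one weight; a net that can extract digits of its weight then "decides" that set.

```python
from fractions import Fraction as F

def weight_for(language, bits):
    # Encode membership of 0..bits-1 in a set of naturals
    # into the binary digits of a single rational weight.
    return sum(F(1, 2 ** (n + 1)) for n in range(bits) if n in language)

def digit(w, n):
    # Recover the n-th binary digit of w -- the kind of
    # extraction a net with weight w can perform.
    return int(w * 2 ** (n + 1)) % 2

# Any set at all can be packed this way; a non-computable set
# (e.g. the halting set) would make the resulting "net" decide
# a non-computable language.  Here, a computable stand-in:
L = {0, 2, 3}
w = weight_for(L, bits=8)
print([digit(w, n) for n in range(5)])  # prints [1, 0, 1, 1, 0]
```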

The computable Kolmogorov Theorem and other related topics have
been studied in Computable Analysis, which is the Turing machine
based theory of computable real number functions, see

http://www.informatik.fernuni-hagen.de/cca/

And so far, all physical phenomena that have been considered from this
point of view have turned out to be computable in a reasonable sense.

Vasco

_______________________________________________________________________

Vasco Brattka
FernUniversitaet           Phone: +49-2331-987-2723
Theoretische Informatik I    Fax: +49-2331-987-319 
Informatikzentrum            WWW: www.informatik.fernuni-hagen.de/thi1/ 
D-58084 HAGEN                     vasco.brattka 
Germany                   E-Mail: Vasco.Brattka at FernUni-Hagen.de  
_______________________________________________________________________




> At 05:51 AM 5/23/2002, José Félix Costa wrote:
> >Dynamical systems with real-valued parameters can compute non-computable
> >functions. In particular neural nets with real weights.
> >
> >Discrete time:
> >Low dimension dynamical systems -- see Cris Moore, Pascal Koiran, ...
> >High dimension dynamical systems -- see Hava Siegelmann and Eduardo Sontag,
> >Max Garzon, ...
> 
> I'm familiar with Hava Siegelmann's work. It is quite misleading. Her 
> neural nets provide, in effect, computable functions of the real weights. 
> Since a real number can be thought of as a coded set of natural numbers, it 
> is hardly surprising that, when she permits arbitrary real numbers, her nets 
> generate ALL languages on a two-letter alphabet. Non-computable inputs 
> produce non-computable outputs.
> 
> She proves that with rational weights she gets precisely the computable 
> languages. Although she doesn't pause to observe this, the same will be true 
> if the weights are limited to COMPUTABLE REALS.
> 
> The accompanying talk about a physics with infinite precision real numbers 
> has no relation to the physics of serious working physicists.
> 
> Martin
> 
> 
> 
> 
>                            Martin Davis
>                     Visiting Scholar UC Berkeley
>                       Professor Emeritus, NYU
>                           martin at eipye.com
>                           (Add 1 and get 0)
>                         http://www.eipye.com
> 
> 
> 
> 



