FOM: Precision and Turing-computability
Martin Davis
martin at eipye.com
Mon May 27 13:27:03 EDT 2002
At 02:49 AM 5/24/2002, José Félix Costa wrote regarding the work of Hava
Siegelmann:
The inputs are not non-computable. They are finite and satisfy classical
constraints. ARNNs are seen as high-dimensional dynamical systems. What is
addressed is not the possibility of recognizing all languages but the fact
that P/poly coincides with ARNNs computing in polynomial time.
I am not saying that such a computer is physically realizable.
In addition to an article in SCIENCE, I have read Siegelmann's monograph:
"Neural Networks and Analog Computation" subtitled "Beyond the Turing
Limit". It is the claim, emphasized in the subtitle, that Siegelmann's
model of computation somehow manages to compute things that are not
Turing-computable, that concerns me. (As with any model of computation,
non-trivial questions arise as soon as finite bounds are imposed and/or
complexity issues are introduced; but my message was not concerned with
that.) In the monograph, she demonstrates that networks permitted to use
arbitrary real weights can compute any language on two letters, hence, in
particular, non-Turing-computable languages. Although she doesn't call the weights on
the neurons "inputs", that is what in effect they are. So as I said, by
allowing arbitrary inputs, she gets arbitrary outputs, not an astounding
feat, and one that if claimed to take us "beyond the Turing limit" is
decidedly misleading.
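The point can be made concrete with a sketch (my own illustration, not
Siegelmann's construction): encode any language L over the positive integers
in the binary expansion of a single real "weight", and then "deciding"
membership in L is just reading off a digit. All the non-computable
information arrives with the weight itself, exactly as with an arbitrary
input. The function names below are hypothetical.

```python
from fractions import Fraction

def weight_for(language, precision_bits):
    # Build a finite-precision approximation of a weight w in [0, 1)
    # whose n-th binary digit is 1 iff n is in the language.
    w = Fraction(0)
    for n in range(1, precision_bits + 1):
        if n in language:
            w += Fraction(1, 2 ** n)
    return w

def member(w, n):
    # "Decide" membership by reading the n-th binary digit of w.
    # No computation beyond digit extraction is involved: the language
    # was smuggled in as the weight.
    return int(w * 2 ** n) % 2 == 1

L = {1, 3, 4}  # stand-in for any language, computable or not
w = weight_for(L, 8)
assert all(member(w, n) == (n in L) for n in range(1, 9))
```

If L is non-Turing-computable, so is the exact weight w; the "network" does
no more than pass that information through.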
Martin
Martin Davis
Visiting Scholar UC Berkeley
Professor Emeritus, NYU
martin at eipye.com
(Add 1 and get 0)
http://www.eipye.com