[FOM] The Lucas-Penrose Thesis

Eray Ozkural examachine at gmail.com
Mon Oct 2 20:09:45 EDT 2006


On 10/2/06, John McCarthy <jmc at steam.stanford.edu> wrote:
> Humans can observe some of our mental state, but
> computer programs can be much more conscious than that.

Agreed. Here are some ways that a computer program can be more
conscious than a human.

1) It can know its own programming precisely.
2) It can keep a full trace of its execution and then
change/debug its programming. Using the trace, it can
perfectly recreate previous mental states.
3) It can rewrite itself from scratch if it so chooses.
4) It can extend its mind, for instance by forming new
perception systems that explore another sensory modality.
5) It can turn subsystems on/off at will and precisely manage
the computational power given to its processes.

That is, it can be self-aware at the level of its programming.
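The first two points are already concrete in today's languages. Here is a minimal Python sketch (a hypothetical illustration; `think` and `tracer` are made-up names, not anything from the discussion): a program reads its own code object and records a trace of its execution via the interpreter's tracing hook.

```python
import sys

# A toy "mental process" for the program to introspect.
def think(x):
    y = x + 1
    return y * 2

# Point 1: the program can know its own programming precisely,
# e.g. by examining its code object.
code = think.__code__
print(code.co_varnames)  # ('x', 'y')

# Point 2: it can keep a full trace of execution.
# sys.settrace installs a callback fired on call/line/return events.
events = []

def tracer(frame, event, arg):
    if frame.f_code is think.__code__:
        events.append((event, frame.f_lineno))
    return tracer  # returning the tracer enables line-level events

sys.settrace(tracer)
result = think(3)
sys.settrace(None)

print(result)                        # 8
print(events[0][0], events[-1][0])   # call return
```

The recorded `events` list is exactly the kind of execution record point 2 describes: enough to replay which "part" ran, in what order, and with what state.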

I think much more can be said on this issue, but my general
impression is that there is no function of consciousness that
cannot be implemented as a computer program. I do not understand
why locality is an issue. It may seem subjectively that our brain
operates as a "whole", i.e. as if in a quantum superposition, but this
may well be an illusion, similar to the way our folk psychology
is often wrong about how the brain/world works. For instance, we may
neglect the delays involved in processing and regard the output
of a mental process as if it were instantly conceived, when in fact
this is one part of the brain making a wrong inference about another.
Since the brain does not keep the above kind of execution trace (i.e.,
a record of what happened in which part of the brain), it can easily
fall prey to simple theories that are wrong.

Best Regards,

-- 
Eray Ozkural, PhD candidate.  Comp. Sci. Dept., Bilkent University, Ankara
http://www.cs.bilkent.edu.tr/~erayo  Malfunct: http://myspace.com/malfunct
ai-philosophy: http://groups.yahoo.com/group/ai-philosophy
