X-Message-Number: 7994
From: 
Date: Wed, 2 Apr 1997 21:53:26 -0500 (EST)
Subject: Once more, with feeling

More brief clarifications & rejoinders re consciousness & the self circuit:

1. Graham Clarke (#7969) says that "...the self circuit may have originated
as the home of the survival instinct, taking input from all senses to produce
a fight/flight response." This is similar to my conjectures about the
evolutionary origin of feeling--that feeling tends to promote survival
because it makes certain kinds of information processing more efficient, in
effect packaging and categorizing that information.

He also mentions "feelings, emotions, and consciousness," allowing the "self"
to emerge. I define "consciousness" as "the integration of feeling and
computing." This means that certain kinds of qualia constitute the
subjective, detailed representation of detailed sensory stimulation. (I see a
complex picture, and know that I see it.) Of course qualia are not only the
result of sensory stimulation, but can then also act as stimuli to other
parts of the brain, producing decisions and actions (or sometimes actions
without decisions).

Incidentally, I would not draw any serious distinction between "awareness"
and "self-awareness." It isn't possible to have feelings without "knowing"
you have them--and therefore, in effect, knowing you exist as a person.
"Self-awareness" as often used is just this same realization raised to the
level of directed attention and cogitation. I have a distinct memory, as a
very young child, of difficulty with the concept of "I"--and yet I was
certainly self-aware.

2. John Clark (#7971) says that genetic drift will soon eradicate any trait
that doesn't have positive value for survival or proliferation. I think one
could find many counter-examples--traits that persisted for long periods with
no obvious material benefit. As a possible example ("possible" because I
haven't checked out all the details and time periods) consider the black hair
and brown eyes of Asians, probably over periods longer than the persistence
of several sub-species. Why should other colors have less survival or
proliferative value?...For that matter, there are traits and organs that have
survived DESPITE actual mild negative value, such as the vermiform
appendix....I believe I could generalize the concept, with rationale, if I
wanted to take the time.

John also quotes me, "Feeling could have survival value by making certain
'computations' or decisions or reactions more efficient." He then says,
"[This] means it would be easier to make a conscious intelligent computer
than an unconscious one." (He is again trying to bolster his dogma that
intelligence implies consciousness.)

Doesn't follow. It isn't necessarily easier to make something that is more
efficient--usually the contrary, since usually the first effort in anything
is followed by a series of improvements. Even in math or physics, the elegant
solution isn't always the first. Years later, Einstein found a much simpler
way to prove E = mc^2.

Later in the same post, John says, "I assume intelligent behavior implies
consciousness. I'll never be able to prove it." As to why he needs this
faith, he says, "I must accept it as an axiom because I could not function if
I thought it was untrue; I very much doubt that you [Donaldson] could
either."

John also says (#7977) that nature found it easier to produce feeling than
intelligence, since feeling preceded intelligence. Well, first of all, WE
have not found it easier to produce feeling than intelligence, since we HAVE
produced some recognizable degree of "intelligence" in computers, with no
recognizable feeling. Also, some primitive creatures display some degree of
"intelligence" but with no proven capacity for feeling.

I suspect this "need" to accept something like the Turing test is very
different in the cases of Donaldson and Clark. In Donaldson's case (or most
cases) the problem only arises in connection with other people; and as many
times pointed out, since other people and other mammals are very much like
ourselves biologically, it makes ordinary common sense to assume they have
feelings--it doesn't have to be an axiom. In Clark's case, however--if I may
be so presumptuous--it seems fairly clear that he has some psychological need
to reject the reasonable possibility that a "robot" (intelligent but without
feeling) might exist. 

He seems to equate such a doubt with a kind of racism or meat chauvinism.
Without getting political, it should not need to be said that there ARE
innate differences (of many kinds) between individuals, families, subspecies,
species, and between carbon and silicon. A silicon being (let alone artifact)
might be very different in many ways, including the point at issue.... More
than a century ago, we are told, German artisans made automatons that looked
like ducks, walked like ducks, and quacked like ducks. 

3. "Hacker" (#7975) says, in effect, that someone at a different location
(from mine) can't be me. And presumably nobody is going to get HIM into a
beam-me-up machine. (Me either, in our present state of knowledge.) But both
he and the infofreaks (sorry--just a convenient abbreviation) are wrong, in
the sense that they are selective in their survival criteria with no clear,
rigorous rationale for what they accept and what they reject. NO MATTER WHAT
POSITION YOU TAKE, I CAN PRODUCE A THOUGHT EXPERIMENT THAT DEMONSTRATES (or
strongly suggests) THE OPPOSITE.

Therefore what? Therefore we obviously don't yet have a handle on it, and it
is useless to pretend we have any solid conclusion as to what constitutes
survival. Why insist you know the answer? Why not defer a conclusion until we
are better informed (and maybe smarter)? The information still lacking,
besides the anatomy/physiology of feeling, includes the nature of time and
probably much else. 

4. Tim Freeman (#7976) thinks the term "seat of feeling" may be unnecessary
or even misleading. 

One of his examples seems faulty. "I can't say which of those pesky
transistors is the seat of computation for the computer." But it is pretty
clear the CPU chip is the "seat of computation." The seat of computation is
more or less localized, and more or less excludes such necessary but
ancillary parts as the chassis.

Another of his examples is better: Which part of an automobile is the seat of
forward-going? In this case, the "seat" is distributed over several parts
including the engine, transmission, wheels, etc.

If you don't like "seat," Tim, how about "basis"? Regardless of language,
feeling must have its basis or seat in some part(s) /aspect(s) of the
anatomy/physiology of the brain. This is PLAINLY true. How USEFUL it may be
to use the term "self circuit" is another story. All I claim is that it tends
to focus attention where attention is needed, and may give hints for
experimentation.

5. Tim Freeman (#7980) discusses a suggestion that consciousness is just "a
property of an ongoing computation that is controlling some device that is
interacting with the rest of the world." A system whose behavior is part of a
plan-making process is an individual ("me" to itself). 

I haven't yet read the provided reference, and Tim's brief discussion may not
do it justice, but as far as I can see at this point, it doesn't say anything
useful and bypasses the crux of the matter, viz., the
anatomical/physiological nature of feeling.

Remember Grey Walter's turtles? They were small, crude, rolling automatons
that sought out electric outlets to "feed" (recharge). They exhibited
goal-directed behavior and some adaptability--which we might call a degree of
intelligence. They also displayed a disjuncture between "self" and
environment--but we have no reason at all to think they had the slightest
consciousness.
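To make the turtles' point concrete, here is a minimal sketch of such a machine--my own hypothetical model, not Walter's actual circuitry; the one-dimensional track, the thresholds, and the battery numbers are all invented for illustration. The machine wanders while charged and heads for the "outlet" when its reserve runs low. It shows goal-directed, mildly adaptive behavior with nothing that could plausibly be called feeling:

```python
# Hypothetical Grey Walter-style "turtle" on a one-dimensional track.
# The outlet sits at a fixed position; the turtle wanders while it has
# plenty of charge, and seeks the outlet when its reserve gets low.

def step(position, battery, outlet):
    """One behavioral step: wander when charged, seek the outlet when low."""
    distance = abs(position - outlet)
    if battery <= distance + 2:        # "hunger": reserve barely covers the trip back
        position += 1 if outlet > position else -1   # head toward the outlet
    else:
        position += 1                  # otherwise wander off exploring
    if position == outlet:
        battery = 20                   # at the outlet: "feed" (recharge fully)
    else:
        battery -= 1                   # every step costs one unit of energy
    return position, battery

def run(steps=100, outlet=0):
    """Run the turtle and record its (position, battery) trajectory."""
    position, battery = 5, 20
    history = []
    for _ in range(steps):
        position, battery = step(position, battery, outlet)
        history.append((position, battery))
    return history

history = run()
```

The machine reliably returns to recharge before its battery runs out, which an observer might describe as goal-directed or even "intelligent"--yet the whole mechanism is two comparisons and a subtraction.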

Robert Ettinger
