X-Message-Number: 14757
Date: Sat, 21 Oct 2000 17:57:36 -0700
From: Mike Perry <>
Subject: Feeling and Sentience

Bob Ettinger, #14745:

>The physical basis of feeling has to be something which is not just a symbol 
>or a representation of something else; it has to be the thing-in-itself or 
>ding-an-sich. Experiment will decide whether my idea is correct, if we find 
>the standing waves (or something similar) and correlate them with reported 
>feelings.

I'll grant that the physical basis of feeling has to be something physical.
But exactly what it is, and what other properties it may or must have, are
still unknowns, though perhaps not as unknown as some would claim. The
following thought experiment occurs to me. Suppose some particular
characteristic, such as a standing wave produced by a brain structure that
also outputs a voltage, is found always to correlate with feeling (or what
appears to be feeling, as far as anyone can tell) in animals and people.
Then you design an artificial system. It has a
standing wave simulator, which does not produce a true standing wave but
just outputs the same voltages. Your system appears to be sentient too, and
that's what you expect, given the details of its construction. Say you then
implant the standing wave simulator in a mouse, replacing the natural part
of its brain, and the mouse behaves normally. Say a stroke victim whose real
standing wave generator has degenerated and who is profoundly vegetative is
fitted with a simulator, and then becomes alert and says she feels fine and
seems normal on every test of behavior, motivation, and intelligence. Do we
say in these cases that sentience is there, without the real standing wave,
or that no matter how convincing, it is just a clever imitation, with no
real consciousness at all? (Or is there some intermediate between these
opposites?) I would cast my vote unhesitatingly for true sentience, but not
because I could "prove" it. Such proof may be impossible in principle.
Perhaps you can never know, in appropriate cases, if you are confronting the
genuine article or an imitation. But I wouldn't worry over it. My definition
of sentience, what I would feel comfortable with, would be adjusted, if and
as necessary, to accommodate the different mechanism.

It is worth a reminder that this is not a trivial issue for cryonics,
because it touches on what would be acceptable repair procedures. Must we
restore all the wetware of a cryonics patient, when that wetware is prone to
damage (strokes, for instance), or would it be acceptable to replace some or
all of the brain with a more durable mechanism that, in some reasonable
sense, functioned equivalently?
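The functional-substitution idea above can be caricatured in code: a toy
"generator" that computes a genuine standing-wave superposition, and a
"simulator" that merely plays back the recorded voltages, so that any
downstream observer sees identical outputs from either source. All names
here are hypothetical illustrations for the thought experiment, not a model
of any actual brain mechanism.

```python
import math

def standing_wave_voltage(t, x=0.3):
    """'Real' mechanism: superposition of two counter-propagating waves,
    sampled at a fixed point x. The superposition sin(2*pi*(x - t)) +
    sin(2*pi*(x + t)) = 2*sin(2*pi*x)*cos(2*pi*t) is what makes this a
    standing wave: a spatial profile oscillating in place."""
    return math.sin(2 * math.pi * (x - t)) + math.sin(2 * math.pi * (x + t))

class Simulator:
    """'Imitation': plays back a recorded voltage trace; no wave, standing
    or otherwise, exists anywhere inside it."""
    def __init__(self, samples):
        self.samples = samples  # precomputed voltages, one per time step
    def voltage(self, step):
        return self.samples[step]

# Record the generator's output, then hand the trace to the simulator.
steps = [t / 100 for t in range(200)]
sim = Simulator([standing_wave_voltage(t) for t in steps])

# At every sampled instant the two are behaviorally indistinguishable.
assert all(abs(standing_wave_voltage(t) - sim.voltage(i)) < 1e-12
           for i, t in enumerate(steps))
```

The point of the sketch is only that nothing downstream of the voltage
output can distinguish the two mechanisms, which is exactly the situation
the mouse and stroke-victim cases pose.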

Bob also says,

>Minsky, Dennett, and a few others claim to have made inroads on the problem 
>of sentience, but I (and most others) do not agree. In effect, they merely 
>restate the "emergence" notion of sentience--that when an information 
>processing system becomes complicated enough, somehow sentience is just 
>there. That doesn't cut the mustard.

I would amend that to "when an information processing system becomes
complicated enough *in the right way*, sentience is there." That is my view.
Individual atoms, most people agree, are not sentient, nor are simple
combinations of them like water molecules. Yet a functioning human brain,
made of roughly 10^26 atoms, is sentient. Somehow sentience emerges. We
don't yet understand just how, but we are learning and the remaining
mysteries could be solved fairly soon. Meanwhile there seems to be an issue
of whether we should regard sentience as something that is easy to achieve
in a limited way, or something that does not happen at all until a high
level of advancement is reached. In other words, are insects and some
present-day robots at least weakly sentient, meaning that we basically
understand the rudiments of sentience already, or is sentience an elusive
attribute possessed only by more advanced life forms such as dogs, cats, and
people? I can see this issue leading to the same sort of proof problem I
refer to above. How will you ever know? My gut feeling is to be lenient and
regard true sentience as an easy attribute to possess, at least in limited
quantity. So I would grant it to insects and some artificial devices, and in
larger proportion to more advanced creatures. And I think sentience can be
understood in terms of basic things like goal-seeking behavior, which we
already understand to some extent, rather than remaining wholly mysterious
and uncomprehended.
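Goal-seeking behavior of the kind gestured at here is, at its simplest,
just negative feedback: repeatedly reduce the error between a current state
and a target. A minimal sketch, with all names and parameters purely
illustrative:

```python
def seek_goal(position, target, gain=0.5, steps=40):
    """Minimal goal-seeking loop: apply a corrective action proportional
    to the error at each step, so the state converges on the target."""
    for _ in range(steps):
        error = target - position
        position += gain * error  # negative feedback, nothing mysterious
    return position

# The 'agent' converges on its goal from an arbitrary starting state.
final = seek_goal(position=0.0, target=10.0)
assert abs(final - 10.0) < 1e-3
```

Whether such a loop counts as even weakly sentient is of course the very
question at issue; the sketch only shows that the rudiments are well
understood mechanically.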

The lenient view might be criticized on the ground that it is nothing but
assertion without backing. And indeed there are many unknowns at present
relating to sentience, so that no definitive argument will support my case.
But some reasonable evidence that sentience is "easy" not "hard" can be seen
by just considering animal life. If the "lower" creatures like insects,
though equipped with functioning brains, still do not have sentience, where
does it begin? With amphibians? birds? apes? humans only? I see no reason to
favor any of these thresholds--the functioning brain seems sentient even in
its weaker versions. If we can accept that, again it means that sentience is
not such a mysterious thing, and we do have some handle on it already.

Mike Perry 
