X-Message-Number: 2775
Date: Mon, 23 May 1994 11:52:25 -0400
From: "James B. Wetterau Jr." <>
Subject: CRYONICS re: #2770

Oh dear!

I seem to have plunged myself into a debate over the nature of values.
I don't wish to take up valuable bandwidth in a personal debate;  I'll
answer again and then if anyone wishes me to drop the matter, just
email me and I'll do so.

Mr. Ettinger wrote:
>
>Mr. Wetterau says the goal of human existence must be "sui generis in each
>individual"--unique. I think this is by no means obvious, and probably not
>true except perhaps in trivial ways.
>

What I meant by sui generis is "self-created."  Since we live (I hope)
as individuals, using our own minds to guide us, each of us must
decide the goal of our own (human) existence.  Even if we decide that
the goal is unthinking obedience to a higher power (a decision I hope
that none of us makes), *we each* decide.  We are responsible for our
own goals and actions in accordance with them.  That was my point.


Mr. Ettinger wrote:
>As I think I noted previously, Mr. Wetterau simplifies my "feel-good" in ways
>I didn't intend. When I say the only goal conceivable (to me) is feel-good,
>that does not imply an idiot's orgy. 

I didn't mean to imply that maximizing "feel-good" would be an idiot's
orgy, but I did raise the following example:

Suppose in the future human beings can create their own universes, and
can use nanotechnology to create life-sustaining cocoons.  (True
immortality in a blissful universe.)  Would any of us accept floating
in such a cocoon?  Suppose further that the chance of interruption is
0.  You've gone into your own universe and closed the door after you.
You have guaranteed maximum possible feel-good.  Why wouldn't you do
it?

(I grant that this situation seems improbable, but this is a
gedankenexperiment so I'm allowed to raise an extreme case.)

My answer is that I would not do this because I *value* certain things:
the company of others, continued learning, diversity of experience.
These are some of my values.  Some may conflict; some may not exist
in easily translatable units; some are subject to change.  I believe
Mr. Ettinger is well aware of this type of problem in ordering values
and accounted for it in his "Natural History."  We have no
disagreement here at all.

I *do* disagree with Mr. Ettinger when I assert that these values
exist somewhat apart from the desire to feel good.  Mr. Ettinger may
say that I make a probability calculation of just how good I will feel
when I prefer an active, engaged existence to bliss in a cocoon.  I
think that these values must be determined at a _higher level_ (higher
in the sense that our analysis must go further) than my feelings.  

I hope that these values do not conflict with my desire to feel good.
I am no Kantian; I reject the idea that the morally good is duty, and
that the purest duty is that which conflicts with my desires.  I
categorically oppose any philosophy that compels me to renounce
feeling good and doing what feels good.

However, I feel that how we choose what sort of good feeling to pursue
is determined by our values, and not merely by a calculation of what
will make us feel the best over time.

The values are a super-structure which relates to our talents and
interests and possibly our genetics.  

Again Mr. Ettinger may say that if we are interested in a thing that
is due to the good feelings we experience when we ponder it.  If we
have a talent and use it, that is due to the good feeling we get when
we exercise our abilities.  

True enough; but why are we interested in one thing and not another?
Do these things really make us feel that much better, or are we, at
some point, guided by extra-logical facts about our own minds?

My point was that the "Natural History" contains an analytical flaw by
reducing values to what makes us feel good.  The mistake of
"nothing-buttery" is the mistake of looking at a thing at the wrong level.  If
I say that human beings are "nothing but" a pile of atoms, I fail to
take into account the structure of those atoms and their bonds,
arranged into water molecules, and DNA strands, and proteins and
lipids and so on.  I would fail to explain human biology and anatomy
if I said "human beings are made up of carbon and hydrogen and some
other elements.  These elements, in certain combinations, give rise to
human behaviour."  Well yes, but how?

Similarly, if someone tells me that values are based on our actions
designed to make us feel good on some level, I object that this in no
way explains _how_ we arrived at the particular set of values.

I'll state my belief, as long as I've opened the question:  I believe
humans value things for a variety of reasons but particularly those
experiences that challenge their cogitative powers.  I believe that
this situation arises because humans have a unique evolutionary
advantage in thought and have discovered that exercising thought aids
their survival and proves to be an inexhaustible source of desire and
then satisfaction.  The number of questions we have to analyze can
only increase as our inquiries go further.

To return to the analogy of the self-circuit:  I see our values as one
of the pieces of data about the circuit.  I see good-feeling as
similar to the phenomenon of current flowing successfully through the
circuit.  To say that the goal of the circuit is to pass the current
through is an inadequate description; it is nothing-buttery.  The
circuit is also its structure, and its function.  To explain what it
is intended to do, and how it does it, one must analyze its structure
further.

Similarly, human beings determine their own minds (as in "making up my
mind") and are determined by nature both at birth and over time.
Humans choose their own purpose.  Even if they do this because it
makes them feel good, their choices and their values are not _directly
explicable_ based on that.  Similarly, a computer does what it does
because it has current flowing through it.  But to say that a
computer's purpose is to pass current through, and to explain how it
works by stating that it passes current through, is to fail to analyze
its circuitry and its programming.  

We do what we do because it makes us feel good, or maybe feel the
best, but we decide what course to take, and what will make us feel
good, based on some computation of future good feeling AND our values.
How these arise is a proper subject for scientific inquiry.  I believe
it has not been adequately answered yet.

Finally, I'd like to answer a couple of smaller points.  Mr. Ettinger
wrote:

>I don't think it is a play on words to insist that the only POSSIBLE goal is
>feel-good. When Mr. Wetterau says the goal may be something else, such as
>self-actualization, that is only re-defining feel-good, or expressing an
>opinion as to which kind of feel-good is most important.  But we don't want
>just opinions or guesses or introspections; we want evidence, and finally
>proof.

My question in reply is: why is my opinion less authoritative than
yours?  Neither of us has done the experiments.  I speak from my own
sense of myself, and of some mistakes in logic I thought I had
observed.  I do not believe either of us has the right yet to claim
scientific certainty.

Furthermore, self-actualization, a term I have never heretofore used,
is *not*, in my opinion, just another way of describing feeling good.
It exists at a higher level of analysis.  Self-actualization is
feeling good _within the structure of values and experiences and
knowledge and behaviours and beliefs_ that makes the self the self.
This is quite a different, and a more sophisticated, thing.


>It is often said that "everyone is entitled to his own opinion"--but this
>isn't true, except politically.  No one is intellectually or morally entitled
>to a dangerously frivolous opinion, let alone a demented one.  We are on the
>road to PROVING which values are valid, and this trip is necessary.

I don't believe opinions are dangerous, nor do I believe mine is a
dangerously frivolous opinion.  If my opinion is that frivolous, or
demented, it will easily be defeated here.

All the best,
James Wetterau
