X-Message-Number: 24864
Subject: Re: Reply to Michael C. Price about Immortalism
Date: Tue, 19 Oct 2004 04:07:06 US/Eastern

   Michael C. Price wrote:

> > Ben Best wrote:
> > You are confusing survival
> > strategies of species with survival strategies of individuals. 
> No, I was not.  Let me clarify: 
> The discounting is hard-wired in by evolution, but so are lots of
> other *undesirable* things, ergo the fact that we are hard-wired 
> to discount the future doesn't mean that we *should* discount the
> future.  Just as we are hard-wired to invest in reproduction rather
> than extend our lives greatly beyond the fertile period, yet as both
> life-extensionists and immortalists we have chosen to value
> extended lifespan / neural-information survival over genetic 
> survival through our descendants.

     Discounting of future values is part of the way people rationally
allocate resources. You can call it hard-wired as a dismissive 
way of calling it irrational, but I believe the opposite -- that it is
rational. Values are subjective, and you can reject (or imagine
that you reject) the discounting of future values -- just as a person
who commits suicide is not behaving as "man qua man". We
may be hard-wired not to commit suicide, but many people 
do it anyway. (See below.)

> >> An immortalist (such as myself) does not discount the 
> >> future in the way that a life-extensionist (like Ben) does.  
> Do you agree with the above statement?

   I don't believe that you don't discount the future.
(See below.)

> > Even though a trillion
> > years is a million times longer than a million, by my values
> > the difference is not significant -- and I am skeptical that it
> > is even significant by yours -- skeptical that you are not 
> > "discounting" future life. 

    I don't know how much I confused matters by doing so,
but I was not paying attention to the fact that in North America
a "trillion" is a 1 followed by 12 zeros, whereas in traditional
British and German usage a "trillion" is a 1 followed by
18 zeros. Traditional British usage takes "billion" to mean a
"million million", which in North America is described by the
word "trillion". I will stop using these terms in this discussion.
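
The two naming conventions can be laid out side by side (a small
illustration of the point above; the dictionary names are mine):

```python
# Short scale (North American) versus traditional long scale
# (British/German) meanings of the same words.
short_scale = {"billion": 10**9,  "trillion": 10**12}
long_scale  = {"billion": 10**12, "trillion": 10**18}

# A traditional British "billion" is a "million million" --
# the same quantity a North American calls a "trillion".
assert long_scale["billion"] == short_scale["trillion"]
```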

> Scepticism is often a healthy attitude, but you are wrong.  
> My values rate any finite-length existence to be valueless,
> and I am not alone in this view.  I agree with Bruce Klein
> when he says:
> >> Limited lifespan, living 10 more years, or 10 million more 
> >> years, is irrelevant if death=oblivion.  The pursuit  of infinite 
> >> lifespan is the best way to overcome this problem.
> And I find your response to Bruce interesting: 
> >    I find this statement incredible. If you have certain knowledge 
> > that you cannot live 10 million years you would consider living 
> > another 10 years to be irrelevant.  I know values are subjective, 
> > but this seems so outrageous that I find it hard to believe you. 
> You'll have not to believe me either, because that's my position
> also, and has been since I was 11.   This is the real source of our
> disagreement -- you can't imagine that anyone could actually
> have immortalist values.  I, by contrast, have no problem imagining
> that other people, even life-extensionists, don't share my 
> immortalist values.

(You have arrived at "below".)  This seems to me more a matter
of irrational thinking than "immortalist values" -- and I say this
acknowledging that values are subjective. I do think I somewhat
misstated the matter when I said that the difference between
the prospect of living one million years and the prospect
of living one million million years is not significant. For a
person not to be discounting the future, the prospect of not living 
one million million years would have to be one million times more 
significant than the prospect of not living one million years. 
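
The arithmetic of the undiscounted case can be sketched directly (a
minimal illustration of the claim above, nothing more):

```python
# Without discounting, the value of a prospect is proportional to its
# length, so the ratio of stakes equals the ratio of lifespans.
short_prospect = 10**6    # one million years
long_prospect  = 10**12   # one million million years

ratio = long_prospect // short_prospect
# Not discounting the future means the longer prospect must matter
# exactly a million times more than the shorter one.
```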

   Again, a googol is 10^100 and a googolplex is 10^googol.
I am skeptical that an "immortalist" really finds the prospect
of living a googolplex number of years nearly a googol orders
of magnitude more significant than the prospect of 
living a googol number of years. And a lifespan of a googolplex
number of years is a drop in the ocean of Eternity. I don't
believe that anyone finds the prospect of living Eternally as
significant, compared with living a googolplex number of years,
as an ocean is compared with a drop of water. But this is what
a failure to discount future value would require. 
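
The flip side of this can also be made concrete with standard
exponential discounting. A minimal sketch (the rate r = 0.01 is my
illustrative assumption, not anyone's stated value):

```python
import math

# Under exponential discounting at rate r, the present value of living
# T more years is the integral of e^(-r*t) from 0 to T, which equals
# (1 - e^(-r*T)) / r -- bounded above by 1/r no matter how large T is.
def present_value(T, r=0.01):
    return (1.0 - math.exp(-r * T)) / r

million = 1e6       # a million years
googol  = 1e100     # a googol of years (a googolplex overflows a float)

ratio = present_value(googol) / present_value(million)
# Any positive discount rate makes the two prospects nearly equal in
# present value, which is the sense in which everyone "discounts".
```

With r = 0.01 the present value saturates at 100 year-equivalents, so the
ratio comes out to 1.0; only at a rate of exactly zero does the ratio track
the raw lifespans.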

    The fact that you would find it not worth living another
10 years if you knew for a certainty that there is no hope
that you could live a googolplex number of years -- or for
Eternity -- is a slightly different issue, an issue more 
relevant to the subjectivity of value than the discounting
of future values. If this is truly the way that you feel, I am
somewhat amazed by it, but I can recognize that you
may genuinely feel this way or have these values. I was
similarly amazed when I first became interested in 
cryonics and discovered how indifferent most people
seem to be to the prospect of living less than 100 years. 

> > illustrated by listing goals:
> > 
> > (A) live until next year
> > (B) live to age 100
> > (C) live to age 1000
> > (D) live to age one million
> > (E) live to age one trillion
> > 
> >    (A) may not be difficult. Going from (A) to (B) will 
> > entail a monumental breakthrough in the history of 
> > mankind. 

   This was a typo that totally mangled my argument.
I meant to type 200 rather than 100. A similar error
at the wrong time and place could cause me even to
fail to achieve (A) -- relieving me of the danger of 
being destroyed in 25,000 years by an unexpected
supernova explosion. Whatever can go wrong 
will go wrong -- especially when there is Eternity.
Things can go right a million million times, but things
only need go wrong once to obliterate you forever. 

   I will cut the irrelevant replies to my mistake. 

> Perfectly true and absolutely irrelevant.  *Of course* an 
> immortalist has to worry about staying alive tomorrow
> before worrying about the next day, year, century etc.
> This has nothing to do with how much I *value* living
> forever as opposed to just another 100 or googolplex 
> years -- both require that I live another year.

    I think this IS relevant to the discounting of future
value, but I'm not sure how to explain it at this moment. 

> > Worrying about how to become "immortal" is worse 
> > than worrying about how to get from (Y) to (Z). 
> My take is the complete opposite; worrying about how to 
> become immortal is what makes me take anti-aging 
> supplements *now*.

   Again, this seems to be in the realm of your value
structure versus mine -- beyond argument. 

> >>   Is the "death" of an hour old embryo more tragic
> >> than that of a hundred year old?  Obviously it depends 
> >> on your values.
> > 
> >    Yes, but this is irrelevant to the points I have been 
> > arguing concerning sentient beings. 
> Not if you regard sentience as a phase along the continuum 
> of the complexity axis.  No point arguing about this, simply
> increase the timescales to whatever time-frame you 
> consider sentience to emerge in.

     I see your point, am not entirely comfortable with it,
but don't have an interest in pursuing it. 

> > Your supposition of "sour grapes" is wrong. 
> Actually, what you've said sounds exactly like the 
> fox in the fable, who decided he didn't want the grapes
> [immortality] once he realised he couldn't reach them.

   I have cut some background text from this already 
very-long message. 

   "Sour grapes" is regarding something as being 
undesirable because it is unattainable. I don't regard
immortality as undesirable -- simply unattainable. I
want to live as long as possible.

   Unrelated to this fact, however, are other problems
I see with "immortalists" and their attitudes -- such
as the longing to believe that immortality has been
attained. That belief is unattainable almost by
definition, and I believe it can lead to reduced
diligence and a hastened death. 

    -- Ben Best, speaking for himself
