```X-Message-Number: 19872
Date: Thu, 22 Aug 2002 11:35:32 -0400
From: Jeffrey Soreff <soreff@edamail.fishkill.ibm.com>
Subject: Re: Probabilities

>To Ralph Merkle: the point of a Markov chain is that the things you put in it
>are things you construct and define to be independent of the others.
...
>Thus, you must construct the chain as P1*P2
>where P1 is defined to be your estimate of probability that society survives,
>and P2 is your estimate that your particular cryonics organization survives
>GIVEN that society does.

At first the "define to be independent" raised my hackles, but given your
(emphatic) use of _conditional_ probabilities you are absolutely right.

In general, I agree with the form of your calculation, and I consider
such calculations to be valuable.  They help us avoid a number of mistakes:
- ignoring the aggregate effect of a series of substantial, though
not individually overwhelming, risks
- simply overlooking certain factors in the equation: when the overall
probability comes up in a discussion and is calculated "on the fly,"
one might tend to focus on just one or two of the risks involved
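To make the aggregate-risk point concrete, here is a minimal sketch (the factor names and values are hypothetical, not Merkle's actual estimates) that multiplies a chain of conditional probabilities, each conditioned on all of the preceding events holding:

```python
# Hypothetical chain of conditional probabilities, each factor
# conditioned on everything before it (as in P1 * P2 above).
factors = {
    "society survives": 0.9,
    "organization survives, given that": 0.7,
    "repair technology is invented, given that": 0.5,
    "someone applies it to you, given that": 0.5,
}

overall = 1.0
for event, p in factors.items():
    overall *= p

# No single factor is alarming on its own, but the product is
# much smaller than any individual factor.
print(f"overall probability: {overall:.4f}")
```

Each risk here is "substantial, though not individually overwhelming," yet the product comes out well under any single factor — which is exactly the mistake the calculation guards against.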

I also think that the overall probability estimate _does_ matter.
We all allocate our resources amongst various goals and subgoals,
and our best guesses about the odds of success for various courses
of action affect what allocations look sensible.  In my own position,
cryonics looks sensible to me if the odds of success are better than
about 1% (from, roughly speaking, comparing a year's vacation with
the net present value of an indefinitely long future, discounted at
1%/year).  My estimates of the odds of being revived tend to come
out in the 2%-10% range (with huge error bars, of course).  If I
thought the odds were ~>50% I'd be much more actively trying to
persuade friends and family members.  If I thought the odds were
well below a chance in a thousand I wouldn't have signed up.
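The roughly 1% break-even figure can be reproduced with a short sketch (the one-year cost and 1%/year discount rate are from the text; framing both sides in year-equivalents is my assumption):

```python
# An indefinitely long future, discounted at d per year, has a net
# present value of sum over t >= 1 of (1 - d)**t year-equivalents,
# a geometric series converging to (1 - d) / d.
discount = 0.01
npv_years = (1 - discount) / discount   # ~99 year-equivalents

cost_years = 1.0  # roughly a year's vacation, per the comparison above
break_even = cost_years / npv_years

print(f"NPV of the future: {npv_years:.0f} year-equivalents")
print(f"break-even odds of success: {break_even:.1%}")
```

With those inputs the break-even probability lands just above 1%, so estimates in the 2%-10% range clear the bar while "well below a chance in a thousand" does not.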

One comment on two of the factors in your analysis:

>P3: Probability that eventually molecular repair technology will be invented
>that is capable of restoring humans, memory intact, when damaged this badly

>P6: Probability that anybody, in the best of futures, will be interested,
>resourceful, and nice enough to use the technology on you.

Setting aside the conditional nature of these probabilities for the moment,
I think that the invention of molecular repair technology is very likely
to require a sufficiently powerful form of molecular manufacturing that
the fabrication of human-brain-equivalent structures from scratch will be easy.
Unfortunately, this also suggests that the construction of robots that can
occupy all of the economic and military niches that humans now occupy will
probably also be easy - which reduces the odds that humans (or entities
interested in humans' welfare) will be in control in that future.  In other
words, I think that the events described by P3 and P6 are anti-correlated,
or that the conditional probability of P6 given P3 is lower than a
straightforward analysis would suggest.
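The anti-correlation point can be illustrated numerically (all values here are hypothetical, chosen only to show the direction of the effect): multiplying marginal probabilities overstates the joint probability whenever P(P6 given P3) is lower than P6's marginal.

```python
p3 = 0.5           # marginal: repair technology invented (hypothetical)
p6_marginal = 0.6  # marginal: someone applies it to you (hypothetical)
p6_given_p3 = 0.4  # lower than the marginal, i.e. anti-correlated

naive_joint = p3 * p6_marginal     # treats the events as independent
correct_joint = p3 * p6_given_p3   # uses the conditional probability

print(f"naive (independent) estimate: {naive_joint:.2f}")
print(f"estimate with anti-correlation: {correct_joint:.2f}")
```

The naive product is too optimistic; the corrected joint probability is smaller, which is the "lower than a straightforward analysis would suggest" effect.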

>Take the sequence of events which preceded the Apollo 11 landing. In 1900, in
>a pre-flight era of train and cannon, men going to the moon didn't seem too
>likely - and certainly it didn't look likely to happen within a lifetime. But
>then came Tsiolkovsky, Goddard, the V2, Sputnik, Laika the dog, Gagarin, JFK's
>promise, the Surveyor soft landings, and the Apollo 8, 9, and 10 mission
>accomplishments. By that time, it was obvious to all but idiots that we'd do
>it and do it soon, even though it still had never been done.

Very nice example of the problem of how to calculate the odds of
events involving technical progress!  Phrased as "have we gotten to the moon
yet?" the answer was uniformly "no" till 1969.  Phrased as "Are the
technologies for rocket propulsion more capable than they were earlier?"
the answer was more or less uniformly "yes" over the same period.

Best wishes,
-Jeff


```