X-Message-Number: 79
From att!sun!pyramid!munnari!basser.cs.su.oz.au!pete Thu Apr 20 04:26:30 1989
Received: from pyramid.UUCP by sun.Sun.COM (4.0/SMI-4.0)
	id AA21443; Thu, 20 Apr 89 04:26:30 PDT
Received: by pyramid.pyramid.com (5.61/OSx4.4c-890312)
	id AA00942; Thu, 20 Apr 89 02:54:41 -0700
From: sun!munnari!basser.cs.su.oz.au!pete
Message-Id: <>
Received: from basser.cs.su.oz (via murtoa) by munnari.oz with SunIII (5.5)
	id AA28372; Thu, 20 Apr 89 19:08:29 EST
	(from  for ho4cad!kqb)
Date: Thu, 20 Apr 89 19:06:27 EST
To: munnari!ho4cad!kqb
Subject: CRYONICS - a fate worse than death :-)
Status: RO

Tom Betz writes:

>Kevin, [...] assaults you for your suggestions re: technological
>mind transfer.  While I would personally react in much the same manner were
>you to come at me with that hardware, neither he nor I should be permitted
>to lock others out of this option. [...]

I never suggested limiting the choice of others. I believe that people should
be encouraged to do foolhardy things. There's too many of us, you see.

Assaults? Whoa. I never threatened him. Warned, perhaps ...

>[ SF References deleted... reality is generally far stranger than,
>and always different from, how it is anticipated in SF. ] [...]

Not always stranger and different (geosynchronous satellites). SF often
presents well thought out speculative ideas long before any other
field/genre has so much as a twinkle in its eye. Cryonics being a case in point.

>But if one you loved had the option to keep living in you,
>wouldn't you think once about permitting them that option, by
>volunteering a piece of your time?  I'd think about it...

Many to a body, rather than many bodies to one. Well, I have to think.
I think about the closest person I've lost. There are two questions: would
I skull-share my life with that person? would I merge my life with that person?
Yes and Yes. I'd do the same for anyone I loved. I like the option; I abominate
the implications.

What shivers my timbers is the thought that my skull might be appropriated
by someone or something that I do not love. There's a word for it. Rape. If I 
am disappropriated in favour of someone else, then it's a different word.
Murder. Unfortunately, I believe you might hear less emotive words. Say,
Advertising and Conscription. There ain't no non-violent answer to rhetoric.

For every lock, there is a key; if you install a door in your skull, 
there will be burglars. 
(cf. "Mindkiller", Spider Robinson; "All My Sins Remembered", Joe Haldeman; 
"The Manchurian Candidate" <blowed if I can remember who wrote that>)

> [...] As each must have the right to choose suicide, each must have the
>option to offer through their suicide the survival of another. [...]

Agreed. But not relevant.


Tim Freeman writes:

>Many technologies have moved along the path from impossible to
>impractical/untrustworthy to practical to indispensable/ubiquitous.
>Most of them caused intense apprehension before they became commonly
>[...] Language [...] Automobiles [...] Artificial hearts [...]

I recognise that many technologies have taken this path. I do not
recognise that this technology will. Remember the Atomic Utopias dreamed
up in the early fifties? Fission is a technology which never fulfilled
its promise. Mind-altering drugs are another one. Just because something is
possible does not mean that it is good.

>Mind-links are not possible yet.  The first people using forerunners
>of this technology don't have much choice; consider communication
>tools for paraplegics and nerve hookups for artificial limbs.

I don't think that upmarket peripherals are comparable to Mind-link or
Mind-swap. I think that Mind-swap is more like changing OS - say from UN*X 
to VMS. Only much much more so. Ok, maybe like swapping a Renoir for a Picasso.
No, even more so than that. Maybe like swapping a Renoir for VMS. No, ...
pick the two least similar expressions of human mind that you can conceive.
Now, mind-swap is much more so than that.

>The belief that "my soul" is attached to a particular brain will probably
>pass by the time there are usable replacements for brains, like the
>belief that "my emotions" are imbedded in my heart subsided before
>artificial hearts were invented. [...]

Who's to say? What's a soul? I've got a pretty good idea about mine, but I
won't presume to adequately address the question for the rest of you zombies
:-) I see the soul as inextricably linked to the mind - I think it is the
source of mind. So I guess, if we can affect mind, we can affect soul.
Neither of these is something to take lightly. To quote Sam-the-Eagle,
"Under our clothes, we are all _naked_." But naked is all we've got.
Beyond souls, I think by definition, we have _nothing_.

>>To be short - I don't buy it. Let me ask you: would you rather control your
>>own destiny, or give up some or all of the responsibility for your thoughts
>>and life to some metapsychic octopus with a secret agenda?
>The same argument works against language.  By communicating with
>others, you have less control over your thoughts.  [...]

No. You do not have to accept what someone says or does. What they say or
do may cause you to react, but that does not necessarily lessen your
responsibility for your actions.

>I bet you can't avoid thinking of a pink polar bear right now. [...]

Thinking of a what? :-)

>For all you know, I have a secret agenda.

I presume that you, and everyone, has a secret agenda. What price Liberty?

>In my opinion, doing interesting things is important, not being in [...]

Yeah. Live Fast, Die Young, Leave a Good Looking Corpsicle, then come back
after a few years and do it all over again. If life is guaranteed, then
what is moral and what is immoral? Be careful, friend Tim. There be thorns.

>>I mean, haven't you ever wished you could shut your ears the same way 
>>you shut your eyes? What if you couldn't shut your eyes, either?
>I can turn off my computer, I can get rid of windows on the screen, I
>can kill processes that are running, I can use kill files to censor
>information when reading netnews.  I expect that the number of useful
>ways to block out information will increase rather than decrease.

So. Tell me, how much advertising did you voluntarily expose yourself to today?

We raise our children to read on sight. They can't help it. Poor little sods.

>>The RISKs of such an interconnection are beyond estimation.
>In the very long run, not doing such an interconnection will result in
>being out of the game of life, since you'll be outclassed by the
>people who have made use of such an interconnection.  Which seems the
>same as dying to me.  Taking the risk (when the technology is ripe)
>can't be any worse.

Yeah, and they tell me, to make it really big, I've got to go live in
America. Perhaps I'm a hopeless case, but for me, Life is not a game to
win or lose. Love and comfort - the rest is vanity.

> [...] You're in existentialist hell already, with the rest of us.

Crap. To quote Locke, "The mind becomes that which it contemplates."
I'm where I want to be, and it doesn't look anything like existentialist
hell. Now I'm not sure where you are ... but I guess that means that
you're not in existentialist hell either. Existentialist limbo? Isn't that
some kind of dance?

That is beyond your comprehension - Galaxy Being, Outer Limits.  

(pete%) {uunet,mcvax,ukc,nttlab}!munnari!basser.oz!pete

JANET: (POST) pete%         (MAIL) EAN%""
