X-Message-Number: 21036
Date: Sun, 02 Feb 2003 01:35:49 -0800
From: James Swayze <>
Subject: Re: [MURG] Yin's comment, a long reply to dissenters, reply to Joe
References: <>

"Joseph J. Strout" wrote:

> At 7:42 PM -0800 1/31/03, James Swayze wrote:
> >I know I will get flamed for this but I totally agree with Yin on this. I
> >would not submit voluntarily (with a caveat to come) for a destructive
> >upload, as I do not believe in isomorphism. Not yet anyway. I do not buy the
> >identity arguments of even my good friends that do believe in isomorphism,
> >that a copy of me is me.
> I doubt you'll get flamed with it, but I do believe that you haven't
> got a coherent leg to stand on.

I thought I was very coherent and straightforward and quite simply logical. It is
isomorphism that has not a coherent leg to stand on. But don't take my word for
it at this point; read further.

>  The debates on personal identity
> ended here years ago,

The debate is likely never to end. Especially when ideas such as yours, the uploading
of the poor in AR space, are spread around for the next Machiavelli or Stalin to
seize upon as the next "Final Solution". I hope you don't advocate _mandatory_
uploading of the poor. Please don't take this as disrespect. I realize you did express
your own reservations about this idea. I just think some ideas should remain
silent, but then I suppose someone else would have voiced it eventually.

I don't care how heaven-like AR space might be, I never want to be there. I want
to keep interacting with Real space. I don't want to be bouncing around inside a box, and I
dare say that that idea won't sell to the general public very well either. That is
my point. Have you ever tried to talk a plain old everyday person, not accustomed
to our lofty musings, into these ideas? I doubt seriously it goes over any better
than a lead balloon. Most people rely upon common sense, and common sense logic
tells them a copy is not the original. Sometimes common sense is enough... don't
knock it. Some things can be over-thought.

> but while they were going on, nobody could come
> up with a sensible theory of identity that doesn't leave you with the
> conclusion that a destructive upload leaves you just as much 'you' as
> a nondestructive one.

The difference is continuity of consciousness. That should be plainly seen if you
did not assume ahead of time that gradual, Moravec uploading was not possible. I am
surprised you have missed the obvious, with all your well-earned education and
deserved status. I'll get to continuity a little more below.

> I don't really wish to rehash all the old arguments... do we have
> archives?  Perhaps you'd also like to read some of the personal
> identity references listed at
> <http://sunsite.unc.edu/jstrout/uploading>.

Really no need, it wouldn't convince me, but I'll give you your due and read it
anyway. However, my logical instincts are far too strong to bite the irrational
isomorphic apple. The irony is that I used to believe that a pattern was enough to
rebuild me from; then somewhere a strict anti-tautology virus set up shop and I thought
through my error. I agree with Robert Ettinger that identity must bind time; see
more further on.

I did read your pdf on uploading [http://www.scifi-az.com/articles/guest1.pdf].
Very impressive, and some parts very attractive (manipulating the physical laws and
all, allowing one to fly and such), but several holes, and the foundation based on a
false premise that has no support other than that you simply stated it as though a
given... chiefly the following: "We may define your survival as the existence of a
person who is YOU, sometime in the future." Umm, no, not good enough.

The problem with your premise is that merely stating it as so does not make it so.
You assume the audience agrees before laying any groundwork to back up the
statement. It therefore becomes a tautology. It says nothing about the popular
view that identity is a subjective experience. It assumes that an objectively
judged YOU is a fully satisfactory stand in because objectively every outside
measurement has no way to tell the difference between original and copy. Your
article is too much preaching to the choir. You assume the reader agrees with
isomorphism and make no valid argument to support it.

> >The thing is it comes down, ironically, to a matter of faith.
> Nonsense.  It's a matter of philosophy

Flawed philosophy. Subjective philosophy albeit the same for both sides of the
argument admittedly. But subjectivity IS the point after all.

> , and philosophy is basically a
> branch of mathematics:

No amount of math can make an X be a Y.

> progress in philosophy is made through logical
> rigor, and a philosophical "theory" which doesn't hold up to logical
> analysis is nothing but a worthless pile of junk.  So far, any theory
> which claims to find a difference between gradual and destructive
> mind uploading falls into the latter category.

I told you the difference but you chose to ignore it. Perhaps you felt I was being
insolent; after all, I am but a peon compared to your amount of education and
status, but I have my strengths. Perhaps you've over studied the whole identity
issue and with regards to the uploading subject, are you of the opinion a fresh
perspective, not mired in the details and complexities, is of no value?

> >Joseph J. Strout says, "I find it very hard to believe that such a
> >procedure will ever be possible.  It makes for passably good science
> >fiction, but it most likely won't happen in real life."
> >
> >Oh really? I guess he is more knowledgeable than Hans Moravec.
> So?  I've studied uploading for 15 years; Moravec thought about it
> for a little while and wrote a paragraph or two as a minor addendum
> to a book in which he was asserting that humans were obsolete and the
> future belonged to artificially intelligent robots.  I have an M.S.
> in Neuroscience and have worked with real neural tissue in a variety
> of ways (including live preparations, E.M. studies of frozen or
> vitrified tissues, etc.).  Moravec is a roboticist.  Why do you find
> it surprising that I may have a better understanding of

Well my hat is off to you. I mean it, but I still feel some things are made overly
complicated. The logic is simple: an original is an original, a copy is a copy. It is
not a failing of language, it is a physical fact. Here's another, less language-bound
way to put it. Call it 'one came first, the other second'. Do you have some
math to make the copy precede the original in time? Quite simple this, really. This
is how I bind identity with time. It's not just that both entities can't occupy the
same space; they cannot occupy the same time as well. A copy can never occupy the same
space and time as the original, and that's the only external criterion available.
This first-occupier identity works for bodies and consciousness. Talk about stating
the obvious, but it seems it needs to be done.

> -- or a
> lesser willingness to gloss over -- the difficulties inherent in
> attempting to take apart a brain and keep it functioning at the same
> time?

Who said anything about taking it apart at the same time? Not I. If you'll pay
closer attention to what I wrote, well, perhaps I should have made more clear that the
meat was to remain a while... I did say "augment". I did say that the nanobots
would learn 'along side' the neurons. I never said they replace them. I said they
would augment them. However, I did gloss over what might happen after a long while.
As the neurons naturally die they are not missed, because the job has been taken
over somewhere along the line and preferred on the nanobot architecture due to its
efficiency. Call it creeping-along, way, way, way slow, attrition-gradual uploading.
But the point is the ever so gradual changeover, and hence 'continuity' of
consciousness.

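The creeping-attrition idea is easy to put rough numbers on. Here is a toy sketch; the 5% annual turnover rate and the function name are my own illustrative assumptions, not anything stated in this thread:

```python
# Toy model of "slow attrition" uploading: each year a small fraction of the
# remaining biological neurons dies off, and the nanobots that have already
# learned those neurons' input/output behavior quietly take over the work.
def meat_fraction(years, annual_loss=0.05):
    """Fraction of the original biological neurons still functioning."""
    return (1 - annual_loss) ** years

for y in (0, 10, 50, 100):
    print(f"year {y:3d}: {meat_fraction(y):.3f} of the brain still biological")
```

The point the toy model illustrates is the same one argued above: there is never a single moment of replacement, only a changeover slow enough that conscious experience is never interrupted.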
> >I'd be interested in Joseph's opinion as to why this is sci-fi.
> Because of the details.  A neuron is not a red blood cell; it's a
> long stringy thing (sometimes up to a meter long in humans!), with
> thousands of connections to other long stringy things.

Again you seem to have passed right by what I said. I know I am a peon, but do me
the respect to read every word I said, please. I read every word of yours of the
material I had available, and I do, though it may not be readily apparent,
appreciate much of what you say. As to matching the neurons' capability and size, in
my treatise I said to string the nanobots end to end along the neuron and all its
tendrils, and make the 'group' of them morph as necessary. Recall the part where I
mention the dual ability to have the computing power of each individual bot and
also that of the pseudo-neuron made from the morphic neuron-mimic shape? I
mentioned the stringing of them end to end, and even a little about possible group
computing, because of the following. There is a serious problem I don't ever see
addressed, and I meant to touch upon it in my treatise but forgot to, owing to lack
of sleep maybe. It is that if the nanobots that everyone hopes for, or the brain
chips, whatever they be--speak electronics, then they are subject to magnetic field
damage or interference. Lock-step a whole line of electrons with a strong field
line and oops, there went your upload!

> Nobody has
> proposed any reasonable method for taking apart something like that
> while it's functioning.

Well I suppose I just did. Slow natural attrition with continued conscious thought,
favoring the more efficient, faster, more robust structure over time. Perhaps someone
very clever will devise an incentive program to coax the brain
to favor the hardware over the meatware slowly. I'm certain you will agree,
even though you are very knowledgeable and bright, that you can't have thought of
every future possibility already, have you? Is there no room for future innovation?

>  If the improbability of this scenario is not
> obvious, it's because either one doesn't understand what neural
> structure is like, or else one simply hasn't thought about it in any
> detail.

Wrong, you simply hadn't paid attention to my suggestion. Let's consider what you
have written on the subject.

"Uploading by the Nanoreplacement Procedure

In this proposal, billions of microscopic machines are injected into the brain,
where they take up residence in or near the neurons. Each machine monitors the
input/output activity of its neuron, until it is able to predict perfectly how the
neuron will respond. At that point, it kills the neuron and takes its place.

This proposal sounds good -- in some ways, simpler than the microtome procedure.
But the machines would probably have trouble with the sheer size of neurons"

[Not if you let many of them assume the shape of the neuron.]

"and their connections. A neuron may have inputs many millimeters away,

[Ok, please don't take this as a smart-assed retort or quibbling, but aren't you
contradicting yourself? I quote: "A typical neuron is only 10 microns across, and a
synapses... [sic] are on the order of 1 micron." I thought a millimeter was a lot
larger than a micron, or even 10. Of course this is not entirely germane to this
hopefully friendly discussion, but it has a small relevance where we are talking
nanobots residing _with_ neurons.]

"and may have outputs connections as long as a meter."

[A meter? That's a far cry from 10 microns with connections of 1 micron. Just asking:
was the former statement based upon now-outdated information? The former statement
needs an update? Just wondering, not accusing.]

"How could a tiny machine (smaller than a cell) monitor all those inputs, or 
all those output sites?"

[Don't try and make just one bot handle a single neuron on its own. Where is it
written, in what bible, that a neuron gets only one bot? To me it's obvious.]

"It would have to be capable of growing as large as the neuron itself, and in the
same shape. Proponents of nanotechnology sometimes claim that almost anything is
possible. Let's suffice to say that it's not clear how this would be accomplished,
for now at least."

[Perhaps it is now. Maybe I have a different visualization of how I believe
nano-robotics may work than others do.]
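Since the exchange above turns on neuron dimensions, a quick scale check of the quoted figures may help. This uses integer nanometer arithmetic; the one-bot-per-micron spacing is my own assumption for illustration:

```python
# Put the quoted figures side by side: "10 microns across" describes the
# cell body, while "as long as a meter" describes an output connection
# (the axon) -- two different measurements of the same neuron.
nm = 1
micron = 1_000 * nm
millimeter = 1_000 * micron
meter = 1_000 * millimeter

soma = 10 * micron   # quoted soma width
axon = 1 * meter     # quoted longest output connection

print(millimeter // micron)  # 1000: a millimeter is 1000 microns
print(axon // soma)          # 100000: the axon dwarfs the soma width
print(meter // micron)       # 1000000: ~a million 1-micron bots end to end
```

On these numbers, stringing bots along a full-length axon means something on the order of a million units per neuron, which is why the many-bots-morphing-as-a-group proposal matters.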

> >Finally, I don't think that esoteric discussions of identical electrons, many

> >worlds, atom replacement, extremely convoluted math and whatever else are all

> >really any help in the discussion about identity. They don't solve the simple
> >fact of logic that for every 'thing' that ever comes into existence that
> >'thing' is the original and the only original.
> There is no such fact.

Yes there is. Or perhaps, to prove me wrong, you have a syllogism that can prove
isomorphic identity to be sameness? Can you make the copy exist before the original?

> If you begin with a faulty premise (or worse,
> imprecise use of language), then naturally you'll be led to invalid
> conclusions.

Agreed, and that is exactly what _you_ have done... unless you can prove to me that
a copy can precede an original in linear time. Please explain exactly how I have
allegedly used imprecise language or made faulty premises. I made several points
with several logical examples. It would be nice if there were objections to them and
said objections were addressed point by point.

> >Esoteric intricately involved theories are interesting but they do
> >nothing to help real people deciding real issues about identity
> >where the rubber meets the road.
> Agreed.  The only sensible theory of personal identity I've ever
> heard is quite simple and coincides with how people naturally think
> about people (except when we take the lazy shortcut of equating
> identity with bodies, which just happens to work most of the time in
> the current era).
> >It's simple logic folks... maybe it is time to get reacquainted with it.
> You said it, not me.  :)

Indeed I did and I sit by it still. :) But really, I am just so amused that
isomorphists feel their view is the more logical when, by simple reduction (Occam's
Razor, if you will), the opposite is true. It takes far fewer steps, far fewer
shenanigans, to accept that a copy cannot be an original than all the convoluted
justifications to the contrary. Simple is better this time.

I found something you said strikingly ironic, for it makes my point so well: "For a
first attempt, you might try identifying people by their bodies. But this seems
unsatisfactory; any part of my body (except my brain) may be removed or replaced,
and I am still the same person." Exactly my point! Such refreshing, subtle, and simple
logic. You just proved that atom replacement does not a copy make.

Oh yeah, there is something I've always wanted to bring to the fore in discussions of
copies and backups. In the above-noted pdf article, you say, "Deaths may still
happen, but a conscientious upload will keep multiple backups safely hidden away."
Isn't that a bit unfair to the backups? I mean, if a copy is YOU, then surely a
backup IS also a copy and therefore YOU. YOU have certain inalienable rights. So
then it is logical to say that YOU the backups deserve as much runtime as YOU the
current experiencer get. But that is troublesome, because it means they cannot be a
backup, because a backup cannot be allowed to experience and thereby diverge, since
too much divergence negates YOU'ness. So backup YOU's can't be allowed to
experience, to have runtime, and that amounts to slavery. I know the friendly AI that
looks after the rights of all sentient entities won't allow that. But further, it
really means that if one has the notion, believes it is ok, to have backups that
necessarily cannot be allowed runtime so that they be a faithful backup, then the
backup does not have the same rights as YOU, therefore it is NOT YOU. Since a backup
IS a copy, this in the end must mean that a copy is NOT YOU after all. How's
that for logic, er, philosophy?

The argument is not over on identity, and may well never be over. Still, it boils
down to personal choice, I hope. I mean, I hope society never forces it upon those
that dissent. I dissent, until further notice.... I'm still waiting for a
convincing, simple argument to change my mind. Got that syllogism handy?

Cryonics Institute of Michigan Member!
The Immortalist Society Member!
The Society for Venturism Member!

MY WEBSITE: http://www.geocities.com/~davidpascal/swayze/
A FAVORITE quote: Last lines of the first Star Trek the Next Generation movie.

Capt. Picard: "What we leave behind is not as important as how we've lived. After
all, Number One, we're only mortal."
Will Riker: "Speak for yourself, captain, I intend to live forever!"

Rate This Message: http://www.cryonet.org/cgi-bin/rate.cgi?msg=21036