X-Message-Number: 7933
Date: Mon, 24 Mar 1997 12:52:47 -0800 (PST)
From: John K Clark <>
Subject: Uploading

-----BEGIN PGP SIGNED MESSAGE-----

In #7912 Bob Ettinger, on Fri, 21 Mar 1997, wrote:

>If the artificial visual cortex uses the same inputs to send the
>same outputs to the "rest of the brain" as would the natural visual
>cortex, then certainly the subject, wired to the artificial cortex,
>will experience normal qualia

I agree, but I'm surprised that you do. If the artificial visual cortex
treats the inputs and outputs the same way the original did, then the rest
of the brain can't tell the difference; it's just a black box to it. There
is absolutely no reason to think that the visual cortex is the only part of
the brain for which that is true.

>what would happen if the "rest of the brain"--in particular,
>including the self circuit or subjective circuit--were replaced

We certainly have a feeling of self, but that no more proves the existence
of a self circuit than the fact that my radio can play Beethoven proves the
existence of a Beethoven circuit in among the transistors.

>there is a profound difference between assuming life in another
>person and assuming it in a robot that behaves like a person. The
>difference is simply that other people (and animals) are made very
>much as you are, hence it is perfectly reasonable to attribute to
>them feelings similar to yours.

You are not a woman, and women are made differently than you are; do you
think women are conscious? Your twin brother's heart stopped beating 30
seconds ago; except for one small blood clot your twin is made very much
as you are. Is your brother conscious?

>We KNOW that ALREADY robots (computer programs) exist that, to a
>limited extent, can converse like people

To a very, VERY, limited extent.

>we also know they have no slightest consciousness.

We don't "know" that, just as we don't "know" that rocks are not conscious.
We strongly suspect they are not sentient because they fail The Turing Test
big time, but then, we don't "know" that The Turing Test works.

In #7915 (Thomas Donaldson), on Fri, 21 Mar 1997, wrote:

>The problem with chaos is not at the START of the simulation, but
>the result when it proceeds.

True, but what does that have to do with our analog versus digital debate?
Regardless of what methods you use, Heisenberg tells us that ANY future
version of you will be different from what you are right now.

>my main objection came from the distinction between a map and what
>was being mapped

A map is made from an object, and yes, a map is not the object, but I am
not an object either; I am a process. The word "I" should not be a pronoun
but an adjective, an adjective modifying matter. I am not matter, I am the
way matter reacts when it's organized in certain complex ways. If the
information on how my mind operates is put into a computer and then my body
is destroyed, my consciousness does not stop; if two phonographs are
synchronized and playing the same symphony and you destroy one machine, the
music does not stop. The fundamental question you have to ask yourself is:
are we, our subjective existence, more like bricks or more like symphonies?
I say symphonies.

>The problem with a simulation of Mike is that it is a symbolic
>representation of Mike, and has meaning only so far as the symbols
>used have meaning. Ultimately it is human beings who attach meaning
>to those symbols.
Lewis Carroll satirized your viewpoint in one of his dialogues; in it the
tortoise proves that nothing is capable of reasoning, not machines, not
animals, and not humans. The tortoise says that before you can make even
the smallest step in reasoning you must make use of a logical law at a
higher level to justify it, but that higher-level rule is also a step in
reasoning requiring justification from an even higher level, an infinite
regress, thus reasoning is unattainable. Pretty good reasoning considering
it's impossible, especially for a tortoise. The solution is that at the
lowest level all hardware, biological or electronic, obeys the laws of
physics, and you don't need justification to obey the laws of physics.

>philosophers in the area have ceased to believe that the Turing Test
>is sufficient.

I will say as bluntly as I can that any philosopher who doesn't think that
passing the Turing Test indicates consciousness and who is not a believer
in solipsism is being inconsistent. I humbly suggest that such philosophers
consider making a career move to a field less mentally demanding. I would
recommend the fast food industry, or politics.

>the Turing test requires the computer person only to interact
>symbolically with its interrogator.

That's true, and if you were to punch me in the jaw I would interpret that
as a symbol that you don't like me very much. There is no ironclad
guarantee that my interpretation is correct, but I do the best I can.

>It is not asked to fall in love with him, or hate him, or show anger,
>dismay, and all the other emotions... except symbolically.

True, and that's all we ask of other people, good thing too, because that's
all we could possibly get. There is no way I can experience your emotions
directly; I can only treat your actions as symbols and deduce your
emotional state from them.

                                        John K Clark

-----BEGIN PGP SIGNATURE-----
Version: 2.6.i

iQCzAgUBMzTPZX03wfSpid95AQG6FwTwgeUyz+onJXFyQQVgBWIEggVKXHeahO6y
buh38zWbXlRFEhjXgqQvYCI/g8k78gshLdI8TfYI7Oxr3y+ziS1i7ugzsfvTuKvG
uksyfNWgyCwwKpVBCP4BrZRk+IltZ4OzYXPL/tAUsAfmgwzVAgyY3QAsU6q3c7sp
JPYIQwPeQAZLRsmaroMkgXJvXD7sbKXzCmpMjClZbWIxLAn3KN8=
=/uJX
-----END PGP SIGNATURE-----