X-Message-Number: 3807
From: 
Date: Mon, 6 Feb 1995 12:58:08 -0500
Subject: SCI. CRYONICS self circuit etc.

Certainly I have not interpreted any of John Clark's posts as rude, even though some of mine may have sounded a bit testy. It's really mainly myself I become exasperated with when the point seems so clear and yet I can't put it across. (No doubt he feels the same way--but one of us is wrong.) All right, let me try still again, in response specifically this time to Mr. Clark's post #3800. Perhaps it is worth the time if I can finally find some language formula for getting past these sticking points.

1. In response to my suggestion (just as a hypothetical possibility) that particular kinds of matter, e.g. iron atoms, might be necessary to produce feeling, Mr. Clark's essential response was that mind is non-material; "...a lot of things can be fast and a lot of things can generate you and me." Doesn't he recognize that this "argument" is merely a restatement of his premise? It is equivalent to saying, "If mind is only logic gates and algorithms, then mind is only logic gates and algorithms."

2. He says that, although it's difficult to prove a negative, there isn't the slightest reason to think magnetic fields (or various other physical phenomena) have anything to do with feeling. But there IS reason to suspect that feeling is not something that just magically appears when your algorithm becomes complicated enough. That reason is the known fact that we NOW have algorithms without feeling that nevertheless can do wonderful things, some of them much better than a human mind. In fact, some pretty low forms of life almost surely have feeling, yet are in almost all respects less "intelligent" than some algorithms or collections of algorithms. Doesn't this tell you that "intelligence" could exist without feeling?

3. He warns against confusing absurdity with oddity. I think I am one of the least likely people to do that!
Weizenbaum's toilet paper computer emulating Einstein is absurd not because it is funny, but because there isn't the slightest reason to think the toilet paper--moving or not--has any feelings.

4. He mentions three reasons for thinking that the essence of a person is in information processing in a computer algorithm:

(a) He says the genetic code is digital and amazingly computer-like. But he is focusing only on the similarities, not on the differences. DNA depends on CHEMISTRY. Presumably he thinks that a computer analog of DNA could generate a computer analog of a person, which would BE a person with feeling--but again, this is not a demonstrated fact, only a restatement of his premise.

(b) He says mind does not need new physics, just correct organization of ordinary matter. I agree entirely, and I don't know where he got the notion that I am looking for new physics. (I don't even think we need quantum mechanics to explain mind--just ordinary physics.) But I take exception to the implication that the KIND of matter doesn't matter. Once more, he says that all you need is information--falling back, as usual, on the mantra, offering as a conclusion what in reality is just his original unfounded assumption.

(c) He says Turing proved there is no scientific reason why information processing can't duplicate the behavior of an intelligent person--but concedes nothing was proven about consciousness (or feeling). He thinks it is entirely reasonable to believe that sufficiently complex information processing will produce subjectivity. It is certainly reasonable as a POSSIBILITY, and I don't rule it out--but it is NOT reasonable to ASSUME it, given the fact that we know virtually nothing about the anatomy/physiology of feeling. He also asks how I could ever be convinced that some system had feeling, if intelligent behavior alone wouldn't convince me.
The answer is that one day we will understand the anatomy and physiology of feeling in humans and other animals; that will tell us at least sufficient conditions for feeling, and perhaps even necessary conditions.

5. He says evolution found subjective states indispensable for intelligence. How does he know that? Aren't some present-day computers intelligent in some degree, although they have no feeling? Aren't ant and bee colonies intelligent in some degree, although the presence of feeling is unproven?

6. After saying that evolution found subjective states indispensable for intelligence, he says the "self circuit" would be no use to evolution if it did nothing except produce a feeling of self. He doesn't appear to have a clear distinction in mind between these various terms. I use "self circuit" to mean whatever portion(s) or aspect(s) or function(s) of the brain are responsible for feeling. Feeling is the ground of being and the basis of subjectivity and consciousness. Consciousness is the integration of feeling and computing; it gives what psychologists call "affect" to the data, or helps organize the data in an egocentric system.

Once more, the possible evolutionary basis for a self circuit is that it provides an organized and flexible system for responses to stimuli. This might be vaguely similar to the difference between, on the one hand, a brute-force chess computer that works by analyzing every possible move as far out as its speed and memory permit, and, on the other hand, an algorithm that recognizes general patterns, hence might be more error-prone but much faster and more efficient. No--I am not thereby admitting that the self circuit is necessarily essentially an algorithm. My previous remarks on this point hold.
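[Editor's aside: the brute-force vs. pattern-recognition contrast above can be made concrete with a small sketch. Everything here is invented for illustration--the toy game tree, its scores, and the "judge a branch by its first leaf" pattern rule--and is not anyone's actual chess program.]

```python
def exhaustive_best(tree):
    """Brute force: recursively score every leaf of the game tree.
    Returns (best score found, number of nodes examined)."""
    if isinstance(tree, int):          # a leaf carries its score
        return tree, 1
    best, examined = None, 1
    for branch in tree:
        score, n = exhaustive_best(branch)
        examined += n
        if best is None or score > best:
            best = score
    return best, examined

def pattern_best(tree, rule):
    """Heuristic: apply a cheap pattern rule to the top level only.
    Much less work, but wrong whenever the pattern misleads."""
    scores = [rule(branch) for branch in tree]
    return max(scores), len(tree)

# Toy game tree: nested lists are positions, ints are terminal scores.
game = [[3, 5], [2, 9], [0, 1]]

# Invented pattern rule: judge a branch by its first leaf alone.
first_leaf = lambda b: b[0] if isinstance(b, list) else b

full = exhaustive_best(game)        # examines all 10 nodes, finds the true best
fast = pattern_best(game, first_leaf)  # examines only 3, and here judges wrongly
```

The point of the sketch is only the trade-off named in the text: the exhaustive searcher is reliable but its work grows with the tree, while the pattern rule is fast and flexible but error-prone.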
I am just saying that a self circuit, starting out perhaps with just a good/bad judgment and producing approach/avoid responses, and then developing greater complexity and flexibility, might well have proven much more efficient than the "robot" approach.

R.C.W. Ettinger