X-Message-Number: 8026
Date: Wed, 09 Apr 1997 11:34:37 -0700
From: John Roscoe <>
Subject: In response to Olaf Henry
References: <>

Alan Turing himself once defined "Artificial Intelligence" as: "Whatever Artificial Intelligence researchers have not yet accomplished." This dynamic "donkey with a carrot on a stick" definition goes a long way toward explaining why people are still pooh-poohing the field, even after the groundbreaking work of Dr. Lotfi Zadeh in fuzzy set theory, as well as the work being done by others with neural nets, holographic patterning, etc.

Olaf, keep your ant away from its colony for a while and see how long it lasts. You will find that without help the poor little beggar won't be able to keep itself alive for very long at all. And why? Because it's not intellectually complete without its little buddies. I think you called it a "hive mentality"; quite so.

Similarly: you can't chase a neuron across the kitchen floor (which is good, 'cause the difference in scale would make our heads bigger than football stadiums), but you can make it "fire": stimulate this dendrite here and "bang!" goes the axon (provided that the stimulus was of adequate potential). AND: get that ant under a microscope and stimulate her/him, and you shall observe a similar response, a chemical instead of an electrical discharge. The "language" of ants is to ant colonies what the "language" of neurons is to individual people. The number, rank, and position of ants decide the colony's "personality." Ipso facto de humanis.

Am I saying that ants are dumb, but ant colonies are self-aware? Well, sure! I mean, if our definition of self-awareness is that the entity in question makes decisions for itself, for its own continued well-being, then yes, ant colonies are sentient and self-aware. Decisions bubble up from democratic pheromonal processes deep within the masses and become clear, tangible objectives and actions on the surface.
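The two mechanisms above, a neuron that fires only when the stimulus exceeds some potential, and a colony decision that "bubbles up" from many small pheromone deposits, can be sketched as a toy model. This is only an illustration of the analogy; every name, threshold, and number here is invented, not drawn from any real neuroscience or myrmecology:

```python
# Toy sketch: individual units are "dumb" threshold devices, yet an
# aggregate of them yields a colony-level "decision". All values invented.

FIRING_THRESHOLD = 1.0  # the neuron fires only if the potential is adequate

def neuron_fires(stimulus_potential):
    """Single neuron: all-or-nothing response to a stimulus."""
    return stimulus_potential >= FIRING_THRESHOLD

def colony_decision(pheromone_deposits, decision_threshold):
    """Colony: a choice emerges once enough ants have marked one trail."""
    totals = {}
    for trail in pheromone_deposits:  # each ant "votes" chemically
        totals[trail] = totals.get(trail, 0) + 1
    best_trail = max(totals, key=totals.get)
    if totals[best_trail] >= decision_threshold:
        return best_trail             # a tangible objective surfaces
    return None                       # no consensus yet

# No single ant "knows" the answer, but the colony picks a trail.
deposits = ["north", "north", "east", "north", "east", "north"]
print(neuron_fires(0.4))             # below potential: no firing
print(colony_decision(deposits, 3))  # the trail that bubbled up
```

The point of the sketch is only that the decision lives in the tally, not in any individual depositor, which is the "hive mentality" claim in miniature.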
Let us not be misguided by this jealous anthropomorphism that makes us arbitrarily deny the existence of sentience in anything but ourselves. I agree with Olaf: rabbits are self-aware. So are ants that scurry away from danger. So are curling irons that shut themselves off if they get too hot, and automotive suspension systems that adjust to changing road conditions, etc.

It seems that self-awareness is, of itself, not such a big deal. Anticlimactic? Sure. But the real challenge is to emulate the complexity and dynamism of the electrochemical processes between human neurons placed "just so." The prize? Longevity in the blink of an eye.

> If truly intelligent computers would ever
> develop consciousness, *then* we would have the mother of all
> wars on our hands.

Truly intelligent, conscious computers would move so much faster than us that our world would seem as static, unchanging, and generally as predictable as a starry night sky seems to us. They might refer to us as navigational landmarks: "...Turn left at Smith and the wall socket is behind Jones." I say this wise-assedly because I can conceive of no good reason for an AI even to inhabit the same spatial plane, never mind the same temporal plane. A trillion "human years" could flit past in the twinkle of a processor; civilizations could rise and fall so quickly in the machine that we "skin-bag" folk would be forever playing catch-up, analyzing their histories in post-mortem.

War? What for? A prerequisite would be some mutually required resource, would it not? Existing in an entirely different dimension, as described above, what reason could there be for a battle? To rule the world? Pardon me if I sound Miltonesque, but would you like to reign supreme over a civilization of slobbering, droopy, slow-moving, slow-witted, short-lived, goofy-assed turtle people? For surely that is how we shall seem to the thinking machines.
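The curling-iron example earlier, self-awareness as nothing more than a device sensing its own state and acting for its own continued well-being, amounts to a simple feedback loop. A minimal sketch, with all class names, temperatures, and thresholds invented purely for illustration:

```python
# Minimal sketch of "self-awareness as self-regulation": a device that
# monitors its own state and acts on its own behalf. Values invented.

MAX_SAFE_TEMP = 200.0  # above this, the device "protects itself"

class CurlingIron:
    def __init__(self):
        self.temp = 20.0      # room temperature
        self.powered = True

    def heat(self, delta):
        if self.powered:
            self.temp += delta
        self.regulate()

    def regulate(self):
        """The feedback loop: sense own state, act for own well-being."""
        if self.temp > MAX_SAFE_TEMP:
            self.powered = False  # shuts itself off if it gets too hot

iron = CurlingIron()
for _ in range(5):
    iron.heat(50.0)
print(iron.powered)  # the iron has "decided" to switch itself off
```

By the loose definition in the post, that one `if` statement is the entire "decision for its own continued well-being," which is exactly why self-awareness, so defined, turns out to be anticlimactic.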
--
John Roscoe
"I am certain that there is a name for my disorder, but it is the one thing that I do not wish to know."