X-Message-Number: 7945
From: Peter Merel <>
Subject: Be Vewy Vewy Quiet ...
Date: Wed, 26 Mar 1997 22:25:54 +1000 (EST)

Robert Ettinger writes,

>Peter asks the difference between "raw sensory data" and "qualia." O.K.: A raw
>sensory datum is an electrochemical signal sent e.g. from the retina of the
>eye to the visual cortex; the secondary datum (oversimplifying a bit) would
>be the signal from the visual cortex to some other part of the brain (perhaps
>a more primitive part) containing the "self circuit" or the
>anatomical/physiological feature that constitutes the seat of feeling (and
>hence the ground of being); ensuing modifications to the self circuit, or
>modulations of the self circuit, constitute the quale.

I've already stipulated this: qualia, if they are distinct from the
usual run of neural events, may indeed be implemented by your
self-circuit. But my question concerns the definition of qualia, not
their implementation. I think, let us understand what these beasts are,
and then maybe we'll find a way to track them to their lair.

>Probably Peter and others will as usual ask, why introduce this unnecessary
>complication, the self circuit? Why can't the processed sensory data
>themselves constitute the qualia? The answer is that we have every reason to
>recognize a profound, qualitative distinction between the subjective
>condition and other phenomena of nature, demanding a special physical system.

I'm sure you'll recognise that a definition of the distinction between
sense data and qualia as "qualitative" smacks of a certain circularity ...  
But you speak of "every reason to recognise" qualia. It would be most 
helpful if you could list some of your reasons - even just one or two 
would be a good start.

>Certainly there are many who think consciousness will "just grow," like Topsy
>(or like Hal in the movie), if the computer becomes complicated enough. Some
>have even attempted to offer some detail--e.g. Daniel Dennett in his book of
>a few years back, ludicrously entitled CONSCIOUSNESS EXPLAINED. 

I'm not one of them. I suspect that consciousness is only a value
judgement, just as per Turing; in fact I'm tempted to put life down to a
value judgement too: tell me that fire is not alive, and I'll show you
the Banksias opening after a bushfire. 

But these are only my suspicions. I'm open-minded on such things, and
most curious about them.

>I don't think computers will "just grow" into consciousness any more than you
>can make a bicycle out of manure just by piling it high enough.

Wait, let me get my mold! :-)

Peter Merel.