X-Message-Number: 12535
From: 
Date: Sun, 10 Oct 1999 14:30:09 EDT
Subject: Kurzweil's book

A couple of people recently have expressed enthusiasm about Ray Kurzweil's THE AGE OF SPIRITUAL MACHINES, and yesterday Thomas Donaldson weighed in with a bit of criticism.

I reviewed the book for THE IMMORTALIST issue of Jan./Feb. 1999, mentioning several specific errors/shortcomings, and concluded that the book's only real virtue was its contribution to raising consciousness about the pace of technological progress.

Of course, that contribution is not necessarily negligible, and I don't want to be a wet blanket; in some ways it is doubtless an eye-opener to some people. But those people will necessarily be among the previously less informed, and they should be warned against taking everything the author says as gospel. He even presents as fact the minority speculation (Wigner et al.) that human observation is needed (and can work retroactively) to make a quantum event definite.

Thomas Donaldson wrote:

>His book completely omitted discussion of something quite fundamental:
>values and feelings. Intelligence alone, no matter how extensive, has
>no chance at all of taking us over, for the simple reason that so long
>as WE are the only creatures providing values, purposes, and feelings,
>then these artificial intelligences will follow US, not we them.

I think Thomas partly misses the point. Values, feelings, and purposes are all different concepts. A non-feeling machine (one with no subjective experiences) can nevertheless have "purpose" in the sense of being goal-directed, and some have already been made. It could also, therefore, have "values" in the sense of criteria for choices. It could indeed, then, be an adversary, and a formidable one, a la the Berserker stories, if programmed with certain kinds of goals. If it lacked feeling, however, it could not be a person, and would not be entitled to any consideration on an ethical or moral or empathetic basis.
Robert Ettinger
Cryonics Institute
Immortalist Society
http://www.cryonics.org