X-Message-Number: 5271
Date: Mon, 27 Nov 1995 12:09:40 +0100 (MET)
From: Eugene Leitl <>
Subject: Re: 5262: The Singularity and Money

On Sat, 25 Nov 1995 20:13:08 -0800 John K Clark <> wrote (among other
things):

> [...]
>
> >Unless you assume that the "singularity" will be unforeseen,
> >instantaneous, and universal
>
> But that's exactly what I think will happen, the biggest revolution in
> the history of life since the Cambrian explosion 570 million years ago.
> I don't know when the singularity will happen, but like all exponential
> processes, the conclusion will be sudden. If the singularity happens in
> a 1000 years, 999 years from now it will still seem a long way off to
> most people, because an enormous amount of work will remain to be done.
> However, more progress will happen in the last year (perhaps the last
> day) than in all previous centuries put together. Everybody will be
> surprised at the date except for a few who made a lucky guess.

I agree wholeheartedly. Moreover, it should be pointed out that some
growth processes are not merely exponential but even hyperbolic.

> >investments will be shifted into what is valuable as changes
> >are made, as investment managers have always done.
>
> Even in stable times like today it takes a fair amount of brain power
> to be a good investment manager. During the singularity, when
> everything is changing at light speed....
>
> >That's nonsensical.
>
> Time will tell.

Which processes currently power progress? The rate of progress is
currently bottlenecked essentially by the atomic intelligence level of
the inventing agent, by the interagent communication bandwidth, and by
the rate of matter manipulation, not merely for production but for
prototyping. What would happen if some, or even all, of these were to
increase?
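The exponential-vs-hyperbolic distinction can be made concrete with a
small sketch (my own illustration, not from the original exchange; the
closed-form solutions are standard calculus). An exponential process
dx/dt = k*x stays finite for all time, while a hyperbolic process
dx/dt = k*x^2 diverges at the finite time t* = 1/(k*x0), a literal
singularity in the mathematical sense:

```python
import math

def exponential(x0, k, t):
    """Closed-form solution of dx/dt = k*x: finite for every t."""
    return x0 * math.exp(k * t)

def hyperbolic(x0, k, t):
    """Closed-form solution of dx/dt = k*x**2: blows up at t = 1/(k*x0)."""
    return x0 / (1.0 - k * x0 * t)

x0, k = 1.0, 0.1
t_star = 1.0 / (k * x0)  # blow-up time: 10.0 for these toy parameters
for t in (0.0, 5.0, 9.0, 9.9, 9.99):
    print(f"t={t:5.2f}  exp={exponential(x0, k, t):10.3f}"
          f"  hyp={hyperbolic(x0, k, t):10.3f}")
```

With these (invented) parameters the exponential curve has barely
reached e^1 by t = 9.99, while the hyperbolic one has passed 999 on its
way to infinity at t = 10: almost all of the growth is crammed into the
final instants, which is exactly Clark's "more progress in the last
year than all previous centuries" picture.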
Consider prototyping: were we able to manipulate and optimize
emulations of objects instead of their physical counterparts, then, as
computing power and modeling accuracy increase, prototyping would
eventually cost only a fraction of the resources, temporal or
otherwise. Look around: this is already happening.

Consider increased communication bandwidth. While the current infobahn
is but a dim image of what the future will bring, increased
communication bandwidth _and_ quality have already vastly increased the
productivity of the scientific community. Even such a trivial tool as
Lotus Notes noticeably increases the productivity and nimbleness of a
typical company. Consider the production of complex objects such as an
Airbus or a modern weapon system: without electronic crutches their
construction would be impossible. The information explosion is already
occurring: seen the flash? Wait for the boom to arrive.

But the biggest potential bomb is artificial superhuman intelligence. I
know, this has been heralded as imminent for some time now. It will
take some decades, maybe even a century. Though truly superhuman
machines will need molecular circuitry as a substrate, they don't need
strong nanotech in the guise of the Drexlerian assembler. Engineered
proteins for the matrix and molecular switches for the computation are
sufficient to construct a human equivalent in roughly the volume of a
fist. Its operation speed would be at least one order of magnitude
higher, and its atomic complexity virtually unlimited. Intelligence is
an evolutionarily very recent invention; it has not been optimized very
thoroughly yet: I can play ball but have some trouble with quantum
physics. Intelligence improving intelligence is a positive feedback
loop.

Consider autoreplicators. No, not the Drexlerian ones: the proposed
100 t von Neumann probe. Currently we are used to looking after
machines, to directing and repairing them.
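The feedback loop of intelligence improving intelligence can be
sketched as a toy model (entirely my own illustration; every parameter
here is invented): suppose each design cycle raises intelligence by a
fixed fraction, and smarter designers finish each cycle faster. Then
the time per cycle shrinks geometrically, and the total elapsed time
converges to a finite limit even as intelligence diverges:

```python
def self_improvement_steps(i0=1.0, gain=0.5, base_cycle=1.0, steps=20):
    """Return a list of (elapsed_time, intelligence) after each cycle.

    Cycle durations base_cycle / i form a geometric series, so the
    elapsed time converges to (base_cycle / i0) * (1 + gain) / gain
    while intelligence grows without bound.
    """
    i, t, history = i0, 0.0, []
    for _ in range(steps):
        t += base_cycle / i   # smarter designers finish the cycle faster
        i *= (1.0 + gain)     # each cycle improves the designer itself
        history.append((t, i))
    return history

for elapsed, intel in self_improvement_steps()[::5]:
    print(f"t={elapsed:6.3f}  intelligence={intel:10.1f}")
```

With i0 = 1, gain = 0.5, and base_cycle = 1, the series limit is
(1 + 0.5) / 0.5 = 3.0: unbounded intelligence is reached before three
units of time have passed. The toy model, of course, only shows that
the premise "each improvement accelerates the next" entails
finite-time explosion; whether real intelligence works that way is the
open question.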
Now what would happen if the technology were to grow and evolve
unattended? Like grass, like a forest, but _much_ faster?

What we have here is a whole web of mutually feedback-coupled nodes
with synergistic effects. I dunno, it looks like an explosive brew.

-- Eugene

P.S. Merely increasing worldwide network bandwidth and computer (albeit
not von-Neumann) power is sufficient to detonate the singularity in the
long run. Both are currently growing. Scared?

> john K Clark