X-Message-Number: 8012
Date:  Fri, 04 Apr 97 20:50:02 
From: Mike Perry <>
Subject: Re: #7999

Perry Metzger (#7999) wrote:
>
>> From: Mike Perry <>
>>  
>> I have made a confusing statement. A growing automaton 
>> is not an FSM--but behaves like one over a finite interval
>> of time.
>
>ALL automata behave like Finite State Machines if you bound them in
>time. A Turing Machine can only reach a finite number of slots on 
>its tape in a finite number of steps, you know.
>
>The statement is therefore "nearly" tautological.

Yes, but my previous commentator brought up the issue,
so I felt I had to respond.

>BTW, computer scientists tend to model large systems as TMs and not 
>as FSMs because although the systems are almost always in fact FSMs, 
>the analytic tools used for TMs are far more easily applied to the 
>sort of systems we typically use.
>
I like TMs better than FSMs too. However, the systems I am mainly
aiming at (in the book I'm writing) are natural systems such as 
human beings. Tipler likes to use an FSM model for these; so far
I'm sticking by that.

>> (Finite sequences are always computable,
>
>Nope. Not by a long shot. See Rice's theorem. Any non-trivial 
>property of an element of an R.E. language is non-computable.
>The answer to the halting problem is a finite sequence. It is not,
>however, computable.

I see I am misunderstood here. I was intending to make a fairly 
trivial statement. If we have a sequence of N "objects" all of
which, say, are finite strings of symbols over a fixed alphabet,
a computer program can be written to "compute" any one of
them on command, using a simple table-lookup. It sounds as if
what you're saying is that a bounded
function (from integers to integers say) need not be computable.
No argument there.
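To spell out the trivial sense intended, here is a sketch of my own (the table contents are placeholders): any fixed, finite sequence of strings is computable, because the entire sequence can simply be baked into a lookup table.

```python
# Illustration only: a *fixed* finite sequence of strings over a fixed
# alphabet is trivially computable -- store the whole thing in a table.
TABLE = ["foo", "bar", "baz"]  # the N "objects": finite strings

def compute(n):
    """Return the n-th object (0-indexed) by simple table lookup."""
    return TABLE[n]

print(compute(1))  # -> bar
```

This says nothing about *functions* in general, which is why a bounded function from integers to integers can still fail to be computable.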

>> Overall, it seems that reality 
>> can be described as an interconnection of
>> FSMs--infinitely many of them. I would call this "digital"--so our
>> existence could be described as digital, even if the FSM
>> model only applies locally. 
>
>Interconnected FSMs are simply FSMs.

An infinite interconnection would not (in general) be an FSM, since
its combined state set need not be finite.
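The finite case is the standard product construction; a small sketch of mine (toy machines, names hypothetical) shows why it breaks down in the infinite case: the combined state set is the Cartesian product of the components' state sets, which stays finite only for finitely many components.

```python
# Sketch (not from the thread): two FSMs running in lockstep form one FSM
# whose states are pairs of component states -- the product construction.
# With infinitely many components the state set is infinite, so the result
# is no longer a *finite*-state machine.

def product(delta_a, delta_b):
    """Combine two transition functions into one over paired states."""
    return lambda state, sym: (delta_a[state[0], sym], delta_b[state[1], sym])

# Toy FSMs over alphabet {0, 1}: A tracks the parity of 1s seen,
# B simply remembers the last symbol read.
delta_a = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
delta_b = {(0, 0): 0, (0, 1): 1, (1, 0): 0, (1, 1): 1}

step = product(delta_a, delta_b)
state = (0, 0)
for sym in [1, 1, 0, 1]:
    state = step(state, sym)
print(state)  # three 1s seen (odd parity), last symbol 1 -> (1, 1)
```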

>> Thomas questions whether mathematics would be universal 
>> to "advanced" aliens, and uses this to call into question my 
>> idea of a universal language.
>
>More interestingly, I question whether or not without any notion of
>the semantics of a language it is possible for a remote and alien
>civilization to reconstruct the semantics. In particular, it is 
>fairly easy to imagine sending a sequence that says "5 + 5 = 10" and get
>across the notion of addition. Trying to get across enough symbols 
>to convey the notion 'this symbol in our code means a Cauchy Integral"
>would be quite a challenge, however -- I can't conceive of how to
>build up to it without a predefined shared "human" language. Trying 
>to establish the notions of most human languages like "please do X" or
>what have you in a "universal" way is beyond me. I'm not saying it
>can't be done, but the entire discussion seems extraordinarily
>speculative.

In a previous posting (#7987) I proposed Lisp as a universal 
language. I imagine that curious aliens, who we assume have computers,
understand computable functions, etc., would eventually discover a
polynomial-time algorithm to convert Lisp programs (encoded as
a stream of bytes in ASCII format, say) into some
programming language they understood. True, that is a brute-force
argument which assumes nothing about the aliens beyond their
possession of computer technology and reasonable mathematical
knowledge to go with it (and ongoing curiosity, of course). If 
we could assume they had vision, we could encode movies, etc.
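The transmission format suggested above can be sketched as follows (the sample program and the encoding details are my own assumptions, not part of the proposal): a Lisp program is just a finite string of symbols, so it can be sent as a stream of ASCII byte values and reconstructed exactly on the other end.

```python
# Sketch: a Lisp program encoded as a stream of ASCII byte values,
# as suggested above.  The program text here is a hypothetical example.
program = "(defun square (x) (* x x))"

stream = [ord(c) for c in program]         # the byte stream to transmit
decoded = "".join(chr(b) for b in stream)  # what the receiver reconstructs

print(stream[:6])           # -> [40, 100, 101, 102, 117, 110]
print(decoded == program)   # -> True
```

The receiver still faces the harder problem of attaching semantics to the reconstructed text; the byte encoding only guarantees the symbols arrive intact.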

>> One trend in our own civilization is that mathematics is a 
>> cumulative enterprise--branches once developed do not 
>> lose relevance or interest over time,
>
>I see you haven't been involved with many professional mathematics
>types.

Not very recently, anyway.

>Lots of fields go "dead". Often they only had a couple of people
>interested in the first place.

But going "dead" is not the same as losing the information for good.
It's there to be picked up again (and is carefully maintained
for that purpose), and arguably eventually will be. I'll grant there are
cases where people amuse themselves with various
researches, then throw away the results and forget about them. (I had
a professor like that once.) It's hard to imagine this happening, 
though, with such things as integer arithmetic or computable functions,
unless our whole civilization goes kaput.


Mike Perry

http://www.alcor.org
