
Semantics vs. Information (Was: Mind and computation)


Joseph O'Rourke

Mar 19, 1991, 7:42:37 PM
I was a bit shocked, in recent exchanges on syntax and semantics,
by the discrepancy between my person-in-the-street's naive understanding
of the meaning of "meaning" and the professional opinions held by
netters more schooled in philosophy than I am. May I pursue this
nearly independently of previous discussions?
What I would like to know is whether there is a sense in which
symbols in a language may be said to have a "meaning" without the
reference to minds that David Tate insists upon. I think of symbols
as a signal,
a communication which conveys information. But to whom? The most
sublime poem conveys nothing to a rock. And a Russian novel conveys
nothing to me because I do not read Russian. But is it not possible
to postulate a "typical" human literate in the language in which the
signal is composed? And then to say that a signal "has meaning" --
the meaning it would convey to this typical mind? Suppose we assume
the signals are received by a human who knows English, some math,
and various programming languages; a human like me (and like most of
sci.tech.philosophy readers).
Now consider four sequences of symbols, echoing my earlier
example:

1. 00110000001011100011000000110000001100000011000000110000...
[several thousand bits deleted]

2.
0.000000
0.001000
0.002000
0.003000
...
[992 numbers deleted]
0.839303
0.839846
0.840389
0.840930

3.
for( i=0; i<1000; i++ )
printf( "%lf\n", sin( .001 * i ) );

4. The sine function evaluated between 0 and 1 at 1000 equal intervals.


It seems to me there is a clear progression from (1) to (4), from non-
informative to informative. (This is what I meant by when I said that
there is a continuum between syntax and semantics, a position summarily
rejected by David Tate.) Note there is also a substantial reduction
in the length of the strings in characters, from 64000 (1) to 8000 (2) to
58 (3) to 40 (4) [it is quite possible that (3) & (4) would interchange
places with another example]. It is true that all are syntax at
some level, just sequences of symbols. But each is expressed in a
different language: bits, numbers, C, English. If you (as a person
literate in English and C) were presented with (1) as a puzzle, you
would be understandably proud of yourself if you "explained" (1)
by (4). There seems to be some sense in which you have discovered
the "true" meaning of the original 64000 symbols in signal (1):
you now have 40 symbols (in English) that convincingly "explain"
the 64000 symbols.
Even the step from (1) to (2) is a significant advance,
a reduction by 8. You might have hypothesized that when you
converted the bits according to the ASCII convention, you would
have ended up with random characters. But you ended up with
decimal numbers between 0 and 1, each with a zero to the left of
a period and six decimals to the right of the decimal point,
followed by a newline character. Surely this can't be a
coincidence you would say to yourself!
But, not content to rest upon an eight-fold reduction, you
try to "explain" the sequence of numbers: you notice its monotonicity,
and in a flash of insight you guess that the numbers were generated
by the program represented as signal (3), and you quickly describe
this program to yourself as signal (4). [Note the similarity between
this process of decoding and the everyday task of a scientist.]
As David Tate implied earlier, of course you do not know that
the 1000 numbers (if you believe the bits represent numbers) are
not merely evaluations of a polynomial of degree 1000. There is
no way to rule out this possibility. Nor is there any way to rule
out the possibility that the 64000-bit signal (1) is a communication
from Alpha Centauri in an Alpha Standard Code for Information Interchange
which you do not understand, representing instructions for building a
faster-than-light spacecraft. But anyone over 10 years old will
realize that such coincidences are so wildly improbable as to be
not worth considering (except in philosophical debates?).
Were I confined to the equivalent of the Chinese Room for several
decades, forced to mindlessly process the 64000 bits of signal (1), and
I happened upon signal (4) as an explanation, I would be as certain as
I am that Spring follows Winter that I have understood the "meaning"
of signal (1). Am I correct in understanding David Tate, John Collier,
and (perhaps?) Jeff Dalton to be saying that I have merely translated
the original syntax of 64000 characters into another syntax of 40
characters, that signal (4) is not "simpler" than signal (1), that
I am deluding myself, that I have not discerned its semantics, its meaning
at all?
