I'll start with the last questions first: You ask what he meant by a
function and that is not so easy to answer. Mathematically a function is
an entity that maps some values to other values, where the input values
are always mapped to the same output values. In the world of software we
don't mean that at all. Take a similar question: What's a structure? The
answer is that it's the thing defined by your language's primitive with
a name like DEFSTRUCT. Similarly a function, subroutine, etc., is that
thing defined by the primitive your language provides to define such
things. Does a function define a synchronous path? Depends on the
language in which it is defined.
I don't know what it means for two numbers to always exist together so I
couldn't determine if they were synchronous.
It's been (quite) a while since I read this thesis but I might be able
to add some to what you have got out of it so far. What was said about
multiple interpreters was the following: In order to interpret a control
structure, the interpreter must "do" the control structure. Let's take
an example: PARALLEL(x, y, z), where x, y, and z are program pieces. In
order to really get the effect of parallel execution, the interpreter
must, in general, start Ix, Iy, and Iz; three interpreter routines, one
to interpret x, one to interpret y, and one to interpret z and they must
execute in parallel. And the same thing needs to happen when each of the
other control primitives is encountered. (There are, of course,
optimizations such as subsuming a sequential element that appears within
another sequential, etc.) One may think that you could simulate PARALLEL by some
sort of interleaving on sequential hardware but when you mix in other
control relations the interpreter can't be faithful to the implied
semantics.
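As a minimal sketch (my own illustration in Python, not anything from the thesis), an interpreter that really "does" PARALLEL(x, y, z) starts one interpreter routine per piece; here the pieces are stand-in callables:

```python
import threading

def interpret(piece):
    # A stand-in interpreter routine; a "piece" here is just a
    # Python callable representing a piece of program text.
    piece()

def PARALLEL(*pieces):
    # Start one interpreter routine per piece -- Ix, Iy, Iz in the
    # example above -- let them run concurrently, then wait for all.
    interpreters = [threading.Thread(target=interpret, args=(p,))
                    for p in pieces]
    for t in interpreters:
        t.start()
    for t in interpreters:
        t.join()

results = []
lock = threading.Lock()

def piece(name):
    def run():
        with lock:
            results.append(name)
    return run

PARALLEL(piece("x"), piece("y"), piece("z"))
# All three pieces ran; their interleaving is not fixed in advance.
```

On sequential hardware this is only an interleaved simulation, which is exactly the point of the paragraph above: once other control relations are mixed in, such a simulation stops being faithful.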
The synchronization issue is that, for example, a monitor must
instantaneously spot that its condition has been satisfied so that a
declared reaction will occur. This is a hell of a burden on any
interpretation scheme. Let's look at an example: Let the variable X be a
sixteen-bit integer; let the variable H be the high-order 8 bits of X
and L be the low-order 8 bits of X. Assume that there is a monitor on
the value of X; then that monitor must actively take a peek when either
H or L is modified. Similarly, if there is a monitor on either H or L,
it must take a peek any time X is modified. This example may seem quite
artificial but it isn't. Consider interrupt structures of your favorite
computer. Bits are flipped in registers and interpreted as signals to
and by the OS. Describing and simulating such capabilities as they
actually work is quite difficult.
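A toy sketch of the X/H/L example (mine, in Python; nothing here is from the thesis): the three names alias the same sixteen bits, so a write through any one of them must wake the monitors watching the overlapping names.

```python
class Register16:
    # X is a sixteen-bit integer; H and L alias its high and low
    # bytes. A monitor is a (condition, reaction) pair on one name.
    def __init__(self):
        self._x = 0
        self._monitors = {"X": [], "H": [], "L": []}

    def monitor(self, name, condition, reaction):
        self._monitors[name].append((condition, reaction))

    def _check(self, names):
        # Take a peek on behalf of every monitor whose watched
        # name may have changed.
        for name in names:
            value = getattr(self, name)
            for condition, reaction in self._monitors[name]:
                if condition(value):
                    reaction(value)

    @property
    def X(self): return self._x
    @X.setter
    def X(self, v):
        self._x = v & 0xFFFF
        self._check(("X", "H", "L"))   # writing X changes H and L too

    @property
    def H(self): return (self._x >> 8) & 0xFF
    @H.setter
    def H(self, v):
        self._x = ((v & 0xFF) << 8) | (self._x & 0xFF)
        self._check(("X", "H"))        # writing H also changes X

    @property
    def L(self): return self._x & 0xFF
    @L.setter
    def L(self, v):
        self._x = (self._x & 0xFF00) | (v & 0xFF)
        self._check(("X", "L"))        # writing L also changes X

r = Register16()
fired = []
r.monitor("X", lambda x: x >= 0x0100, lambda x: fired.append(x))
r.L = 0x42   # X is now 0x0042; condition not yet satisfied
r.H = 0x01   # X is now 0x0142; the monitor on X must spot this
```

Note that even this toy version has to hard-wire the aliasing into every setter; doing it instantaneously and in general, as real interrupt hardware effectively does, is the burden described above.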
For a moment, set aside the issue of interpreting programs written in
the control structure language and consider using the language to write
a detailed spec for a modern CPU with multiple cores and multiple
threads per core. You want to specify what range of behaviors is
allowed. If you think about this for a while, I believe that you will
appreciate why the dissertation seems so convoluted. It's too bad that a
second dissertation on the same topic did not follow and clarify all of
these issues.
At one point in the 1970s, I wanted to abstract the control flow and
data flow within a speech understanding system so I invented a language
called CSL (Control Structure Language) in which modules did not know
about each other. Data communication was over a set of software buses
(think pipes) and common data stores. CSL provided the primitives to
move data from module to module, enforce sequential execution among
threads, express an "I don't care what order they run in" relation
(pseudo parallel), and declare condition monitors. There were some tokens pushed around to simulate
control signals, etc., something like Petri nets. The point of this
exercise was to put together a problem solver that did not commit to
order-of-computation constraints when there was no reason to do so. As we
learned more, we could modify the CSL to exhibit more directed behavior.
By the way, the pseudo parallel directive assigned random numbers
dynamically to parallel threads as priorities so that running the system
on the same data multiple times could exhibit multiple behaviors and
generate different answers.
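A sketch of that pseudo-parallel directive (my reconstruction in Python, not actual CSL): each run draws fresh random priorities, so repeated runs over the same data can execute the threads, and so produce answers, in different orders.

```python
import random

def pseudo_parallel(threads, seed=None):
    # Assign a random priority to each "I don't care what order
    # they run in" thread, then run them in priority order. A new
    # seed (or none) on each run yields potentially new behavior.
    rng = random.Random(seed)
    prioritized = sorted(threads, key=lambda t: rng.random())
    return [thread() for thread in prioritized]

tasks = [lambda: "x", lambda: "y", lambda: "z"]
run1 = pseudo_parallel(tasks, seed=1)
run2 = pseudo_parallel(tasks, seed=2)
# Both runs execute all three tasks, possibly in different orders.
```

This is the simplest possible reading: priorities picked once per run, execution actually sequential. The real CSL assigned priorities dynamically, but the effect on reproducibility is the same.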
I don't necessarily recommend reading another obscure paper (on CSL) but
if you are interested, a pdf copy is at
https://notatt.com/large-systems.pdf
--
Jeff Barnett