
A little Pop history


Jonathan Laventhol
Nov 15, 1992, 10:52:54 AM

Hi --

Yep, of course, Brian was right and saw my bug. Well, here's
the fixings.

define recursivemember(item, list);
    var item, list, element;
    if list = [] then
        return(false)
    endif;

    for el in list do
        if el = item then
            return(true)
        elseif islist(el) and recursivemember(item, el) then
            return(true);
        endif
    endfor;

    return(false);
enddefine;

This illustrates another facet of the language: that boolean
expression evaluation is defined to be ordered (left to right), as is
argument evaluation.
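
For instance (a minimal sketch, assuming the usual Pop-11 short-circuiting
of "and"; the variable is only for illustration):

vars x = false;
x and hd(x) =>      ;;; hd is never applied to false, so no mishap
** <false>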

Sorry/thanks.
J.


b...@anarres.CS.Berkeley.EDU (Brian Harvey) writes:

>j...@deshaw.com (Jonathan Laventhol) writes:
[...]
>|     for el in list do
>|         if el = item then
>|             return(true)
>|         elseif islist(el) then
>|             return(recursivemember(item, el));
>|         endif
>|     endfor;
[...]
>Can this be right? It looks to me as if it'll say that X isn't
>a recursivemember of [[A B C] [X Y Z]] because it isn't in the
>first sublist.

Brian Harvey
Nov 15, 1992, 4:20:26 PM

Of course, these examples are entirely equivalent to what could be done
in Lisp. I think you should post examples using some of the more unusual
POP features, such as pattern matching.

Steve Knight
Nov 17, 1992, 12:52:42 PM

Pop is very closely related to the Lisp family of languages, so it should
come as no surprise that many programs look similar. The main reasons
for choosing to write in Pop rather than Lisp are aesthetic ones, in
my view. For example, the lack of distinction between the empty list
(nil) and the false value (false) in almost all Lisps is, to the eyes of
many Poppers like myself, clunky and ugly. But in practical programming
such differences are not a big deal.

The key difference between Lisp and Pop's semantics (putting the
important issue of syntax aside for one moment) is the open stack.
After all, the Pop pattern-matcher is a nice idea, incorrectly
implemented, but trivial to duplicate in extensible languages such as
Lisp or Prolog. Having the open stack at the heart of all programming
idioms gives the language family its distinctive character.

One idiom will illustrate my point nicely. The task is (say) to
add all the numbers in a list. One of the built-in routines is
called "applist" which takes a list and a procedure as arguments.
It then applies the procedure to each element of the list in turn.
We can add all the numbers in the list as follows:

applist( 0, LIST, nonop + ) ;;; nonop disables infix-ness of +

Oh? Did I say applist takes 2 arguments? Well, I did but don't forget
that Pop uses an open stack, so there's no notion of argument checking.
All the arguments go on the stack and come off again. This example works
because the iteration starts with 0 on the stack, then element 1 gets
pushed, then the top two items get added and replaced by the single result,
and so on and so on. At the end of the iteration, all that remains on
top of the stack is the answer.
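
For concreteness, a sketch of the same idiom on a small literal list (the
printed result should just be 0+1+2+3+4):

vars total;
applist( 0, [1 2 3 4], nonop + ) -> total;
total =>
** 10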

Of course, the equivalent functional programming idiom (e.g. in Lisp)
would be to define a slightly different higher-order operator from
"applist", which might be called "fold".

define fold( sofar, list, op );
    if null( list ) then
        sofar
    else
        fold( op( sofar, hd( list ) ), tl( list ), op )
    endif
enddefine;
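
And a sketch of how the fold version would be called for the same job as
the applist example above:

fold( 0, [1 2 3 4], nonop + ) =>
** 10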

That's always true. You can always do it in Lisp (or whatever) but
you just do it differently. Here's another nice use of the open stack.
Suppose we want to flatten a tree (represented as a list of lists)
into (say) a list. What's the idiomatic way of writing that? The answer
is to use the stack as a temporary collection zone.

define flatten_to_stack( tree );
    if islist( tree ) then
        applist( tree, flatten_to_stack )
    else
        tree
    endif
enddefine;

define flatten( tree );
    [% flatten_to_stack( tree ) %]
enddefine;

The virtue of writing flatten this way is that, for example, it might
turn out that we didn't want a list but we wanted a vector (1D array).
Well that was easy ...

define flatten( tree );
    {% flatten_to_stack( tree ) %}
enddefine;

Changed my mind. I wanted to convert the vector into a string. Fine ...

define flatten( tree );
    consstring(#| flatten_to_stack( tree ) |#)
enddefine;

Note the use of count brackets "#|" and "|#" in the above example. These
simply count the difference in the length of the stack before and after the
expression between them, and push that count. This makes sense when you know
that consstring expects a count of the number of characters on the stack as
its "topmost" argument.
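
A small sketch of the count brackets on their own, and of consstring
consuming that count (backquotes give character codes in Pop-11; the printed
output is from memory, so treat it as indicative):

#| `c`, `a`, `t` |# =>
** 99 97 116 3

consstring(#| `c`, `a`, `t` |#) =>
** cat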

This is probably far too long a post already but I could go on about the
open stack for another couple of hours. It is the most wonderful thing
about Pop and defines the language family. It is never the case that
it makes you able to do things that can't be done in Lisp-like
languages but it does make you want to say them rather differently!

Steve

Aaron Sloman
Nov 15, 1992, 8:50:01 PM

b...@anarres.CS.Berkeley.EDU (Brian Harvey) writes:

> Of course, these examples are entirely equivalent to what could be done
> in Lisp.

Agreed, or in Prolog

> I think you should post examples using some of the more unusual
> POP features, such as pattern matching.

I am tempted. Hope it's not too long.

The Pop-11 pattern matchers (there are two now) are unlike Prolog
unification in that they go only one way: you can have variables only on
the right, so you can't match two variables and have a later binding of
one of them propagate to the other. However the Pop-11 matchers compensate
for this (some would say more than compensate, others would say less than
compensate) by allowing segment variables in the middle of a list, plus
restriction procedures.

Here's an example of the use of forevery, which uses the
(original) Pop-11 matcher:

vars x, y, z;

vars properties =
    [[g is h] [e is f] [d is e] [b is c] [a is b] [h is i]];

forevery [[?x is ?y][?y is ?z]] in properties do
    ;;; extend the list of properties
    [[^x is ^z] ^^properties] -> properties;
endforevery;

properties ==>
** [[a is c]
    [d is f]
    [g is i]
    [g is h]
    [e is f]
    [d is e]
    [b is c]
    [a is b]
    [h is i]]


Using the matcher you can define a very simple parser. In what
follows, ?<var> matches a single item in a list, and ??<var> matches
an arbitrarily long segment of a list. If the <var> is followed
by a colon, then the next item is taken to refer to a restriction
predicate, as in ?word:noun. If a match is successful and the
restriction predicate returns a non-boolean result, then that
result is bound to the variable.
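
For example, a minimal sketch of the plain ? and ?? forms, before any
restriction procedures are used:

vars first, rest;
[the cat sat on the mat] matches [?first ??rest] =>
** <true>
first =>
** the
rest =>
** [cat sat on the mat]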

;;; first define recognisers for lexical categories
define recognise(word, category, list) -> result;
    lvars word, category, list, result;
    if member(word, list) then
        [^category ^word] -> result;
    else
        false -> result
    endif;
enddefine;

;;; define particular recognisers by partially applying recognise

define noun =
    recognise(%"noun", [cat dog mouse elephant man mat tree owl]%)
enddefine;

define det =
    ;;; recogniser for determiners
    recognise(%"det", [each all every a the some]%)
enddefine;

define adj =
    recognise(%"adj", [happy hungry big rich yellow striped]%)
enddefine;

define verb =
    recognise(%"verb", [ate struck chased liked stroked tickled]%)
enddefine;
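
As a quick (hedged) check of the partial applications: the frozen category
and word list are supplied automatically when a recogniser is called, so one
would expect

noun("cat") =>
** [noun cat]
noun("table") =>
** <false>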

;;; a utility to check that all elements of a list satisfy a predicate
define all_are(list, pred) -> result;
    lvars list, pred, result = false, item;
    for item in list do
        returnunless(pred(item))
    endfor;
    true -> result;
enddefine;

;;; now recogniser/parsers for non-terminals

define np(list) -> result;
    ;;; parse noun phrases
    lvars list, result;

    vars word1, word2, word3, list1;    ;;; pattern variables

    if list matches [?word1:det ?word2:noun] then
        [np ^word1 ^word2]
    elseif list matches [?word1:det ?word2:adj ?word3:noun] then
        [np ^word1 ^word2 ^word3]
    elseif
        list matches [?word1:det ??list1: ^(all_are(%adj%)) ?word2:noun]
    then
        [np ^word1 [adjs ^list1] ^word2]
    else
        false
    endif -> result
enddefine;

define vp(list) -> result;
    ;;; parse verb phrases
    lvars list, result;
    vars word, list2;
    if list matches [?word:verb ??list2:np] then
        [vp ^word ^list2]
    else
        false
    endif -> result;
enddefine;

define sentence(list) -> result;
    lvars list, result;
    vars list1, list2;
    if list matches [??list1:np ??list2:vp] then
        [sentence ^list1 ^list2]
    else
        false
    endif -> result
enddefine;

;;; now for a bit of magic

np([the dog]) =>
** [np [det the] [noun dog]]

np([the big dog]) =>
** [np [det the] [adj big] [noun dog]]

np([each hungry yellow cat]) =>
** [np [det each] [adjs [hungry yellow]] [noun cat]]

vp([chased the cat]) =>
** [vp [verb chased] [np [det the] [noun cat]]]

sentence([the dog chased the rich yellow cat]) ==>
** [sentence [np [det the] [noun dog]]
             [vp [verb chased]
                 [np [det the] [adjs [rich yellow]] [noun cat]]]]

sentence([every rich happy elephant tickled some
          big striped yellow mouse]) ==>
** [sentence [np [det every] [adjs [rich happy]] [noun elephant]]
             [vp [verb tickled]
                 [np [det some]
                     [adjs [big striped yellow]]
                     [noun mouse]]]]

;;; or with a more pictorial display
lib showtree

showtree(sentence([the big cat liked the striped mat]));

                    /--------\
                    |sentence|
                    \---+----/
             /----------+----------\
            /+-\                  /+-\
            |np|                  |vp|
            \+-/                  \+-/
      /------+------\       /------+------\
    /-+-\  /-+-\  /-+--\  /-+--\         /+-\
    |det|  |adj|  |noun|  |verb|         |np|
    \-+-/  \-+-/  \-+--/  \-+--/         \+-/
      |      |      |       |      /------+------\
      |      |      |       |    /-+-\  /-+-\  /-+--\
     the    big    cat    liked  |det|  |adj|  |noun|
                                 \-+-/  \-+-/  \-+--/
                                   |      |      |
                                   |      |      |
                                  the  striped  mat

Actually it looks nicer on a terminal with graphics, but that's
not for a news posting!

Of course, there are better ways to write parsers, but this
illustrates the matcher.

(The parsing example was based on one of Poplog Pop-11's online
"teach" files, most of which contain examples that can be run at the
terminal.)

Aaron
--
Aaron Sloman, School of Computer Science,
The University of Birmingham, B15 2TT, England
EMAIL A.Sl...@cs.bham.ac.uk OR A.Sl...@bham.ac.uk
Phone: +44-(0)21-414-3711 Fax: +44-(0)21-414-4281

Luc Beaudoin
Nov 16, 1992, 11:23:54 AM

While we're on the subject of the history of Pop languages, I'd like
to say that, in my humble opinion, it was a mistake to call Poplog's
Pop, Pop-11; for if a different company produces a programming
system with a language called Pop-12, there is a great danger that
potential clients will tend to assume that it's better because it is
apparently newer.

Oh well, what's done is done.

Luc
--
-------------------------------------- -----------------------------------
Luc Beaudoin | School of Computer Science
E-mail: l...@cs.bham.ac.uk | University of Birmingham
voice: +44 (21) 414-4766 or 3743 | Birmingham B15 2TT, UK

Aaron Sloman
Nov 14, 1992, 6:20:34 PM

j...@deshaw.com (Jonathan Laventhol) writes:

Hello Jonathan

> A little history lesson for those on the net.
>
> There was a Pop1 but I don't know anything about it.
>
> Pop2 was a programming language developed at the University of
> Edinburgh (which is in SCOTLAND).
>
> The 'POP' of Pop11, Poplog et cetera, is a Robin Popplestone,
> who, last I knew, was working in a university in Boston. I
> never asked him why it's not called Burst11, but there you go.
> The best original paper is 'The Design Philosophy of Pop2' (if
> I remember the title correctly), by Popplestone.
>
> There was no Pop3 to Pop10, to the best of my knowledge.

Pop10 was implemented for the PDP10 in Edinburgh by Julian
Davies and was used there and in various other places including
Toronto I think, for several years in the mid 70s.

> A Pop implementation at the University of Sussex (which is in
> ENGLAND) was called Pop11 after the PDP-11 it ran on.

It was implemented by Steve Hardy, now somewhere in California. He
called it Pop-11 because the PDP10 version had been called Pop10.

> A better snippet:


> define recursivemember(item, list);
>     var item, list, element;
>     if list = [] then
>         return(false)
>     endif;
>
>     for el in list do
>         if el = item then
>             return(true)
>         elseif islist(el) then
>             return(recursivemember(item, el));
>         endif
>     endfor;
>
>     return(false);
> enddefine;

OOPS! You've been away from it a bit too long.

define recursivemember(item, list);
    lvars item, list, el;

    for el in list do
        if el = item then
            return(true)
        elseif islist(el) and recursivemember(item, el) then
            return(true)
        endif
    endfor;

    return(false);
enddefine;

recursivemember("cat", [[the dog][a silly [black cat]] mouse]) =>
** <true>
recursivemember("mouse", [[the dog][a silly [black cat]] mouse]) =>
** <true>
recursivemember("dog", [[the dog][a silly [black cat]] mouse]) =>
** <true>
recursivemember("house", [[the dog][a silly [black cat]] mouse]) =>
** <false>

Pop2, Pop10, ALPHAPOP (the Mac version), the PDP11 version of Pop-11
and early Poplog Pop-11 up to about 1984(?) had only dynamically
scoped variables ("vars"), like early Lisp systems. From about 1984
Poplog Pop-11 had both dynamic scoping and lexical scoping of
variables ("lvars").

Luc Beaudoin
Nov 14, 1992, 5:37:57 PM

b...@anarres.CS.Berkeley.EDU (Brian Harvey) writes:

> Can this be right? It looks to me as if it'll say that X isn't
> a recursivemember of [[A B C] [X Y Z]] because it isn't in the
> first sublist.

Yep.

Steve Knight
Nov 18, 1992, 7:29:21 AM

Nice to hear from Jonathan again. So as a greeting I'll stick the boot
in on his implementation of recursivemember :-)

Jonathan implements recursivemember like this:


> define recursivemember(item, list);
>     var item, list, element;
>     if list = [] then
>         return(false)
>     endif;
>     for el in list do
>         if el = item then
>             return(true)
>         elseif islist(el) and recursivemember(item, el) then
>             return(true);
>         endif
>     endfor;
>     return(false);
> enddefine;

I still don't much like this as a solution. For example, you can never
find [] in a tree -- but you can find all other lists. We need to take
a systematic view of the meaning of lists representing trees. We can
either adopt the view that the trees are built up from "dotted pairs", to
borrow Lisp-speak, or that the lists simply represent linear
collections.

The first view leads to this solution. But look how horribly [] gets
treated again.

define recmem( x, tree );
    x = tree or
        tree.islist and not( tree.null ) and
        ( recmem( x, tree.hd ) or recmem( x, tree.tl ) )
enddefine;

Ikk. The more typical Pop-style solution is to treat the lists as being
linear collections rather than binary trees. This gives us the following
solution

define recmem( x, tree );

    define recmemlist( x, list );
        not( list.null ) and
        ( recmem( x, list.hd ) or recmemlist( x, list.tl ) )
    enddefine;

    x = tree or tree.islist and recmemlist( x, tree )
enddefine;

This is obviously an inefficient solution -- it chews up stack space
like there is no tomorrow, because no implementation of Pop does
tail-call optimisation. So you need to modify the solution like this

define recmem( x, tree );

    define recmemlist( x, list );
        not( list.null ) and
        ( recmem( x, list.hd ) or chain( x, list.tl, recmemlist ) )
    enddefine;

    x = tree or tree.islist and chain( x, tree, recmemlist )
enddefine;

The call to "chain" forces the currently active procedure to exit and
invoke the procedure on top of the stack. In other words this is a manual
implementation of tail-call optimisation.
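
A tiny sketch of chain in isolation (the procedure name is invented for
illustration):

define add_then_exit(x, y);
    chain(x, y, nonop + )   ;;; exit add_then_exit, then run + on x and y
enddefine;

add_then_exit(2, 3) =>
** 5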

As an alternative, you might choose to expand the recursion into a
loop. But that's a detail and there would be only a minor
performance gain. And you would use up more stack space. So take
your pick. Here's the above routine converted to use a loop.

define recmem( x, tree );
    x = tree or
    tree.islist and
    lblock                          ;;; begin new lexical block.
        lvars result = false;       ;;; tree might be null.
        until tree.null do
            ;;; Nice Pop idiom. "dest" returns head and tail of a
            ;;; non-empty list. So E.dest -> E is an expression that
            ;;; steps E down the list and leaves the head on the stack.
            recmem( x, tree.dest -> tree ) -> result;
            quitif( result );       ;;; shortcut when we find true.
        enduntil;
        result                      ;;; push the result.
    endlblock
enddefine;

Since this version was created from the recursive version, it retains all
the behaviour we want. In particular, it treats [] as a legitimate search
target and doesn't make it a special case. One criticism, however, would
be that the algorithm doesn't immediately generalise to a representation
of trees as vectors (1D arrays).

To remedy that, we can introduce the appropriate for loop:

define recmem( x, tree );
    x = tree or
    tree.isvector and
    lblock
        lvars i, result = false;
        for i in_vector tree do
            ;;; Another cute idiom. ->> is an assignment that duplicates
            ;;; the value on the stack before the assignment.
            quitif( recmem( x, i ) ->> result )
        endfor;
        result
    endlblock
enddefine;

This version can be trivially modified to cope with any data type
that provides a loop or iteration procedure. It has no special treatment
for any element. It is efficient in both space and time, although not
quite optimal in either. And a mere 11 lines of non-commented code.
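
A quick, hedged check of the expected behaviour (word equality makes the
quoted word match the element buried in the nested vectors):

recmem("cat", {the {silly {black cat}} mouse}) =>
** <true>
recmem("house", {the {silly {black cat}} mouse}) =>
** <false>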

Yow!

Steve

Steve Knight
Nov 19, 1992, 1:47:39 PM

Luc says (hi Luc!):

> While we're on the subject of the history of Pop languages, I'd like
> to say that, in my humble opinion, it was a mistake to call Poplog's
> Pop, Pop-11; for if a different company produces a programming
> system with a language called Pop-12, there is a great danger that
> potential clients will tend to assume that it's better because it is
> apparently newer.

Nah ... surely they'll just think it will only run on the PDP-12 ?

:-)

Steve

Ray Dunn
Nov 26, 1992, 12:45:54 AM

This has turned into a long posting, for which I apologize, but hopefully
some will be interested in the reminiscences of an old hacker and the early
history.

In refd article, a...@cs.bham.ac.uk (Aaron Sloman) writes:
>Pop10 was implemented for the PDP10 in Edinburgh by Julian
>Davies and was used there and in various other places including
>Toronto I think, for several years in the mid 70s.

I have no recollection of Pop10, but POP-2 (please notice the capitalisation
and the hyphen) for the PDP10 (later known as the DECsystem 10) was
implemented by myself and Malcolm Atkinson (now of Cambridge) in 1970 or
1971.

Didn't Julian implement an early POP-2-based Prolog?

Malcolm worked from Cambridge and I worked from Edinburgh into the PDP10 of
TimeSharing Ltd in London, via 10cps model 33 teletypes in our homes - is
this the earliest case of teleworking in the UK? (:-).

After 3 months of design and heavy coding (and enormous telephone bills) we
met one Easter Weekend in London and compiled the system for the first
time. By the end of the weekend we had 2+2 => working!

The initial implementation of POP-2 was "Multi-POP", in the Department of
Machine Intelligence and Perception in Edinburgh University. It was done
on an Elliott Automation 4120 (later ICL 4130) by Robin Popplestone, Dave
Pullin and myself, and was an eight user multi-access system based totally
on POP-2 (which was the control language of the system also).

This came live in 1967 (possibly late 1966). It's now said that we later
realized that we were the first online multi-access system to come live in
the UK, several weeks ahead of the Cambridge system, but I've no proof of
that nor do I remember when the Cambridge system went on-line!

When you consider the research output of the DMIP in the late sixties and
early seventies, it is difficult to believe that all the work was done on a
machine with 64K 24-bit words, 2 microsecond add time, through model 33
10cps teletypes.

A PDP7 with a model 340 (?) vector graphics display and later a Honeywell
316 were linked to the 4130 through a "high-speed" data link (into the ATU
of the 4130 for devotees).

The PDP7 was used by John Oldfield and his group for early CAD system
research, and the H316 was the control processor for FREDDY, the hand-eye
robot system that formed the focus of much of the department's research.
We would have used a PDP11 instead of the 316, but DEC announced that
machine a matter of days after we'd ordered the Honeywell (shame - the order
was not retractable)! I could probably still key in the boot code on the
front-panel hand-keys of the 316 if I had to!

After the technical success on the 4100, POP-2 was implemented on the following
systems in addition to the PDP10:

ICL System-4 by John Barnes,
ICL1900 under GEORGE3 by John Scott & Frazer Dingwall (now with Digital),
CDC6000 under SCOPE by Ken McCallum (Strathclyde) and Lawrence Shafe,
MODULAR-1 again by Frazer Dingwall,

and (believe it or not) also on the

IBM 360/370 under TSO by Rod Steel
(so much for the complaints about the Intel architecture for POP
implementations!)

The foundation of these implementations was a machine-independent compiler
written in POP-2 itself.

Much of this later compiler work was funded by Conversational Software Ltd,
a spin-off company from the Department under the Chairmanship of Donald
Michie.

I was the Technical Manager of CSL, and the Managing Director was Peter
Burt who is probably the only person connected to POP-2 to achieve
perpetual fame - he is now the Treasurer and Chief General Manager of the
Bank of Scotland - his is the signature on the Scottish pound notes and
fivers etc!!

Robert Rae (now with AIAI in Edinburgh) was one of many who later slaved at
enhancing the 4130 and DECsystem 10 implementations.

There were two main POP-2 reference manuals published in addition to
articles in the Machine Intelligence series:

The "Blue Book": "POP-2 Papers" by R.M. Burstall, J.S. Collins and R.J.
Popplestone, Edinburgh University Press, 1968.

and

The "Silver Book": "Programming in POP-2" by R.M. Burstall, J.S. Collins
and R.J. Popplestone, Edinburgh University Press, 1971.


As Rod Burstall and Robin Popplestone wrote in "Programming in POP-2":

"POP-2 is a new computer language. Conceptual affinities can be traced to:

1. John McCarthy's LISP (1962), from which it takes ideas for handling
non-numerical objects of computation (lists).

2. Christopher Strachey's CPL (1963) and Peter Landin's ISWIM (1966),
which foreshadow the aim of making a programming language into a notation
with full mathematical generality, akin to algebra.

3. Cliff Shaw's JOSS (1964), which it resembles in its 'conversational'
facilities.

[ Note: Those who baptized this group with its first irrelevant flame
war over the lack of a BASIC newsgroup, should be interested to know
that JOSS is also the grand-daddy of BASIC ]

4. Robin Popplestone's POP-1 (1968) of which POP-2 represents a rationalized
and greatly extended development.

[ Note: 1968 is the date of publication of Robin's paper on POP-1 in
Machine Intelligence 2. POP-1 and its precursor COWSEL had been around
for several years previously. The first POP-2 reference manual
appeared in the same book, and POP-2 had already been running for at
least a year - was Robin stuck for a paper to present at the
conference? (:-)

"Machine Intelligence 2", edited by E. Dale and D. Michie, Oliver & Boyd,
Edinburgh, 1968 ]

"These ingredients have produced a powerful but compact language for
non-numerical programming. POP-2 was designed for implementation on a
medium-sized machine with a modest investment in system programming.

[ Ha - it was easy for THEM to say that! (:-) ].

Because the language had to be stripped down to the level of the basic
mathematical principles of programming, it is unrestrictive and
open-ended.

The main distinctive features of POP-2 are:

1. The syntax is very simple but the programmer has some freedom to extend it.

2. The programmer can create a wide variety of data structures: words,
arrays, strings, lists, and records. A 'garbage collector'
automatically controls storage for him.

3. Functions can be used in the same manner as in mathematics or logic, for
example, as arguments or results of other functions, with no unfortunate
restrictions on free variables.

4. The novel device of 'partial application' allows one to fix the value of
one or more parameters to the function. This has a surprising multiplicity
of uses, for example, to generalize the notion of an array to non-numerical
subscripts and to disguise the distinction between stored values and
computed values.

5. Another technique, 'dynamic lists' enables a physical device like a
paper-tape reader to be treated as if it were an ordinary list.

6. The programmer can call for immediate execution of statements at any time,
giving facilities for conversational use and rapid debugging of complex
programs.

7. The facility for immediate execution together with the variety of data
structures available makes POP-2 suitable for use as the control language
of a time-sharing system, enabling the user to effect filing, editing,
compilation and execution.

8. In the context of the widespread shortage of system programmers, a crucial
feature is the open-endedness of the language. Work normally done in
machine code by highly skilled system programmers can be done in POP-2
itself.

9. POP-2 is compact and easy to implement. On the ICL 4100, for example, the
whole system for compiling, running, and time sharing occupies only 22K of
core (24-bit words). The effort needed to construct the complete system
was less than 5 man-years. A machine independent POP-2 in POP-2 compiler
has been written.

Unquote.

Later, in the reference manual section, the main features of the language
are detailed with their associated predecessors:

"The following main features are provided. Roughly analogous features of
some other programming languages are mentioned in brackets as a guide:

(a) Variables (cf. ALGOL but no types at compile time)
(b) Constants (cf. ALGOL numeric string constants, LISP atoms and list
constants)
(c) Expressions and statements (cf. ALGOL)
(d) Assignment (cf. ALGOL, also CPL left-hand functions)
(e) Conditionals, jumps and labels (cf. ALGOL)
(f) Functions (cf. ALGOL procedures but no call by name, cf. CPL and ISWIM
for full manipulation of functions)
(g) Arrays (cf. ALGOL; also CPL for full manipulation of arrays)
(h) Records (cf. COBOL, PL/1, Wirth-Hoare ALGOL records, CPL nodes)
(i) Words (cf LISP atoms)
(j) Lists (cf. LISP, IPL-V)
(k) Macros
(l) Use of compiler during running (cf. LISP, TRAC)
(m) Immediate execution (cf. JOSS, TRAC).

Unquote

It's interesting to note that Rod Burstall didn't initially believe that
the open stack should be a concept that the user was overtly aware of (that
clearly goes against the principle that the language is a mathematical
notation), and although he mentions the stack when he has to, no emphasis
on the stack is given in either of the manuals.

I had a very different view, and explained the stack mechanism in some
detail in the user manuals I wrote for the Multi-POP and PDP10 versions.

Finally, I think it's appropriate to repeat the acknowledgement that
appears in the introduction to Programming In POP-2, so that the remainder
of the early pioneers' names are written into this record:

"The suggestions and criticism of many members of the Department of Machine
Intelligence and Perception at Edinburgh University, especially Mr Ray Dunn,
and of others outside it, especially Mr Michael Healy and Mr Michael
Woodger and those who have implemented POP-2 systems, are gratefully
acknowledged. Technical consultations with Dr Michael Foster and Dr David
Park about storage control were invaluable. Thanks are due to Mrs Pat
Ambler who has helped in the editing of this book, to Mr Bruce Anderson who
has provided answers to the exercises in the primer and to Miss Eleanor
Kerse who typed the new manuscript through numerous drafts.

"POP-2 has been implemented on the ICL 4100 by Messrs Robin Popplestone,
Ray Dunn, and David Pullin, on the ICL 1900 by Mr John Scott, and on the
ICL System-4 and IBM 360 by Mr John Barnes and Mr Rod Steel (using a
machine independent version of the compiler written in POP-2). The library
of programs has been built up by many hands and organized by Mr Ray Dunn
and Mr Robert Owen. The work has been part of a machine intelligence
project, directed by Professor Donald Michie, whose guidance and
encouragement have been invaluable, and supported by Edinburgh University,
the Science Research Council, and the Medical Research Council.

"We would like to thank Edinburgh University Press for their painstaking
work on a very technical book, especially Dr Helen Muirhead, whose care and
patience never failed us.

R. M. Burstall (Editor)

Unquote
--
Ray Dunn at home | Beaconsfield, Quebec | Phone: (514) 630 3749
r...@philmtl.philips.ca | r...@cam.org | uunet!sobeco!philmtl!ray

Jack Campin
Nov 26, 1992, 10:08:26 AM

r...@philmtl.philips.ca (Ray Dunn) wrote:
> I have no recollection of Pop10, but POP-2 (please notice the capitalisation
> and the hyphen) for the PDP10 (later known as the DECsystem 10) was
> implemented by myself and Malcolm Atkinson (now of Cambridge) in 1970 or
> 1971.

Malcolm's a professor in this department, and was in Edinburgh for a while
before that; hasn't worked at Cambridge since 1980-ish. His current
research is persistent programming languages, persistent-store-based system
architectures, OODBs, that sort of thing.

Which suggests a question. Is there a Pop with full-strength orthogonal
persistence? Is anybody working on one? Is there anything about Pop that
makes it impossible?

--
-- Jack Campin room G092, Computing Science Department, Glasgow University,
17 Lilybank Gardens, Glasgow G12 8RZ, Scotland TEL: 041 339 8855 x6854 (work)
INTERNET: ja...@dcs.glasgow.ac.uk or via nsfnet-relay.ac.uk FAX: 041 330 4913
BANG!net: via mcsun and uknet BITNET: via UKACRL UUCP: ja...@glasgow.uucp

Aaron Sloman
Nov 26, 1992, 6:24:46 PM

r...@philmtl.philips.ca (Ray Dunn) writes:

> ....


> I have no recollection of Pop10, but POP-2 (please notice the capitalisation
> and the hyphen) for the PDP10 (later known as the DECsystem 10) was
> implemented by myself and Malcolm Atkinson ...

I think Julian Davies implemented POP-10 (I can't recall whether
there was or wasn't a hyphen) more or less from scratch using PDP-10
assembler because he did not wish to have any obligations to the
other version, owned then by Conversational Software Ltd. He also
had some preferences of his own, including having the Boyer-Moore
"ped" editor included in the system (i.e. not a library) also
implemented in assembler.

He made some other changes, but I can't recall what they were,
except that I seem to recall he allowed upper and lower case
characters and provided support for typing them in on upper case
only terminals, by toggling case conversion using a control
character.

I think he also extended the maximum identifier size so that it was
effectively unlimited. Previously it had been six or eight
characters, I think.

Sussex people and some Edinburgh people used Julian's POP-10 for
several years before Robert Rae's WPOP (WonderPOP) became available
with additional facilities, including processes and hashed property
tables, plus typed arithmetic variables with associated compile time
optimisation.

Aaron

Aaron Sloman
Nov 26, 1992, 6:48:28 PM

ja...@dcs.glasgow.ac.uk (Jack Campin) writes:

> ....Is there a Pop with full-strength orthogonal
> persistence? Is anybody working on one? Is there anything about Pop that
> makes it impossible?

I don't know precisely what you mean by "orthogonal persistence",
but I'll guess that you mean being able to suspend a program and
then restart it on another machine or in another version of the
operating system or a later version of the Pop compiler system.

There are saved images in Poplog Pop-11, including layered saved
images (you can create a saved image relative to another saved
image) and shareable saved images (i.e. containing non-writeable
code and data that can be shared, in the same portion of memory,
between different users.)

But orthogonal persistence in the sense defined above is probably
very difficult, or impossible, at least for versions like Poplog
Pop-11 for the following reasons:

(a) Procedures are first class objects that can occur wherever any
other data-type represented by a pointer can occur, and procedures
are fully compiled. Thus user procedures are records in the heap
containing machine instructions. This means that saving the heap
will produce something that cannot be expected to work on another
machine. If the implementation used only some intermediate code and
interpreted it, this objection might go away, though the cost would
be a loss in speed of execution. Alternatively a VERY clever system
could somehow translate the compiled procedures into a machine
independent notation, or could keep some intermediate representation
of the compiled procedures created during compilation, and use that
in the persistent version. The cost would be time required for doing
the translation and possibly the extra space for storing both
versions of each procedure.

(b) System procedures are no different from user procedures and they
too are objects that can be stored in user datastructures on the
heap. So datastructures in the heap can contain pointers into system
procedure records containing machine instructions. Apart from the
problem mentioned in (a) this raises the further problem that
different versions of Poplog Pop-11 may have the same system
procedures located at different places in (virtual) memory. Thus a
saved image created using one version, if restored relative to a
newer version of Poplog, even on the same machine, may contain
pointers to the wrong locations in the system. This could be
overcome by replacing all such references with symbols
indicating what they point to before a saved image is made, and then
replacing the symbols when the system starts up. I can think of
other possibilities, all with efficiency costs.

(c) Other problems arise out of the access that Poplog Pop-11 gives
you to system facilities, e.g. pipes in Unix and mailboxes in VMS,
that are not available in all operating systems. E.g. it takes
advantage of MMAP for shareable saved images on some Unix systems,
but does something different when that's not available.

I don't think any of this means that full persistence is totally
impossible. The tradeoffs may not make it worthwhile, though.

There are various partial solutions in the form of libraries that
allow datastructures to be saved on disk in a file and then restored
later. The files use a format that's independent of machine or
operating system or version of Pop-11. But the data-structures then
can't include procedures, although they can contain source code
instructions for creating procedures. They could also include lists,
strings, words, arrays, records, vectors, properties, numbers, etc.

LIB DATAFILE, originally implemented by David Hogg (now a professor
at Leeds), provides such a mechanism. I believe Jonathan Meyer, at
Integral Solutions Ltd, has produced a more general version.
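
If I remember the interface correctly (treat the exact signature as an
assumption and check HELP DATAFILE), it is used roughly like this:

lib datafile;

;;; assignment saves the structure to a file; access reads it back
[[a 1] [b 2] 'a string' {a vector}] -> datafile('mydata');
datafile('mydata') =>
** [[a 1] [b 2] a string {a vector}]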

Apologies if my answer is irrelevant because I misunderstood the
question.

Aaron
---

Ian Rogers
Nov 27, 1992, 7:19:03 AM

a...@cs.bham.ac.uk (Aaron Sloman) writes:

> ja...@dcs.glasgow.ac.uk (Jack Campin) writes:
> > ....Is there a Pop with full-strength orthogonal
> > persistence? Is anybody working on one? Is there anything about Pop that
> > makes it impossible?
>
> I don't know precisely what you mean by "orthogonal persistence",
> but I'll guess that you mean being able to suspend a program and
> then restart it on another machine or in another version of the
> operating system or a later version of the Pop compiler system.


Keylink [1] have implemented an SQL interface between Poplog Prolog
and Ingres, Oracle, etc. This gives you limited persistence
(amongst other things) for Prolog programs. Of course, it's easy
peasy to call Prolog from Pop11 :)

You can read about it in the Plug92 Proceedings which is, err, umm,
coming Real Soon Now.

Ian.

[1] Keylink Computers Ltd. 0926 50909

Charles Lasner
Nov 27, 1992, 9:56:03 AM

In article <1992Nov27....@syma.sussex.ac.uk> an...@syma.sussex.ac.uk (Andrew Holyer) writes:
>
>A PDP7 with a model 340 (?) vector graphics display and later a Honeywell
>316 were linked to the 4130 through a "high-speed" data link (into the ATU
>of the 4130 for devotees).

PDP7/9 would use 339, PDP-8 uses 338, and PDP-10 uses 340. All quite similar
but bus-specific.

cjl

Rich Alderson
Nov 30, 1992, 7:36:02 PM

In article <1992Nov27....@syma.sussex.ac.uk>, andyh@syma (Andrew Holyer) writes:
>After the technical success on the 4100, POP-2 was implemented on the following
>systems in addition to the PDP10:
>
> ICL System-4 by John Barnes,
> ICL1900 under GEORGE3 by John Scott & Frazer Dingwall (now with Digital),
> CDC6000 under SCOPE by Ken McCallum (Strathclyde) and Lawrence Shafe,
> MODULAR-1 again by Frazer Dingwall,
>
>and (believe it or not) also on the
>
> IBM 360/370 under TSO by Rod Steel
> (so much for the complaints about the Intel architecture for POP
> implementations!)

What has Intel's architecture to do with the 360 architecture?
--
Rich Alderson 'I wish life was not so short,' he thought. 'Languages take
such a time, and so do all the things one wants to know about.'
--J. R. R. Tolkien,
alde...@leland.stanford.edu _The Lost Road_

Stefek Zaba
Dec 2, 1992, 9:52:18 AM

In comp.lang.pop, alde...@elaine46.Stanford.EDU (Rich Alderson) writes:

> What has Intel's architecture to do with the 360 architecture?

Oh dear. Try "segments", or, "absence of a linearly addressable memory".
> --
> Rich Alderson
>
> alde...@leland.stanford.edu

So who's teaching Computer Architectures 101 this year? Must be slipping :-)

Ray Dunn
Dec 1, 1992, 8:07:17 PM

In refd article, alde...@elaine46.Stanford.EDU (Rich Alderson) writes:
[incorrectly attributing the posting]:

>>After the technical success on the 4100, POP-2 was implemented on the following
>>systems in addition to the PDP10:
>> .......

>>and (believe it or not) also on the
>>
>> IBM 360/370 under TSO by Rod Steel
>> (so much for the complaints about the Intel architecture for POP
>> implementations!)
>
>What has Intel's architecture to do with the 360 architecture?

Nothing. I meant to imply that although there have been statements about
the difficulty of porting POP to Intel 80x86 based systems, POP was
implemented on the IBM 360, an architecture arguably much more difficult
to implement POP on.

cg...@btma74.nohost.nodomain
Dec 4, 1992, 11:43:17 AM

In article <1992Dec1.0...@leland.Stanford.EDU>, alde...@elaine46.Stanford.EDU (Rich Alderson) writes:
|>What has Intel's architecture to do with the 360 architecture?

Nothing. The IBM 360 developed into a monster, a nightmarish beast with an
ever-expanding instruction set and umpteen different ways of addressing
memory beyond the tiny quantity envisaged by the original designers. Getting
anything to run on it required in-depth knowledge of the internals of the OS.

None of these mistakes were repeated in IBM's Intel-based PCs...

%^)

Ray Dunn
Dec 6, 1992, 12:35:27 AM

I wrote:
>>What has Intel's architecture to do with the 360 architecture?
>
>Nothing. I meant to imply that although there have been statements about
>the difficulty of porting POP to Intel 80x86 based systems, POP was
>implemented on the IBM 360, an architecture arguably much more difficult
>to implement POP on.

By "nothing" I meant that there was no direct relationship between Intel
and IBM that caused the 80x86 family to be designed as it was.

As others have pointed out, both architectures have limited segment sizes
(at least prior to the 80386). I also seem to remember there was a problem
with implementing the POP-2 stack on the 360.

POP should really be quite easy to implement on the 80386 and later Intel
processors. On these machines the segments can be as big as the machine's
address space, and the segment register architecture can be used to great
advantage to provide separate address spaces and to simplify relocation of
code.

Has anyone written a compiler in 'C' or 'C++'?

What about a 'POP' machine - has there been any work to build some hardware
assistance? On a PC, there are several things that could be provided on
an option card - memory width expansion for type tags, for example.
