Volume 1, Number 1
UNIVAC in Pittsburgh 1953-1963
by C.J. "Sam" Huston
After the end of World War II, I went to work for Remington Rand in
its punched card tabulating division. This equipment was manufactured in
Ilion, New York, and I went there for training. This was funded with the
G.I. Bill, but we had to pay our own room and board. The training center
was run with military precision by a former Army colonel: each Friday
there was a test, and the students who didn't score high enough flunked
out. When I completed the training, I went to Pittsburgh, where there
were over thirty Remington Rand tabulating customers, who used the
equipment for various business functions. One project I worked on was the
dividend accounting for Wheeling Steel Company.
I left Remington Rand and worked at two other companies for several
years. At the second one, I was in charge of the punched card accounting
section, which used Remington Rand machines. When the first UNIVACs came
out in 1951 and 1952, I decided that I wanted to work with computers. I
returned to Remington Rand in 1953 and in 1956 became the computer SE
manager for the Pittsburgh region, a position I held until 1963.
By the late 1950s there were four major UNIVAC customers in
Pittsburgh: U.S. Steel, Westinghouse Electric, Alcoa, and Pittsburgh Plate
Glass. U.S. Steel had a UNIVAC I on the 14th floor of a downtown office
building. In the spring of 1959 it was replaced by two UNIVAC IIs in a
converted warehouse outside of downtown. U.S. Steel used the UNIVACs for
accounting, statistical, and engineering tasks. I spent a lot of time at
U.S. Steel, programming in Short Code, which was just a step above machine
language. Toward the end of 1958, I was part of a team which converted
U.S. Steel's pension fund from an IBM 650 to the UNIVAC I. We used the
new FLOW-MATIC programming language, which was being developed by Grace
Hopper's group in Philadelphia. Since the UNIVAC was busy most of the day
running production work, the only time we could get for our testing was
one or two twenty-minute shots between 2 A.M. and 6 A.M. This made life
difficult, because whenever we encountered a problem, we had to wait until
9 A.M. to call the FLOW-MATIC group and then stay around, hoping they
would call back with a fix. The conversion was a success, and other
applications were developed using the new language. I believe that we
were the first commercial user of FLOW-MATIC.
Remington Rand decided to centralize its UNIVAC sales and technical
support resources in New York City. This created problems for those of us
out in the field. To get formal training, we had to go to New York or the
factory in Philadelphia, or else catch one of the classes offered at a new
customer's site. I went to the UNIVAC I programming class which was
taught for Alcoa at its New Kensington plant near Pittsburgh. We wished
that training was more readily available. In 1959 our shortage of skilled
people was eased somewhat when the company hired two ex-Air Force men.
They had a good bit of UNIVAC I programming experience, but very little
concept of how businesses used computers. It was also hard to get current
technical information. All too often, customer employees returned from
the national users group meetings with new information which was unknown
to me or the other local Remington Rand staff.
The UNIVAC I and II were almost fully program compatible: just two or
three instructions had new features on the II. After the pension fund
conversion, we started using FLOW-MATIC and COBOL (when it became
available) for some of our programs. The UNIVAC III was Remington Rand's
large scale transistorized computer, and U.S. Steel bought the first one,
which was delivered in the fall of 1962. The III was not program
compatible with the I and II, but U.S. Steel chose it on the basis of its
capabilities and a feeling that conversion to it would be easier than
switching to another vendor. The COBOL programs were moved without too
much trouble and the Short Code programs were rewritten. At U.S. Steel
the III replaced both IIs. Ten years later, UNIVAC had no third
generation successor product for the III. The 9200s and 9300s were not
program compatible, and they were smaller machines. U.S. Steel replaced
the III with IBM computers.
I did not have much direct involvement in the UNIVAC III
installation, because I was concentrating on the new UNIVAC 490s at U.S.
Steel and Westinghouse. The real-time and communications capabilities of
the 490 opened up new uses for computers. U.S. Steel used the 490 to
schedule the operations of its National Tube Division mill. This had
formerly been done by a cumbersome system using tabulating card machines
and manual tub files. At Westinghouse, the 490 ran a centralized
accounting system for all of its various divisions. I taught a class on
the 490 to people from Westinghouse's accounting section. The 490 greatly
reduced the delays in their month-end accounting process.
In the days of the vacuum tube and transistor computers, machines
were categorized as being scientific or business. Yet, the 490 at
Westinghouse was used for a "business" function. This illustrated the
flexibility of the computer: it could be used for just about anything
when the programmers put their minds to it. I encountered two more
examples of this flexibility with customers who used the UNIVAC 120. The
120 was a small computer, not that much above a card tabulator, with the
programming done entirely by plugboard. It had twelve 10-digit words of
vacuum tube memory. The U.S. Bureau of Mines experiment station in
Pittsburgh had a 120 and did all sorts of things with it. Besides the
expected payroll and inventory, they used it for statistical and
mathematical work, including curve fitting and solutions of simultaneous
equations. The other place was Wheeling Steel Company, which had two
120s, primarily for accounting. One of the engineers also used them to
work out theoretical problems in metallurgy, some of which were left over
from his graduate school days: he couldn't solve them back then, because
the computations were far too lengthy.
The Solid State computer was one of UNIVAC's successful products of
the late 1950s. It came in two versions: one handled Remington Rand's 90
column card, while the other used IBM's 80 column card. I thought that
Remington Rand management had gone too long trying to pretend that the 80
column IBM card wasn't dominant, so I was glad to see the Solid State 80.
We were still having problems getting timely training: just two weeks
after I went to a class on the Solid State 90, I had to teach one on the
SS 80 at Bethlehem Steel. The class was huge: 42 students--and some of
them had never even seen a punched card or imagined a magnetic tape. I
persuaded the Bethlehem managers to withdraw the novices and get the class
down to a manageable size.
In 1963 I took advantage of a chance to go to work for Westinghouse
Electric. At first, I dealt mainly with their Burroughs computers.
Westinghouse became the largest commercial Burroughs user in the world,
with seventeen installed, primarily 2500s and 3500s, but including 5500s
in Pittsburgh and Sunnyvale, California. A few years later, Burroughs was
very late on an important delivery for a new Westinghouse plant in
Virginia, delaying its opening. After that, Westinghouse stopped getting
Burroughs machines. The B5500 in the Pittsburgh research laboratory was
replaced by a UNIVAC 1108, and another 1108 was installed at the plant in
Baltimore. The 1100s never really caught on in Pittsburgh. The 1108 at
Westinghouse and one at Carnegie Mellon University (which didn't last very
long), both installed in the late 1960s, were the first ones. There
weren't many more.
I had a little more involvement with Westinghouse's 490, namely
trying to find a buyer for it after we had upgraded to a 492 or a 494. I
couldn't find anyone, and UNIVAC said it would charge $25,000 to remove
the 490. Finally, one of the CEs bought it for $1.00 on the condition
that he didn't have to remove it right away. He dismantled it in his
spare time and made over $30,000 selling it as repair parts to other
UNIVAC sites.
UNIVAC had a strong start in Pittsburgh with the three UNIVAC I
installations, but the company was not very good at supporting us with
technical help and program products. The slowness in bringing out newer
computer models in the late 1950s and early 1960s hurt a lot, too. Too
often, we just couldn't capitalize on the opportunities we had. UNIVAC I,
II, and III, the Solid State, and the 490 were all fine machines. Of
course today's desktop computers are more powerful than any of them, but
we had to learn to walk before we could run. It has been an amazing
change from the days of punched card tabulators.
NOTE FROM THE EDITOR
It is easy to see now that this newsletter should have been started
ten or twenty years ago, but no one did. There's nothing to gain by
waiting any longer. It is over forty years since the first UNIVAC I was
accepted by the Census Bureau (March 1951) and close to forty years from
the installation of the first Burroughs UDEC at Wayne State University
(December 1953). Many of the people who worked with these early computers
are retired. We are running out of time to capture more of the history of
that era. This newsletter is by no means limited to the 1950s; more
recent days need attention too: the thirtieth anniversaries of the
UNIVAC 1107 and the B5000 are fast approaching. People who worked on
them, the 1108 and B5500, and other computers of the 1960s should also be
telling their stories.
To be sure, there is already historical work going on. The Charles
Babbage Institute at the University of Minnesota is building a collection
of oral history interviews. For over ten years there has been a scholarly
journal, the Annals of the History of Computing, which has published
historical articles. But both the oral history and journal efforts
necessarily tend to focus on the better known figures in the history of
computing. Many people may feel that what they have to tell doesn't
justify the effort of a recorded interview or a scholarly article. So,
they tell nothing, and then their experiences are lost.
This newsletter is intended to serve as a forum for the exchange of
historical information on UNIVAC, Burroughs, and Unisys computers. We
need to have information from our readers: either short narratives, such
as the one in this issue by Sam Huston, or bits of old documentation. Mr.
Huston regrets that he no longer has his UNIVAC I programming card, but
perhaps others still have one. The questions to be answered are endless.
What was it like moving from the first generation computers to the
relative sophistication of the UNIVAC 1107 and B5000 operating systems?
Why did companies get a computer which wasn't made by IBM? And, what
prompted companies to give up on UNIVAC or Burroughs and switch to
something else? If we address these sorts of questions, this newsletter
will be more than a compilation of anecdotes. When we put together these
bits and pieces of history, we can find patterns in the past and provide
ourselves with a guide for making decisions in the future.
WHAT DO WE WANT TO KNOW? (STARTING POINTS)
UNIVAC I and II: What was it like to program these early machines,
and how many sites ventured into those pioneer higher level languages such
as FLOW-MATIC and MATH-MATIC?
Burroughs 204, 205, and 220: These vacuum tube computers had a small,
but devoted, group of users during the late 1950s. What features made
them so well liked?
UNIVAC 1103 and 1105: The 1103 was the first "scientific" computer,
and its users established USE as an organization to share software. What
did they produce? Who, besides the Census Bureau, had an 1105?
UNIVAC FILE COMPUTER: Northwest Airlines tried to do reservations on
it; what applications did it run at other sites?
UNIVAC III: The III was apparently a very sophisticated machine,
incorporating much that UNIVAC learned from the LARC, but nowadays it is
hardly ever mentioned. What software did it have? Who used it? It is
said to have been overpriced; is that true?
Burroughs 5000: The B5000 was very innovative. What customers were
bold enough to buy it and program in ALGOL instead of FORTRAN?
UNIVAC 1005: The 1005 went to war: at least one was used in South
Vietnam, and another was in the Pentagon. What did the Army do with it?
UNIVAC 418: It is said that a few 418s are still in operation after
all these years. Is that true? California is still running 418 emulation
on its 1100--there's an interesting USE paper on that.
Burroughs 5500: The 5500 was a 5000 that really worked. Some 5500s
were used for years and years. How did people keep them going?
UNIVAC 1108 and 1106: For a time, they were giant killers: replacing
IBM machines. An 1108 replaced an IBM 7094 at the University of Maryland;
how did that happen?
The BC/7: Did anyone really buy one, or was the entire supply used
as SSPs for 1100s?
NU ALGOL: This fine ALGOL compiler on the 1100s came from Norway (NU
= Norwegian University). Was it ever used much in the U.S.?
This scarcely begins to address the range of topics. There is the
story of the 9000 series, ancestor of today's System 80: what was its
early development like? What about the V77: Sperry's ill-fated
acquisition from Varian? The UNIVAC 1050 is now known as one of the
machines which the Air Force replaced with 1100/60s, but why did the Air
Force get the 1050s to begin with?
UNIVAC, Unisys, and FLOW-MATIC are registered trademarks of Unisys Corporation.
Please send comments and contributions to:
1418 Mackenzie Ct.
Tucker, GA 30084
Phone: 404-656-7327 (days)
In the long run, this newsletter must have contributions from the readers.
The Unisys History newsletter will be published quarterly. To receive
the next issue, send me either $1.00 or material to be published. The
material does not have to be an article. It could be old documentation,
articles, or advertisements. I need input in order to produce more
output. The next issue will be in December.
Copyright 1992 by George Gray
Volume 1, Number 2
The UNIVAC Solid State Computer
by George Gray
The UNIVAC Solid State Computer was one of the first to use solid
state components in its central processing unit. Although the prototype
was accepted by the Air Force in 1956, Remington Rand held back from
marketing a commercial version for three years. This mistake reduced the
number sold, but, even so, the Solid State was a success, bringing in
badly needed revenue during the company's troubled times in the early
1960s. I found a notebook full of Solid State manuals, making it possible
to take a closer look at this interesting machine.
First generation computers, such as the UNIVAC I and II, used vacuum
tubes for the logic and control circuits of their CPUs. This is the
distinguishing characteristic of the first generation of computer
hardware, and the switch to transistors takes us into the second
generation. Vacuum tubes were slow, bulky, hot, and prone to failure, so
computer designers looked for alternatives. The transistor was invented
at Bell Labs in 1948, but it was several years before transistors became
fast and reliable enough to be useful for computers.
In the meantime, engineers looked at other devices, including
thyratrons, trionodes, and magnetic amplifiers. Thyratrons and trionodes
were two types of gas-filled tubes with electrical components. Neither
ever found very wide use, although the Colossus cryptanalytic computer
built in Britain during World War II did have thyratron rings as counters.
Magnetic amplifiers had been developed during the 1930s, and the Germans
used them in devices to control the firing of battleship guns. Amplifiers
were solid-state iron or ferrite core inductors whose inductance could be
varied by an electrical current.
Remington Rand and several other companies conducted research on
magnetic amplifiers in the years after World War II. The UNIVAC Solid
State Computer had its genesis in the Air Force Cambridge Research Center
(AFCRC) computer, which the Philadelphia division of Remington Rand
developed from 1952 to 1955. Aside from fifteen vacuum tubes, all the CPU
circuitry used magnetic amplifiers, which Remington Rand named FERRACTORs.
For memory, the AFCRC computer had a magnetic drum.
Several types of memory were used in first generation computers:
mercury delay lines, electrostatic tubes, magnetic drums, and magnetic
cores. The BINAC and UNIVAC I had mercury delay lines, which were slow.
The UNIVAC 1103 and the IBM 701 used electrostatic memory, essentially
cathode ray tubes, which was faster than mercury lines, but not very
reliable. Drums were reliable, but slow, due to the inherent rotational
delay. Engineering Research Associates, which was acquired by Remington
Rand in December 1951, was a leader in drum memory development. The ERA
Atlas I (or, 1101 in its commercial designation) which was delivered to
the Navy in December 1950 had a drum memory of 16,384 24-bit words. Core
memory was superior to the other three types, and, as we know, eventually
prevailed. It was initially very expensive, so that for some time the
drum continued to be the memory device for small-scale, low-cost systems.
The Remington Rand product line needed such a system. Both the
UNIVAC I from the Philadelphia division and the 1103 from St. Paul were
large systems, selling for about $1 million. IBM's drum memory 650
computer, announced in 1953, sold for $200,000 to $400,000 and was a great
success: over 1800 were sold or leased. IBM licensed the drum memory
technology from Remington Rand. Reportedly, Remington Rand settled for a
lump-sum royalty payment based on the expectation that only a few hundred
650s would sell.
The drum memory of the AFCRC computer was faster (16,500 rpm versus
12,500) and bigger (4000 words versus 2000) than that of the original
IBM 650. The AFCRC computer was completed in June 1955 and shipped to
Hanscom Air Base in Massachusetts, where it passed its acceptance test in
April 1956. The total project cost was $800,000. The Air Force used it
for various scientific computation problems, and Air Force engineers added
a large three-color display scope for plotting output.
A commercial version of the AFCRC computer was an obvious response to
the success of the IBM 650. The Philadelphia division developed an
improved version, which was the UNIVAC Solid State computer. However, the
St. Paul division had already announced its drum memory UNIVAC File
Computer in January 1955, and Remington Rand management feared that
announcement of the Solid State would hurt sales of the File Computer.
The File Computer project was troubled by serious delays. The first
delivery was not until August 1956 and that version (the File 0) could
only be programmed by plugboard. The File 1 model with internal program
capability finally came out in August 1958. The File Computers were not a
success in the marketplace: fewer than 200 were sold. In the meantime,
management had allowed sales of the Solid State in Europe, where it was
called the Universal Card Tabulating Machine (UCT). Deliveries there
started in 1958. Potential customers in the U.S. heard about the UCT and
put pressure on Remington Rand to sell it here. The company relented, and
American deliveries began in 1959.
The UNIVAC Solid State Computer came in two versions: the Solid State
80 handled IBM-style 80 column cards, while the Solid State 90 was adapted
for Remington Rand's 90 column cards. A Solid State system consisted of
the CPU and drum memory, card reader, card punch, and printer. There was
the option of adding a tape controller and up to ten UNISERVO II tape
drives. The drives could read both mylar tape and the old UNIVAC metallic
tape: the mode was selected by a switch on the front of the drive. The
CPU had twenty vacuum tubes, 700 transistors, and 3000 FERRACTOR
amplifiers.
The drum was five inches in diameter and eight inches long, with 25
bands of 200 ten-digit words, for a total capacity of 5000 words. Its
speed had been increased to 17,670 rpm. Twenty of the bands had one
read-write head each, giving a maximum access time of 3.4 milliseconds,
that is, one revolution. The remaining five bands had four equally spaced
read-write heads to reduce their maximum access time to 0.85 milliseconds.
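The quoted access times follow directly from the rotation speed. A quick sanity check, in modern Python rather than anything resembling period code:

```python
# Back-of-the-envelope check of the Solid State drum timings.
RPM = 17_670
ms_per_rev = 60_000 / RPM            # one full revolution in milliseconds
print(round(ms_per_rev, 2))          # ~3.4 ms: worst case with one head per band

# A band with four equally spaced heads waits at most a quarter revolution.
print(round(ms_per_rev / 4, 2))      # ~0.85 ms
```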
Instruction words consisted of a two-digit function code, a
four-digit operand address (referred to as m), and the four-digit address
of the next instruction (referred to as c). The address of the next
instruction was important in a drum memory environment. Since the drum
was constantly rotating, it would move some distance while each
instruction was being executed. So, to minimize the delay between
instructions, it would be best to have the next instruction positioned on
the drum at the place where the read-write head was when execution of the
current instruction was completed. As a result, instructions which
followed each other in program logic would be scattered around the drum,
not physically next to each other. The manuals for the Solid State
computer gave instruction timings, so that programmers could try to reduce
rotational delays. This approach was called minimum latency programming.
It was complicated by the need to fetch operands, so that the programmer
had to keep in mind the locations of data and of the next instruction.
Generally speaking, data items were stored in the five bands which had
four read-write heads, since data would be accessed more frequently than
instructions.
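The scheduling arithmetic behind minimum latency programming can be sketched in modern Python. The timing model here (one word passes under the head per "word time") is an illustrative assumption, not taken from the Solid State manuals:

```python
# Sketch of minimum latency placement on a drum band of 200 words.
WORDS_PER_BAND = 200

def best_next_address(p, exec_words):
    """Ideal drum address for the instruction following the one at p,
    if executing it takes exec_words word-times."""
    return (p + 1 + exec_words) % WORDS_PER_BAND

def rotational_delay(p, exec_words, next_p):
    """Word-times wasted waiting for next_p to come under the head."""
    ready_at = (p + 1 + exec_words) % WORDS_PER_BAND
    return (next_p - ready_at) % WORDS_PER_BAND

print(best_next_address(50, 3))      # 54: zero rotational delay
print(rotational_delay(50, 3, 53))   # 199: just missed, nearly a full turn
```

The second call shows why the technique mattered: placing the next instruction one word too early costs almost an entire revolution.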
The Solid State used four bits (plus a parity bit) to represent a
digit. The bits stood for five, four, two, and one. This table shows how
it was done:
5 4 2 1 value
1 1 0 0 9
1 0 1 1 8
1 0 1 0 7
1 0 0 1 6
1 0 0 0 5
0 1 0 0 4
0 0 1 1 3
0 0 1 0 2
0 0 0 1 1
0 0 0 0 0
This scheme, called biquinary coded decimal, was also used on the LARC,
the large computer which the Philadelphia division developed for the
Atomic Energy Commission during the late 1950s. Variations of coded
decimal were common on business-oriented computers during the 1950s and
early 1960s. Scientific computers, such as the UNIVAC 1103A and 1105,
operated in pure binary, with instructions for both fixed and
floating-point arithmetic, but none for decimal. The UNIVAC III, first
delivered in 1962, had instructions for binary (fixed point) and decimal
arithmetic. Third generation computers, such as the IBM 360 and the later
UNIVAC 1100s, had a full range of arithmetic instructions, ending the
division between business and scientific machines.
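The 5-4-2-1 table reduces to a divmod: the quotient of the digit by five sets the "5" bit, and the remainder (0 through 4) is plain binary in the 4-2-1 bits. A minimal sketch (the parity bit is omitted here):

```python
def to_biquinary(d):
    """Decimal digit 0-9 -> (b5, b4, b2, b1) bits, per the table above."""
    q, r = divmod(d, 5)                          # q drives the "5" bit
    return (q, (r >> 2) & 1, (r >> 1) & 1, r & 1)

def from_biquinary(bits):
    """Inverse mapping: sum the weights of the set bits."""
    b5, b4, b2, b1 = bits
    return 5 * b5 + 4 * b4 + 2 * b2 + b1

print(to_biquinary(9))                 # (1, 1, 0, 0)
print(from_biquinary((1, 0, 1, 1)))    # 8
```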
The Solid State computer had three ten-digit arithmetic registers,
referred to as A, X, and L. Their use varied according to the operation
being performed. Addition and subtraction used the A register: the value
in the operand (m) was added to or subtracted from the value in A, with
the result remaining in A. For multiplication, the value in L was
multiplied by the value in m; the ten most significant digits of the
product went into A and the ten least significant went into X. In
division, the value in m was divided by L with the quotient in A and the
remainder in X.
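A toy model makes the register conventions concrete. This uses ten-digit unsigned values and omits sign and overflow handling, which the real machine of course had to deal with:

```python
MOD = 10 ** 10   # a register holds ten decimal digits

class Registers:
    """Sketch of the Solid State's A, X, L register conventions."""
    def __init__(self):
        self.A = self.X = self.L = 0

    def add(self, m):   # A <- A + m
        self.A = (self.A + m) % MOD

    def mul(self, m):   # L * m: high ten digits to A, low ten to X
        self.A, self.X = divmod(self.L * m, MOD)

    def div(self, m):   # m / L: quotient to A, remainder to X
        self.A, self.X = divmod(m, self.L)

r = Registers()
r.L = 123456
r.mul(10 ** 6)          # product 123,456,000,000: A = 12, X = 3456000000
r.L = 7
r.div(100)              # A = 14 (quotient), X = 2 (remainder)
```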
The X register was not an index register. Index registers had just
started being used in the mid-1950s, and the AFCRC computer did not have
them. On the Solid State computer they were an option, costing an extra
$7500 (or $150 per month), and many customers chose not to spend the
money. Without index registers, the task of moving through data tables
was performed by modifying the address portion of instructions. This
technique, which seems foreign in today's climate of read-only instruction
areas, was extensively used. In fact the Solid State programming manual
said that the ability to do instruction modification was an "invaluable
feature" of the machine. For those who paid the extra money, there were
three index registers, designated B1, B2, and B3. Each could hold a
four-digit value, which was enough, since the address field of an
instruction was also just four digits. Use of an index register was
indicated by setting the sign bit and the sixth bit in the operation code
field, which was otherwise unused.
Programming was ordinarily done in machine code, although an
assembler called X-6 was developed. As an example of machine code, 25
1202 0050 meant to load the contents of address 1202 into the A register
and find the next instruction at address 0050. There were instructions
for loading into and storing from registers, arithmetic operations, ANDing
(which was called erasing) and ORing (referred to as buffing or
superimposition), right and left shift, magnitude comparison, and equality
comparison. The test for equality was operation code 82; if the contents
of A were equal to the contents of L, then the address of the next
instruction was in the m (operand) field, otherwise the address of the
next instruction was in the usual c field. In all, there were 53
instructions, including some for input-output and for translation between
XS-3 code and the Solid State's internal biquinary coded decimal.
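The fetch-execute pattern described above, where every instruction names its successor, can be sketched as a tiny interpreter. Opcodes 25 (load A) and 82 (equality test) come from the text; the halt code 00 is invented for this sketch, and the drum is simplified to a dictionary holding instruction tuples and data words:

```python
def run(drum, start, max_steps=1000):
    """Execute (op, m, c) instructions until a halt; returns A."""
    A = L = 0
    pc = start
    for _ in range(max_steps):
        op, m, c = drum[pc]
        if op == 25:                 # load: A <- contents of m, go to c
            A, pc = drum[m], c
        elif op == 82:               # equality test: if A == L, next
            pc = m if A == L else c  # instruction is at m, else at c
        elif op == 0:                # halt (invented for this sketch)
            return A
        else:
            raise ValueError(f"op {op} not implemented")
    raise RuntimeError("no halt reached")

# "25 1202 0050" from the text: load word 1202 into A, continue at 0050.
drum = {40: (25, 1202, 50), 1202: 7, 50: (0, 0, 0)}
print(run(drum, 40))                 # 7
```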
Remington Rand provided customers with subroutine libraries for
various functions to help cut down the labor of machine language
programming. The X-6 assembler was primitive by today's standards. It
allowed replacement of the numeric function code with a mnemonic,
assignment of names to instruction addresses, and limited use of data
tables, but not much more than that. Although its coded decimal
arithmetic classed the Solid State as a "business" computer,
customers in fact used it for both business and scientific computing. The
U.S. Army Chemical Corps did mathematical and statistical work, as well as
payroll and accounting. Shell Oil used a Solid State primarily for
scientific work.
A basic Solid State card system was priced at $350,000 or leased for
$7000 per month. Since the Solid State was faster and more capable than
the IBM 650, sales were brisk during 1959 and 1960. In June 1959,
Remington Rand announced that it had written an IBM 650 emulator program
to ease conversion. But the market life of the Solid State was cut short
by the announcement of the IBM 1401 in October 1959. For the same price,
it was faster than the Solid State. As the pace of sales slowed,
Remington Rand announced a somewhat improved version called STEP (Simple
Transition Electronic Processing). It was the same CPU hardware, but it
could be ordered with a smaller or larger capacity drum memory. A system
with one tape drive sold for $250,000. This was a better price than on
the Solid State, but still did not match the price-performance of the IBM
1401. Altogether, about 600 Solid State and 200 STEP computers were sold.
It is unfortunate that the Solid State was not marketed sooner, so
that more could have been sold before the arrival of the IBM 1401. It is
not clear why a more powerful follow-on, one which was fully
transistorized, was not developed so that there would have been something
to compete with the 1401. This may have been due to the concentration of
the Philadelphia division on the LARC computer, which fell way behind
schedule and swallowed up several million dollars. In the absence of a
follow-on, Solid State customers had no reason to stay with UNIVAC, since
any upgrade involved a conversion.
The late 1950s and the beginning of the 1960s was a bad time for
UNIVAC. Many computer models were late, and the company's market share
fell dramatically. It is sad to look back and see promising new products,
such as the UNIVAC I and Solid State, which were not aggressively
developed and marketed. That makes us ask how such things happen and,
more to the point, how such mistakes can be avoided in the present.
One key is in recognizing success and following it up with the
allocation of more resources for development. The recent announcements
that MAPPER will be deployed on Sun and IBM RS/6000 computers indicate
that Unisys may be taking this lesson to heart. Like the Solid State
computer in its day, MAPPER has been a modest success. Now Unisys should
push to broaden MAPPER's market share. Holding it back from other
platforms would be like the holding back of the Solid State to favor the
File Computer. If MAPPER's market share can be increased, then it need
not follow the Solid State and be one of those promising beginnings that
didn't make it.
Unisys, UNIVAC, MAPPER, UNISERVO, and FERRACTOR are registered trademarks
of Unisys Corporation.
Contributions of information are always needed. This issue could not
have been done without the binder of Solid State manuals. I would like to
do an issue on the UNIVAC III, but I don't know enough about it. Please
look in your files or bookcases for material on it or any other old UNIVAC
or Burroughs computers. For this newsletter to go on, I need to have
input. Write to
1418 Mackenzie Ct
Tucker, GA 30084
or call me at 404-656-7327 (daytime). Of course, articles such as the one
by Sam Huston in the last issue are always welcome. The next issue will
be in March.
Copyright 1992, by George Gray
Volume 1, Number 3
by George Gray
The operating system used today on the 1100 and 2200 Series is an
outgrowth of EXEC 8 which was written for the UNIVAC 1108 in the late
1960s, but EXEC 8 was not the first UNIVAC operating system. Its roots go
back to EXEC I and EXEC II, which is why in its early years EXEC 8 was
often written as EXEC VIII. I found a copy of the EXEC II Programmers
Reference for the 1106/1108 (UP-4058 Rev.1), which makes it possible to
take a closer look at the early UNIVAC operating systems.
In the beginning, computers did not have operating systems. The
computer operator set up the computer to run each program. This involved
loading the program into memory from magnetic tape, paper tape, or punched
cards, and then mounting tapes or loading cards for data input/output.
The program itself had to perform all the input/output handling: if it
expected the output tape to be on drive 2, then an output tape had better
be there. To some degree, this inflexibility could be overcome by
switches or jumper cables, but the whole process was very cumbersome, and
wasted time between programs.
The early large-scale computers, such as the UNIVAC I and 1103 and the
IBM 701 and 704 were very expensive, selling for one or two million
dollars, back in the mid-1950s, when, as we well know, a dollar was worth
much more than it is now. At most sites, there was no shortage of work to
be done, so computer time was a scarce and costly resource. This was one
of the factors which opponents of the early higher-level languages cited:
with machine time at a premium, the greater size of compiled programs,
often five to ten times bigger than an equivalent handcrafted machine
language program, was too wasteful. Of course, as the cost of machine
time fell relative to programmer time, the power of this argument
evaporated. But in the meantime, this scarcity and cost of machine time
was a driving force in the development of operating systems. Something
had to be done to reduce the time lost between programs, so that more work
could be pushed through the computer.
In 1953 at the Eastern Joint Computer Conference several users of the
IBM 701 met and discussed the need for some way to streamline the
scheduling of computer programs. This led to the first operating system,
a batch processing monitor program for the 701 written by programmers at
the General Motors Research Center in 1955. The electrostatic memory of
the 701 was not very reliable, so IBM brought out the core memory 704 as a
replacement. The monitor was rewritten for the 704 as a joint effort by
General Motors and North American Aviation in 1956. Their GM/NAA I/O
System was further developed by the SHARE user group in 1960 as the SHARE
Operating System (SOS). Finally, IBM took over support of it, and, with
the name changed to IBSYS, it was widely used on the transistor
(second-generation) IBM 7090 and 7094 computers. It was not, however, the
only operating system for this family of computers. IBM had also
developed a FORTRAN monitor system (FMS), Bell Labs wrote a monitor called
BELLMON, and the University of Michigan Executive System (UMES) was used
at many universities.
These batch processing monitors made it possible to put a series of
jobs together on an input tape. The jobs were separated by simple control
cards, such as $COMPILE MAD or $DATA, to indicate functions to be
performed or the location of data. Time on the 7090 was frequently seen
as too valuable to be taken up with card processing or printing, so quite
often these tasks were handled by some smaller, cheaper computer,
typically an IBM 1401. The input job streams and card data were written
to tape on the 1401, and then the tapes were taken to the 7090. Output
was done in reverse: the 1401 received tapes from the 7090, and did the
printing or card punching. This arrangement was a good deal for the IBM
salesmen, because they got to sell (or lease) two computers instead of one.
UNIVAC's announcement of the 1107 in December 1960 was very late. By
that time the first IBM 7090 had been in the field for a year, and over a
dozen more had been shipped since then, while the first Philco Transac
2000 and Control Data 1604 computers were installed in early 1960. The
announced delivery date for the 1107 was late 1961, but due to various
development problems, the first one was not completed until September
1962. To have any marketplace advantage at all, UNIVAC would have to
offer a superior product. The operating system software was one area in
which UNIVAC tried to do better.
UNIVAC's systems programming group in St. Paul started work on an
operating system for the 1107, which eventually became known as EXEC I.
It was intended to support true multi-programming: sharing CPU time among
several batch runs. This would permit the use of on-line utility programs
to do the card-to-tape and tape-to-print/punch work which the IBM 7090
sites were doing off-line. EXEC I was to be more sophisticated than batch
monitors, since it would schedule CPU time, allocate memory, and queue i/o
requests. Processing was interrupt-driven, using i/o completions or
clock timing to force programs to relinquish control of the CPU. The
control algorithm was simple: each program was classified as either
compute-bound or i/o-bound, and the EXEC kept a first-in first-out queue
for each class. Each program execution within a run was scheduled
independently, so the programmer had to supply cards describing the
resources (core, tape, drum, reader, or printer) that it needed. These
statements were complex, making EXEC I difficult to use. Putting together
a whole runstream took a lot of work. EXEC I was written in the assembly
language (known as SLEUTH) of the 1107; it was 25,000 lines of code, and
occupied about 8K of the 1107's 32K of memory.
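The control algorithm just described can be sketched as a rough model in modern Python (the class and run names here are invented, and since the text does not say which queue was served first, giving i/o-bound runs preference is an assumption; the real EXEC I was of course 25,000 lines of SLEUTH assembly):

```python
from collections import deque

class Run:
    """A scheduled program execution, classified when its run cards are read."""
    def __init__(self, name, io_bound):
        self.name = name
        self.io_bound = io_bound

class Dispatcher:
    """Toy model of EXEC I's control algorithm: one FIFO queue per class."""
    def __init__(self):
        self.io_queue = deque()    # i/o-bound runs
        self.cpu_queue = deque()   # compute-bound runs

    def admit(self, run):
        (self.io_queue if run.io_bound else self.cpu_queue).append(run)

    def next_run(self):
        # Called on an interrupt (i/o completion or clock tick), when the
        # current run is forced to relinquish the CPU.
        if self.io_queue:
            return self.io_queue.popleft()
        if self.cpu_queue:
            return self.cpu_queue.popleft()
        return None

d = Dispatcher()
d.admit(Run("PAYROLL", io_bound=True))
d.admit(Run("MATRIX", io_bound=False))
print(d.next_run().name)   # PAYROLL goes first under this assumed preference
```

Within each class the discipline is strictly first-in first-out, which matches the simplicity the text describes: no priorities, no aging, just two queues.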
The early 1960s were a difficult time for Sperry Rand, and the
company did not have the resources to do all of its software development
in-house. For the 1107, UNIVAC wrote EXEC I and SLEUTH, but the ALGOL,
COBOL, and FORTRAN compilers were contracted out. The contract for COBOL
went to Computer Sciences Corporation (CSC), which had already produced a
FORTRAN compiler for UNIVAC's LARC computer and was also writing much of
the software for the UNIVAC III, the UNIVAC Philadelphia division's
business computing counterpart to the 1107. Very probably there was some
kind of barter involved, since CSC received the first 1107 shipped by UNIVAC.
EXEC I wasn't ready yet, so CSC went ahead and devised its own
operating system for the 1107, which became EXEC II. CSC stretched the
terms of their contract a good bit: when the COBOL compiler was finished,
it was for an EXEC II environment, not EXEC I! CSC designed EXEC II to be
a serial batch processing system, with a multiprogramming capability
limited to the handling of card input and printer output. This still gave
it an edge over the IBM 7090 monitors, without introducing the
complexities of EXEC I. EXEC II made extensive use of the FH-880 drum for
staging card input and printed output, as well as for temporary storage of
program files read in from tape. Keep in mind that at this time disk (as
opposed to drum) technology was very new, and UNIVAC had no disks. Not
until 1970 was a disk drive, the 8414, supported on the 1100 Series. The
drums of the early 1960s did not have enough capacity to accommodate
long-term storage of program files. Therefore, programs were usually kept
on tape or card.
EXEC II was composed of resident routines which stayed in memory at
all times, and non-resident ones which were brought in from drum only when
needed. The resident routines included a minimal control card
interpreter, a jump vector for user program entries into EXEC II,
configuration tables, file directory handlers, and control routines for
drum, tape, and console. There was also a dispatcher routine to control
i/o queues. The major non-resident routines were the main control card
interpreter, job accounting, and the routines to control card and print
i/o. This last group of routines for card and print was called the
symbiont routines. In the later years of EXEC II, there were symbiont
routines for data interchange with remote or on-site UNIVAC 1004 or 9300
computers. The term symbiont referred to the symbiotic relationship
between the computer's central complex and the peripheral devices.
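A rough modern analogy for the symbiont idea, with invented names, is a spooler: user programs write print lines to the drum at drum speed and never wait on the printer, while the symbiont independently drains the drum to the slow device:

```python
from collections import deque

drum_spool = deque()   # stands in for the FH-880 drum staging area

def program_print(line):
    """A user program 'prints' at drum speed; it never waits on the printer."""
    drum_spool.append(line)

def print_symbiont(printer):
    """Shares CPU time with user programs, draining the drum to the printer."""
    while drum_spool:
        printer.append(drum_spool.popleft())

# A batch run produces its report at full speed...
for i in range(3):
    program_print("REPORT LINE %d" % i)

# ...and the symbiont later copies it to the slow printer, in order.
printed = []
print_symbiont(printed)
print(printed)
```

The same staging works in reverse for card input, which is why the text describes the central complex and the peripherals as symbiotic: each runs at its own speed, with the drum as the buffer between them.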
EXEC II was bigger than EXEC I, comprising 45,000 lines of assembler
code and occupying 12K of memory. The command language was easier to use
than that of EXEC I, and was the basis for the Executive Control Language
(ECL) used in EXEC 8. The major difference in syntax is that in EXEC II,
the options field came first (immediately after the @) instead of after
the operation field. For example, a call to the COBOL compiler to compile
input source element BING, putting the updated source output in BONG and
the relocatable object output in TICK/TOCK would be:
@SX COB BING, BONG, TICK/TOCK
The S option said to punch the source output on cards and the X to abort
the compilation if errors were detected. Incidentally, the names BING,
BONG, and TICK/TOCK are taken from an example in the EXEC II Programmer's
Reference. Many statements had essentially the same meaning in EXEC II as
they would have in EXEC 8, including:
RUN run initiation
FIN run termination
XQT program execution
MSG console message
ASG facilities assignment
PMD postmortem dump
ELT input cards into program file
HDG heading for print pages
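The control card layout shown in the @SX COB example above can be modeled as follows (the field rules here are a simplification guessed from that single example, not taken from the manual):

```python
def parse_exec2_card(card):
    """Split an EXEC II control card: '@', then the options field, then the
    operation, then comma-separated file/element fields."""
    assert card.startswith("@")
    options, _, rest = card[1:].partition(" ")
    operation, _, fields = rest.partition(" ")
    return {
        "options": set(options),   # e.g. S = punch source, X = abort on errors
        "operation": operation,    # e.g. COB for the COBOL compiler
        "fields": [f.strip() for f in fields.split(",")],
    }

card = parse_exec2_card("@SX COB BING, BONG, TICK/TOCK")
print(card["operation"], card["fields"])
```

Note how little there is to parse: this simplicity of the command language, carried forward into EXEC 8's ECL, is exactly the contrast with JCL drawn later in this newsletter.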
EXEC II provided a tape file structure called a PCF (program complex
file) for source, relocatable, and absolute program elements.
Manipulation of program elements was done through a set of routines called
CUR (complex utility routines) which was the ancestor of EXEC 8's FURPUR
processor. CUR was called via an @XQT CUR, followed by directives on
subsequent cards. A few of the directives, such as ERS, FIND, and PCH
were carried over directly into FURPUR. Other processors which came with
EXEC II were:
ALG ALGOL Compiler
ASM Assembler (SLEUTH)
COB COBOL Compiler
FOR FORTRAN Compiler
LFT LIFT: a FORTRAN II to FORTRAN IV conversion routine
The ALGOL compiler was written at Case Institute of Technology, as part of
an arrangement whereby it received an 1107 to replace its old UNIVAC I.
As a result of its late appearance, only 36 1107s were sold, but
fortunately UNIVAC was prompt in announcing and delivering its third
generation 1108 computer. It was fully compatible with the 1107, so that
all the software, including EXEC II, could be carried over to it, and EXEC
II had several years of glory on the 1108. An article titled "Conversion
at Lockheed Missiles and Space" in the January 1967 issue of Datamation
said that Lockheed found that "The FORTRAN IV compiler and EXEC II
Operating System were considerably more powerful and of better design than
their counterparts on the IBM 7094." Using an 1107 as an interim machine,
Lockheed ultimately replaced two 7094s with two 1108s.
During its later years, there were two major enhancements to EXEC II.
The first was the capability to use a remote 1004 computer as an
input-output device. This was apparently the outgrowth of work done by
UNIVAC on one of the Bogart computers it made for the National Security
Agency. The second was the incorporation of the FASTRAND drum as a mass
storage device for permanent program and data files.
While both EXEC I and EXEC II were provided for the 1108, it was
clear that the two should be merged to provide a true multi-programming
system with the ease of use and external appearance of EXEC II. This was
EXEC 8. The specifications for it were drawn up in December 1964 and work
began in May 1965.
But that is another story. EXEC I and EXEC II were significant
developments in the evolution of operating systems. Though not as
sophisticated as the Master Control Program (MCP) for the Burroughs 5000
and the Compatible Time Sharing System (CTSS) developed at MIT for the IBM
7090, the EXECs did represent an advance over the batch monitors of
their time. They demonstrated that an operating system more complex than
the IBM monitors could achieve a high level of throughput.
Unisys, UNIVAC, FASTRAND, and UNISERVO are registered trademarks of Unisys Corporation.
Contributions of information are needed to keep this newsletter
going. This issue could not have been done without the EXEC II manuals.
I would like to do issues on the UNIVAC III, the File Computer, and the
Burroughs 205 and 220, but I don't know enough about them. Please look in
your files or bookcases for material on them or any other old UNIVAC or
Burroughs computers. For this newsletter to continue, I need to have
input. Write to
1418 Mackenzie Ct
Tucker, GA 30084
or call me at 404-656-7327 (daytime). Of course, articles such as the one
on UNIVAC in Pittsburgh in the first issue are always welcome. The next
issue will be in June.
Copyright 1993, by George Gray
Volume 1, Number 4
The UNIVAC 1100 in the Early 70s
by George Gray
The history of Unisys has many periods of disasters and missed
opportunities, but if the past had been nothing but failure, Unisys would
be gone from the computer marketplace by now, following the path of
General Electric and RCA. There were prosperous times for both UNIVAC and
Burroughs: the 1108 and the B5500 were notable successes, and both
companies did well in the early 1970s. A set of UNIVAC 1100 account
profiles from this period makes it possible to take a closer look at a
time when companies and government agencies were actually switching from
IBM and other vendors to UNIVAC.
The account profiles cover thirty-five 1106 and four 1108 sites where
a new computer had been acquired in the early 70s. There were eleven
government agencies (local, state/provincial, and national), three
universities, seven utility or communications companies, a savings and
loan, and a newspaper. The other sixteen were various manufacturing and
business enterprises. The UNIVAC replaced an IBM 360 or 370 at fourteen
of these customers, Honeywell/GE computers at five, RCA at four, and
Burroughs at two. The RCA replacements arose from UNIVAC's purchase of
the RCA customer base in 1971. They are still significant, since those
four companies could well have chosen IBM to stay with that architecture,
instead of switching to the 1100. In two of these sites, the customer was
converting from a UNIVAC III to an 1106.
Various strengths of the 1100 helped make these sales. EXEC 8 was
superior to IBM's OS and DOS in several areas, including scheduling, the
ability to handle a mix of batch and demand runs, time-sharing
capabilities, and the simplicity of Executive Control Language (ECL) as
compared with JCL. Programmers who had worked only on IBM sometimes
thought they were being tricked when they were first shown an ECL
runstream: it had to be more complicated than that. The existence of two
operating systems (DOS and OS) was another disadvantage for IBM.
Customers who wanted to move up to larger models in the IBM 370 hardware
line were faced with a laborious conversion from DOS to OS, and some chose
to convert to other vendors. By this time, UNIVAC was just about finished
with its move from EXEC II to EXEC 8, and EXEC 8 was settling down to be a
stable operating system.
UNIVAC had an advantage in its multiprocessor architecture, an area
in which Burroughs was the only other serious contender. This permitted
easier, incremental hardware upgrades and was the beginning of the road
toward today's fully redundant systems. At this point, IBM was still
several years away from delivering effective multiprocessor machines.
This, combined with the scheduling flexibility of EXEC 8, meant that the
1106 outperformed IBM 370/135 and 370/145 computers in benchmarks done for
several of these customers.
Yet another area of advantage for UNIVAC was remote job entry (RJE)
capabilities. As early as 1964, the 1107 at Case Institute of Technology
in Cleveland had been linked to a 1004 at a hospital ten miles away, and
the following year another 1004 one hundred miles away was also connected.
By the end of the 1960s this capability was widely used, although 9200 and
9300 computers had displaced the 1004 as the preferred remote device. One
of these new customers tied its 1106 in Missouri to remote 9200/9300s in
Houston, Fort Worth, and Kansas, while another implemented a network of
fourteen 9200s spread across a state.
In the area of software, the availability of DMS-1100 was a factor in
twelve of these sales. While still in a very rudimentary form, it
provided greater data handling capability than IBM's IMS. General
Electric (and then Honeywell, after it acquired GE's computer business)
was a more serious contender with its Integrated Data Store (IDS)
developed by C.W. Bachman and others in the mid-1960s. Both IDS and
DMS-1100 had the additional glamor of complying with the database standard
of the Conference on Data Systems Languages (CODASYL), while IMS did not.
Demonstrations of time-sharing programs accessing DMS-1100 databases
impressed several of these customers and they became early users of it.
At two other companies, an older data management tool, FMS-8, was a key
factor in the choice of the 1100.
Since so many of these sales involved conversions, it is not
surprising that conversion software, such as the 1401 simulator and the
BAL to COBOL translator played an important role. At the time of these
sales, few computer users had ventured far into transaction processing and
screen formatting. This meant that most COBOL or FORTRAN programs were
batch-oriented and thus relatively easy to convert. UNIVAC's edge over
IBM in easy timesharing access also facilitated program conversions:
program card decks could be read into disk files and changed with the ED
processor, which seemed very powerful at the time.
UNIVAC did not have the advantage in every area. IBM was clearly
ahead in disk drive technology. The 1100 series had just started using
disks (as opposed to drums) in 1970, and the 8414 disk was a slow
performer compared with IBM's 3330s. One of these customers had severe
problems with its 8414s. Software was another area where UNIVAC was
starting to have trouble. It is true that several of these sales did
involve the availability of specific software packages on the 1100, such
as geophysical programs for oil companies, computer assisted instruction
(CAI), and the Integrated Civil Engineering System (ICES); however, these
were exceptional cases. The tide was already moving in the direction of
IBM. Independent software companies were just getting started, and by the
end of the 1970s they would create the situation which is familiar today:
the mountain of program packages written specifically for IBM mainframes,
compared with just a handful which will run on any other mainframe.
In the 1970s it was particularly troubling for UNIVAC that many U.S.
government agencies developed IBM-only programs, instead of writing them
in standard FORTRAN or COBOL. The Federal Highway Administration prepared
a free package of transportation planning programs which could only run on
the IBM 360/370. It was used by many cities and states. The UNIVAC
marketing branch in Hartford had to write its own counterpart for the
Connecticut Department of Transportation's 1106. Those programs were not
used anywhere else and UNIVAC had to bear the development costs. When the
Highway Administration and the Urban Mass Transportation Administration
wrote a new package in the late 1970s, it was still IBM-only. It would be
a few more years before the U.S. government moved to an approach which was
based more in standards.
On the whole, however, this was a good period for UNIVAC. The 1106
and 1108 were selling at a brisk pace and a spirit of optimism is present
in these account profiles. This was a time when computer users still had
the flexibility to change vendors without incurring large conversion
costs, but in the later 1970s and in the 1980s, users became locked into
proprietary transaction and database systems. This meant that conversions
became rare and frequently very painful when they did take place. Due to
the complexity of these online systems, there was no CICS-to-TIP converter
similar to the old batch BAL-to-COBOL package.
Conversions, when they did happen, were more often from UNIVAC to
IBM. To some extent this was due to IBM's strong market position:
switching to IBM was a way to join the mainstream and be standard. The
corporate mergers of the 1980s often led to standardization on IBM, as
happened when Piedmont Airlines, a strong 1100 site, was swallowed by
all-IBM U.S. Air. Without a roster of the current customer base, it is
difficult to say how many of these thirty-nine sites still have 1100 or
2200 series computers. Twelve are current members of USE, so we know that
just about a third have stayed with Unisys over the past twenty years.
But many of them are gone, and this erosion of its customer base was a
major factor in all the predictions that Unisys would not survive the
recession of 1991.
Now in the 1990s, with many companies moving their processing from
mainframes to smaller computers or networked micros, the market is more
open and Unisys once again has the opportunity to pick up new customers.
In the competitive era of the early 1970s, the success that UNIVAC had
came from specific strengths, such as time-sharing, multiprocessing, and
ease of use. For the competitive era of the 1990s, Unisys again has to
work from strength. The leaders of the corporation say they intend to do
that, and it will be interesting to see if they can.
The article on the Solid State Computer in the December 1992 issue
said that IBM licensed the drum memory technology for its 650 computer
from the ERA (St. Paul) division of Remington Rand. Actually, the
situation was more complex than that. Prior to its acquisition by
Remington Rand in December 1951, ERA did agree to design a drum for the
650. However, two groups within IBM started work on drum designs of their
own, and the drum which was used on the 650 came primarily from the IBM
efforts and included only a few of the design features from ERA.
Interestingly, one of the tasks faced by the IBM designers was to avoid
infringing a drum patent held by the Eckert-Mauchly (Philadelphia)
division of Remington Rand.
The article also gave the impression that the X-6 assembler for the
Solid State was not widely used. Max Feuer has informed me that X-6 was
in fact used extensively, because it was easier than machine code and
there was less chance to make errors.
In response to the article on early UNIVAC 1100 operating systems,
Jerry Reich has brought to my attention the operating systems written by
Computer Sciences Corporation (CSC) for its Infonet time-sharing network.
In the late 1960s, CSC developed CSCX, which was a highly modified EXEC II
with time-sharing and multiprogramming capabilities. CSCX later evolved
into the CSTS time-sharing system used on the Infonet 1100s throughout the
1970s. Commercial time-sharing networks were widespread during the 1960s
and 1970s. The first one, established by Keydata Corporation in Boston in
1965, used a UNIVAC 491 to provide computerized invoicing and inventory
control for various client companies, including a liquor distributor and a
book publisher. Both CSC and UNIVAC set up networks using the 1108: CSC
ordered twenty 1108s for this purpose in 1968. Of course many other
time-sharing services were based on General Electric, IBM, and other
hardware. For the most part, the commercial services died out as a result
of the availability of low-cost mini- and micro-computers in the 1980s.
It became cheaper for companies to buy their own small computer, rather
than rent time from a service.
Unisys and UNIVAC are registered trademarks of Unisys Corporation.
Contributions of information are needed to keep this newsletter
going. This issue could not have been done without the 1100 account
profiles. I would like to do issues on the UNIVAC III, the File Computer,
and the Burroughs 205 and 220, but I don't know enough about them. Please
look in your files or bookcases for material on them or any other old
UNIVAC or Burroughs computers. For this newsletter to continue, I need to
have input. Write to
1418 Mackenzie Ct
Tucker, GA 30084
or call me at 404-656-7327 (daytime). Of course, articles such as the one
on UNIVAC in Pittsburgh in the first issue are always welcome. The next
issue will be in September.
Copyright 1993 by George Gray