
"How John Backus’ Fortran Beat the Machine Code ‘Priesthood’" by David Cassel


Lynn McGuire

Jul 5, 2022, 7:29:41 PM

"How John Backus’ Fortran Beat the Machine Code ‘Priesthood’" by David
Cassel

https://thenewstack.io/how-john-backus-fortran-beat-machine-codes-priesthood/

"John Backus one of the founding forefathers of early computer
programming, and in many ways set the stage for modern programming
languages."

"Backus led the team that developed the Fortran programming language in
1957, still touted today as “the first high-level programming language”
on web pages at IBM. As the Associated Press wrote — half a century
later — in an obituary for Backus, “Prior to Fortran, computers had to
be meticulously ‘hand-coded’ — programmed in the raw strings of digits
that triggered actions inside the machine.”"

Lynn

Robin Vowels

Jul 5, 2022, 11:21:53 PM

That is false. The first autocode was already in use at the University of
Manchester, England, in 1952. An improved form, the Mark I autocode, was
implemented in 1955. Also in 1955, GIP was in use on the Pilot ACE; no
knowledge of machine code was required to write the instructions for GIP.

Louis Krupp

Jul 6, 2022, 3:55:18 AM

I'm going to hazard a guess that obituary writers tend not to know much
about computer science and computer science practitioners tend not to
write obituaries.

Louis

jfh

Jul 6, 2022, 6:00:33 PM

Bruce Payne was a Manchester PhD student in the early 1950s doing numerical
fluid mechanics (see Payne, R.B. 1958: J. Fluid Mech. 4, 81-86). He told me
years later: "I shared an office with Turing. The machine wrote numbers in
base 32 backwards." I found both those sentences surprising. (I first used a
computer myself in 1963, using Fortran II. A few months later I had access to
one using Algol 60. Wasn’t it Backus who, when asked whether Fortran or Algol
was the better language, said “I didn’t write a worse language than I had
already written”?)
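
In case the "base 32 backwards" remark above is puzzling: as I understand
the Manchester convention, a word was split into 5-bit groups (base-32
digits) and written with the least significant digit first. A small sketch
of my own (modern Fortran, nothing from Payne's era) of printing a number
that way:

    program base32_backwards
      implicit none
      integer :: n, d
      n = 1957                        ! example value, chosen arbitrarily
      do while (n > 0)
         d = mod(n, 32)               ! least significant base-32 digit
         write (*, '(1X,I0)', advance='no') d
         n = n / 32
      end do
      print *                         ! 1957 comes out as "5 29 1"
    end program base32_backwards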

Thomas Koenig

Jul 7, 2022, 2:15:28 AM

Lynn McGuire <lynnmc...@gmail.com> wrote:
"Abstracting Away The Machine" presents a more nuanced view.
At the time there was strong resistance to _not_ programming in
machine code; even assemblers were frowned upon, including by John
von Neumann, who considered them a waste of computer time and of
optimization opportunities.

Fortran introduced three main concepts:

- Arithmetic expression notation (which was revolutionary at the time;
recursive descent parsing had not even been invented yet) - see the
short sketch after this list

- A highly optimizing compiler, which often made the generated code
comparable in speed to hand-written assembler

- Portability between different machines - not long after Fortran
for the IBM 704, a version for the 650 was made, and many other
computer manufacturers followed suit.
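
As an illustration of the first point, here is a sketch of my own (not
from the book, written in a rough FORTRAN 77 style rather than the 1957
dialect; the program and its numbers are invented): the programmer writes
the formula much as it appears on paper, and the compiler, not a human
coder, turns it into machine instructions.

      PROGRAM ROOTS
C     Quadratic formula for x**2 - 3x + 2 = 0; the point is that the
C     arithmetic is written as a formula, not as machine code.
      REAL A, B, C, D, X1, X2
      A = 1.0
      B = -3.0
      C = 2.0
      D = SQRT(B**2 - 4.0*A*C)
      X1 = (-B + D) / (2.0*A)
      X2 = (-B - D) / (2.0*A)
      PRINT *, 'ROOTS:', X1, X2
      END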

Lynn McGuire

Jul 7, 2022, 2:28:24 PM

I remember when we got our new operating system, Fortran IV compiler,
Cobol compiler, and Line Editor for our Prime 450 in 1976 or 1977. The
release had the source code for everything. I loaded the source code
tape and looked through all of the source code. It was simply amazing.

Lynn


Quadibloc

Jul 17, 2022, 10:10:09 AM

On Thursday, July 7, 2022 at 12:15:28 AM UTC-6, Thomas Koenig wrote:

> "Abstracting Away The Machine" presents a more nuanced view.

I took a look at that book. It turns out, though, that it has some inaccuracies
as well.

"Punched cards with round holes had reached their zenith - 45 columns."

Actually, to avoid IBM's patent on the 80-column card, 65-column cards with
smaller round holes were used by some computer companies in Britain.

Later, it speaks of "two inventors at IBM": one who invented the
rectangular hole, "more structurally sound" (absolutely untrue; 80-column
cards with too many holes in them are notoriously fragile), and one who
would allow a single hole in the card to "represent more than one
character".

Now, that would have been a miracle of data compression!

I think what that other inventor at IBM must have _actually_ invented... was
what Univac used in their 90 column cards, where each of the 45 _columns_ represented
more than one character.

John Savard

Quadibloc

Jul 17, 2022, 10:16:08 AM

On Tuesday, July 5, 2022 at 5:29:41 PM UTC-6, Lynn McGuire wrote:
> As the Associated Press wrote — half a century
> later — in an obituary for Backus, “Prior to Fortran, computers had to
> be meticulously ‘hand-coded’ — programmed in the raw strings of digits
> that triggered actions inside the machine.”"

Not only did assemblers exist prior to FORTRAN, but so did higher-level
languages.

Here, "Abstracting Away the Machine" gets it right:
"the first mature high-level language to achieve widespread adoption".

Before FORTRAN, there were, for example, MATH-MATIC and
FLOW-MATIC. Those were the languages Grace Hopper worked on.

COBOL was the product of a *committee*, and it incorporated a lot
from IBM's Commercial Translator as well as from FLOW-MATIC.

No doubt Grace Hopper was part of that committee, but to give her
credit for inventing COBOL the way John Backus invented FORTRAN
is, I think, an error, though a lot of accounts these days do just that.

John Savard

gah4

Jul 17, 2022, 12:29:27 PM

On Sunday, July 17, 2022 at 7:16:08 AM UTC-7, Quadibloc wrote:

(snip)

> Before FORTRAN, there were, for example, MATH-MATIC and
> FLOW-MATIC. Those where the languages Grace Hopper worked on.

As far as I know, Fortran was the first language with multiple-character
variable names. It seems so obvious now, but mathematicians still use
single-character variables.

Robin Vowels

Jul 17, 2022, 1:01:05 PM

On Monday, July 18, 2022 at 12:16:08 AM UTC+10, Quadibloc wrote:
> On Tuesday, July 5, 2022 at 5:29:41 PM UTC-6, Lynn McGuire wrote:
> > As the Associated Press wrote — half a century
> > later — in an obituary for Backus, “Prior to Fortran, computers had to
> > be meticulously ‘hand-coded’ — programmed in the raw strings of digits
> > that triggered actions inside the machine.”"
> Not only did assemblers exist prior to FORTRAN, but so did higher-level
> languages.
>
Indeed. Elsethread I mentioned some.

Lynn McGuire

Jul 17, 2022, 5:35:11 PM

Everyone knows that T = temperature, P = pressure, and F = flow. More
advanced users know that V = vapor and B = liquid.

Lynn


FortranFan

Jul 17, 2022, 10:24:15 PM

On Sunday, July 17, 2022 at 5:35:11 PM UTC-4, Lynn McGuire wrote:

> On 7/17/2022 11:29 AM, gah4 wrote:
> ..
> > As well as I know it, Fortran was the first with multiple character
> > variable names. Seems so obvious, but mathematicians still use single
> > character variables.
> Everyone knows that T = temperature, P = pressure, and F = flow. More
> advanced users know that V = vapor and B = liquid.


I guess chemical engineers, many likely among the customers of WinSim, Inc., are not "advanced users"!

For they seem to only "know" L = liquid!

Thomas Koenig

Jul 18, 2022, 1:45:01 AM

Lynn McGuire <lynnmc...@gmail.com> wrote:
Lucky is the person who has only a single temperature and a single
pressure in a process :-)

Robin Vowels

Jul 18, 2022, 5:53:52 AM

Really?

And PI? Theta? Alpha? Beta?

gah4

Jul 18, 2022, 6:41:12 AM

On Wednesday, July 6, 2022 at 11:15:28 PM UTC-7, Thomas Koenig wrote:

(snip)

> - A highly optimizing compiler, which often made code speed
> comparable to those of an assembler

(snip)

As far as I know, this was the important one.

As noted, there were other high-level languages, but people
weren't using them. They weren't fast enough.

An optimizing compiler, producing code with execution speed
comparable to hand-written assembly, was needed to make the
transition. That seems hard to appreciate now, when computers
are faster and the speed of writing a program matters more than
its execution speed.

jfh

Jul 18, 2022, 5:57:20 PM

Not to mention gamma, which is Euler's constant or the ratio of specific heats, and upper-case Gamma, which is the gamma function or surface excess.

Quadibloc

Jul 21, 2022, 3:24:07 AM

On Monday, July 18, 2022 at 4:41:12 AM UTC-6, gah4 wrote:
> It seems hard for us now, when computers
> are faster, and speed of writing is more important than
> execution speed.

Surely the fact that computers are _cheaper_ is even more
important for this than the fact that they are faster.

Of course, it could also be claimed that these are but two
ways of saying the same thing (since if a fast computer is
as cheap as a slow one was, that is computers getting
faster).

John Savard

gah4

Jul 21, 2022, 6:04:54 AM

On Thursday, July 21, 2022 at 12:24:07 AM UTC-7, Quadibloc wrote:
> On Monday, July 18, 2022 at 4:41:12 AM UTC-6, gah4 wrote:
> > It seems hard for us now, when computers
> > are faster, and speed of writing is more important than
> > execution speed.

> Surely the fact that computers are _cheaper_ is even more
> important for this than the fact that they are faster.

Well, this has been the important question in recent years.

Computers have mostly stopped getting faster, and instead we
have multi-core processors. Some software works with more
than one core; other software is single-core only.

If computers hadn't got faster, but instead just cheaper, we could
have something like the IBM 704 for $0.01 each, and each of us
would buy 100,000 of them.

Now, there are complications with the comparison, including
that we would be stuck with punched cards 60 years later,
and also the difference in speed between I/O devices and processors.

On the other hand, if computers had stayed expensive but got faster,
we would have figured out better and faster time-sharing systems.

(There are physics problems with the comparison, related to the
speed of light in large processors.)
