Ron Shepard <ron-s...@nospam.comcast.net> wrote:
(snip)
>> C++ is larger because Fortran took too long to get to where
>> it is, and C was already in wider general use well before then...
> This has been stated several times in this thread, and I agree. C
> was not standardized until 1989, and there were major problems with
> K&R C before that time for scientific/numerical programming. It was
> the slow progress of fortran in the decade of the 80's that gave C
> time to mature and flourish.
I suppose, but even so K&R C wasn't that bad.
> IMO, here is what *should* have occurred in the decade of the 80's.
> Almost immediately after f77 was approved, there should have been a
> minor revision to the standard that basically included all of the
> mil-std features (standard bit operators, implicit none, etc.); this
> alone would have solved many problems with portability. Then what
> was eventually f90 should have been adopted a few years later in the
> 1984 to 1985 timeframe.
Personally, I was rooting for 1988, to keep the 66, 77, 88
trend going. (I don't know about the 99, 00 though.)
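Just as a reminder of what those mil-std features bought you, here
is a minimal sketch (IMPLICIT NONE plus the MIL-STD-1753 bit
intrinsics, which f90 later standardized; illustrative only):

      PROGRAM MILSTD
      IMPLICIT NONE
      INTEGER I, MASK
      I = 5
      MASK = 3
C     IAND, IOR, and ISHFT were MIL-STD-1753 extensions to f77,
C     portable across vendors long before f90 adopted them.
      WRITE (*,*) IAND(I, MASK), IOR(I, MASK), ISHFT(I, 2)
      END

Before that, each vendor had its own spellings for these, which was
exactly the sort of portability problem being complained about.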
> The problem with adopting the standard was
> not that the features had not been set, it was that some of the
> vendors were intentionally trying to delay approval of the ANSI
> standard in order to milk competitive advantages with their existing
> f77 compilers. Some vendors had worked very hard, and done a very
> good job, of optimizing f77 code, and they believed that the
> introduction of high-level constructs such as array syntax would
> obviate that effort and make those same optimizations available to
> their competitors with essentially no effort. In retrospect, this
> belief was at least partly unfounded. So they dragged their feet, year
> after year after year.
I was mostly doing Fortran programming in the 1977-1986 time frame,
but had pretty much no connection to what the vendors were doing,
other than what appeared in the manuals that they published.
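For reference, the array syntax at issue lets one statement replace
the kind of DO loop those vendors had tuned so carefully. A minimal
f90 sketch (illustrative only):

    program arrsyn
      implicit none
      integer, parameter :: n = 5
      real :: a(n), b(n), c(n)
      integer :: i
      a = 1.0
      b = 2.0
      ! f90 array syntax: the whole-array operation in one statement
      c = a + 2.0*b
      ! the equivalent f77-style loop the optimizers already handled well
      do i = 1, n
         c(i) = a(i) + 2.0*b(i)
      end do
      print *, c
    end program arrsyn

The worry, as I understand it, was that once the compiler could see
the whole operation in one statement, the hard-won loop optimizations
would no longer be a competitive advantage.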
(snip)
> So all of this foot
> dragging gave other languages, mostly C, time to take root and
> establish themselves. But even in the early 90's, the vast majority
> of scientific programming was still done with fortran, not with C or
> C++, and the other languages du jour such as java had not yet been
> invented. But that decade of obstruction is what let these lesser
> languages become popular.
I suppose, but it might also have been changes in what "scientific
computing" was doing around that time. Among others, the expansion
of computational biology. As the DNA database grew exponentially
(I believe it still does), the need for programs to process the data
also grew. But unlike much of scientific computing, computational
biology does a lot of fixed point work with small integers,
and a lot of character work. Also, computational biology uses
more pure math (things like graph theory) than computational
physics or chemistry (the more traditional scientific computing).
More (theoretical) computer science people became involved,
and not so many from the "numerical analysis" community.
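To make that concrete, a toy sketch of the kind of computation I
mean (all character comparison and integer counting, no floating
point; the sequences are made up):

    program basematch
      implicit none
      character(len=12) :: s1, s2
      integer :: i, nmatch
      s1 = 'ACGTACGTACGT'
      s2 = 'ACGTTCGAACGT'
      ! pure character comparison and integer counting
      nmatch = 0
      do i = 1, len(s1)
         if (s1(i:i) == s2(i:i)) nmatch = nmatch + 1
      end do
      print *, 'matching bases:', nmatch
    end program basematch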
Also around that time frame was big growth in work with symbolic
math, leading to systems like SMP, Maple, and Mathematica.
Again, that reduces the need for floating point number crunching
in favor of more character comparisons.
These could have stayed two separate scientific communities, and
might have if a new Fortran had come earlier, but eventually people
talk to each other, and ideas spread and mix.
> A related issue is the availability of a free compiler. F77 had the
> f2c converter and later the g77 compiler. But the typical fortran
> compiler was still something that was developed in-house by a
> hardware vendor.
Well, this was also the transition from the large computing
center to individual workstations. A large computing center
might have had enough users to justify licensing an expensive
Fortran compiler, but an individual worker probably did not.
> It was designed specifically to take advantage of
> whatever hardware features were available at that time (VLIW
> instruction sets, RISC, parallel threads, NUMA, special floating
> point units, etc.). C compilers in contrast were used more like
> assemblers, implementing more low-level nonportable code than
> high-level portable code. Most computers at the time had a free C
> compiler,
I don't know about "most", but for many years the unix sysgen
process required a C compiler, so the OS had to come with one.
That wasn't true for other OSes, but unix was expanding
into the scientific community.
On the other hand, on the DEC systems that I remember from the
time, the C compiler was a separately licensed product, just as
the Fortran compiler was.
> and, counting the gcc compiler, sometimes there were
> actually several C compilers available on a computer. In contrast,
> fortran compilers were expensive and only one would be available on
> any given computer. It would take another decade after f90, until
> the early 21st century, before gfortran would give fortran users a
> similar programming environment.
Sounds about right to me.
> This is also why so many programming courses in college were based
> on C rather than fortran in the 80's and 90's, and the number of
> trained programmers is yet another reason for the popularity of C
> compared to fortran.
Well, courses taught by the numerical analysis community
might have been in Fortran, but the CS community liked Pascal,
and then C.
It seems to me that there has been a merging of the more traditional
CS (theoretical computer science) and CE (computer engineering,
including hardware) into CSE (computer science and engineering).
That is another trend that affects how things are taught.
> So in hindsight, imagine if the delays by the ANSI committee had
> been avoided, and imagine if gfortran (or something similar, such as
> a new f2c) had been available in the mid 1980's rather than the mid
> 2000's. Given the choice between gfortran and K&R C, most
> scientific, engineering, and numerical programming would almost
> certainly have stayed with fortran. Fortran would have remained the
> main programming language taught in colleges, and things would be
> very very different now.
Maybe for the numerical programming. Engineering has been moving
to higher-level interpreted languages, such as Matlab, as computers
have become faster. It seems to me that the scientific community was
changing enough that I am not convinced it would have stuck so
completely with Fortran.
-- glen