
Ada 9X objectives


William Thomas Wolfe, 2847

Oct 2, 1989, 4:07:28 PM

In an earlier comp.lang.ada article I included a copy of a recent
article from Stephen Crawley in comp.sw.components. I'd like to
comment here on some of the points raised.

> Well how come ADA seems to be largely irrelevant outside of
> the defense sector?

That depends strongly on your definition of "largely irrelevant";
there is a large and growing number of non-defense projects and
companies using Ada. The new generation of highly optimizing
Ada compilers deserves at least some of the credit for this
substantial and accelerating growth.

> ADA 83 was 5 - 10 years out of date even before it was finalised. Unless
> it gets a RADICAL overhaul, ADA 9x will be 10 - 15 years out of date.
> Doubtless, the reactionaries and religious zealots from the software
> engineering industry will make sure that much of the important work done
> by researchers over the last 15 years (like GC technology, functional
> programming, designing languages to make formal verification possible)
> will be ignored ... just like they did for ADA 83.

In fact, this is not correct. Ada 83 explicitly provides for garbage
collection as an optional compiler service. The rule that functions
must not modify their parameters was probably a direct result of
functional programming ideas. Finally, formal verification is a
major goal of the software engineering community, and Ada was designed
to support it to as great an extent as possible. For example, the
use of the termination model of exception handling was (at least in
part) motivated by formal verification considerations.
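
(For anyone who hasn't seen the termination model in action, here is a small
sketch; the names are invented, but the behavior is plain Ada 83. Once
Syntax_Error is raised, the rest of the frame is abandoned and the handler at
the bottom runs; control never resumes at the point of the raise, which is
part of what keeps the control flow tractable to reason about.)

    with Text_IO;
    procedure Run is
       Syntax_Error : exception;
       function Parsed_OK (Line : in String) return Boolean is
       begin
          if Line'Length = 0 then
             raise Syntax_Error;    -- abandons this call and the caller's
          end if;                   -- statement sequence
          return True;
       end Parsed_OK;
    begin
       if Parsed_OK ("") then
          Text_IO.Put_Line ("never reached for an empty line");
       end if;
    exception
       when Syntax_Error =>
          Text_IO.Put_Line ("handled here; the frame was terminated");
    end Run;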

> Production language design should be an on-going evolutionary process.
> The language design should be reviewed regularly to incorporate new proven
> ideas from research languages and the results of experience from the
> production language itself. A new language version every 2 years sounds
> about right to me.

This is too frequent; five years might be reasonable, but not two.
I don't think the compiler validation suites, etc., would be able to
respond meaningfully to a revision cycle which was THAT frequent.

> What about all the software in old versions of the language? Who does
> the job of converting it I hear you ask? It should be the task of the
> people who build programming support environments to write conversion
> tools to largely automate the task of converting code from one version
> of the PL to the next one.

The US Government is actively planning to maximize the use of
automatic translation technology during the transition from Ada
83 to Ada 9X.

> Maybe these ideas are not workable right now ... production programming
> support environments aren't really up to the task yet. But this is the
> direction the Software Engineering industry should be aiming. The process
> of change in computing is inevitable; we should be going with the flow
> not trying to hold back the tide.

On this I agree. But another good reason to revise no more often
than once every five years is to give new ideas a chance to
mature. Once a new idea has proven itself, and has become reasonably
agreed upon to be a good thing that production languages should have,
there should be a process by which production languages incorporate
new developments in software engineering technology, and this is what
should be accomplished by the Ada 9X scheduled revision process.


Bill Wolfe, wtw...@hubcap.clemson.edu

Ronald Guilmette

Oct 2, 1989, 7:33:33 PM
In article <66...@hubcap.clemson.edu> billwolf%hazel.cs.c...@hubcap.clemson.edu writes:
>
> The US Government is actively planning to maximize the use of
> automatic translation technology during the transition from Ada
> 83 to Ada 9X.
>

Holy smokes!!!! Is it just me folks, or does the statement above imply that
(a) Ada 9X has already been designed, and (b) it *does not* provide upward
compatibility for Ada 83 programs?

Obviously, just following this newsgroup is not enough to keep oneself
"in the know" regarding the current state of the Ada 9X deliberations!

Somebody please reduce my ignorance level and tell me what parts of Ada 83
have already been declared to be obsolete. Tasking? Fixed-point?
Could somebody maybe post a list?

// rfg

William Thomas Wolfe, 2847

Oct 3, 1989, 2:14:46 PM
From r...@ics.uci.edu (Ronald Guilmette):

>> The US Government is actively planning to maximize the use of
>> automatic translation technology during the transition from Ada
>> 83 to Ada 9X.
>
> Holy smokes!!!! Is it just me folks, or does the statement above imply that
> (a) Ada 9X has already been designed, and (b) it *does not* provide upward
> compatibility for Ada 83 programs?

It means only what it says: whatever 9X's final form, the use
of automatic translation technology will probably be maximized.
This probably will take the form of an automatic translator being
built to Government specs and placed in the Ada Software Repository.

I would think that there would be enough similarity to make the
use of automatic translation reasonable; this technology has
proven not to be exceptionally useful where remarkably dissimilar
languages are involved, such as in a Fortran-to-Ada conversion.


Bill Wolfe, wtw...@hubcap.clemson.edu

Ronald Guilmette

Oct 3, 1989, 4:02:34 PM
>> Holy smokes!!!! Is it just me folks, or does the statement above imply that
>> (a) Ada 9X has already been designed, and (b) it *does not* provide upward
>> compatibility for Ada 83 programs?
>
> It means only what it says: whatever 9X's final form, the use
> of automatic translation technology will probably be maximized.
> This probably will take the form of an automatic translator being
> built to Government specs and placed in the Ada Software Repository.
>
> I would think that there would be enough similarity to make the
> use of automatic translation reasonable; ...

I think that you missed my point entirely.

I have to assume that there is a large base of Ada 83 users out there who
hope and pray that "the use of automatic translation" would *not* be
"reasonable", but would instead be TOTALLY UNNECESSARY when moving to Ada 9X.

I will ask one more time and hope for a more direct answer. Has it already
been decided that Ada 83 and Ada 9X will be sufficiently incompatible so
as to *require* translation? (Hint: this is a yes-or-no question.)

If the answer is no, then why is the government planning on building/using
a translator (or translators) when the need for such tools & processes is
not yet even established? Has Samuel Pierce moved over to DoD from HUD, or
is this just the 1990's version of the $600 screwdriver?

If the answer is yes, then it *must* logically follow that *somebody*
knows what the incompatibilities are. Otherwise, how could anyone know
that automatic translation will be required (or even useful)?
If so, that person (or persons) is doing a disservice to the Ada
community by not coming forward to warn Ada 83 users about features to
avoid from now on. Could it perhaps be the case that the individuals who
know what the incompatibilities are (ahem, I mean what they "will be") are
keeping it to themselves in the hope of later capitalizing on this "insider
information"?

Either way, something here smells like three-day-old fish.

// rfg

Stephen Crawley

Oct 3, 1989, 9:08:22 PM
I wrote:
>> Well how come ADA seems to be largely irrelevant outside of
>> the defense sector?

Bill Wolfe replies:


> That depends strongly on your definition of "largely irrelevant";
> there is a large and growing number of non-defense projects and
> companies using Ada. The new generation of highly optimizing
> Ada compilers deserves at least some of the credit for this
> substantial and accelerating growth.

OK, I'll clarify myself. Undoubtedly there are companies moving to
Ada for non-defence work. But there seem to be MORE companies
moving to other languages such as C++ and (I hate to say it) C.
This is only my perception of what is going on. Does anyone have
any meaningful statistics on recent trends in programming language
usage?

>> ADA 83 was 5 - 10 years out of date even before it was finalised. Unless
>> it gets a RADICAL overhaul, ADA 9x will be 10 - 15 years out of date.
>> Doubtless, the reactionaries and religious zealots from the software
>> engineering industry will make sure that much of the important work done
>> by researchers over the last 15 years (like GC technology, functional
>> programming, designing languages to make formal verification possible)
>> will be ignored ... just like they did for ADA 83.

> In fact, this is not correct.

ADA 83 most certainly was 5 - 10 years out of date! And given that
ADA 9x will be going through the same long, drawn-out process as ADA 83,
I can't see why it should be any less out of date.

> Ada 83 explicitly provides for garbage
> collection as an optional compiler service.

But they cocked it up. Optional garbage collection is close to useless,
since you can't depend on it being there ... unless you write code that
assumes a particular ADA compiler. This particular lesson should have
been learned from Algol-68. Maybe some of the ADA design team knew this
... but the reactionaries won the day.
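
(To make the point concrete, a sketch with invented names: since portable
Ada 83 code cannot assume a collector is present, storage reclamation ends
up being done by hand with Unchecked_Deallocation, with all the
dangling-reference risk that implies.)

    with Unchecked_Deallocation;
    procedure Release_Demo is
       type Node;
       type Node_Ptr is access Node;
       type Node is
          record
             Next : Node_Ptr;
          end record;
       procedure Free is new Unchecked_Deallocation (Node, Node_Ptr);
       Head, Rest : Node_Ptr;
    begin
       Head := new Node'(Next => null);
       Head := new Node'(Next => Head);   -- a two-element list
       while Head /= null loop
          Rest := Head.Next;
          Free (Head);                     -- explicit; no collector assumed
          Head := Rest;
       end loop;
    end Release_Demo;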

> The rule that functions
> must not modify their parameters was probably a direct result of
> functional programming ideas.

I doubt it very much. It is more likely it was a result of bad
experiences with FORTRAN and PASCAL "VAR" parameters.
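
(The contrast, for the record; sketch only, names invented. In Ada 83 a
side effect on a parameter has to be declared as such, and only procedures
may declare it.)

    procedure Scale (X : in out Float; By : in Float) is   -- legal
    begin
       X := X * By;
    end Scale;

    -- function Scaled (X : in out Float) return Float ...  -- illegal:
    -- Ada 83 function parameters may only be of mode "in".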

> Finally, formal verification is a
> major goal of the software engineering community, and Ada was designed
> to support it to as great an extent as possible. For example, the
> use of the termination model of exception handling was (at least in
> part) motivated by formal verification considerations.

Excuse me while I laugh ...

Verifying ADA 83 is a fundamentally intractable problem for any number
of reasons. I don't believe anyone has even managed to formally define
the semantics of ADA 83! Maybe some members of the ADA design team did
have verification in their minds ... but others didn't, and the others
won the day.

>> Production language design should be an on-going evolutionary process.

>> ... A new language version every 2 years sounds about right to me.

> This is too frequent; five years might be reasonable, but not two.
> I don't think the compiler validation suites, etc., would be able to
> respond meaningfully to a revision cycle which was THAT frequent.

There is no reason why the revisions should not be pipelined. And
what is wrong with some people using pre-validated compilers? After
all that is how much of the rest of the computing industry works at
the moment. It is often better to use a new, somewhat flakey compiler
now if it offers significant benefits.

> But another good reason to only revise no more
> quickly than five years at a time is to give new ideas a chance to
> mature. Once a new idea has proven itself, and has become reasonably
> agreed upon to be a good thing that production languages should have,
> there should be a process by which production languages incorporate
> new developments in software engineering technology, and this is what
> should be accomplished by the Ada 9X scheduled revision process.

The time from a new idea being introduced, to its being mature is much
less than 5 years. Besides, new ideas are developed in parallel not
serially. The problem is that too many people in industry are too
busy meeting project deadlines to keep up with research. The result
is that it takes far too long for mature ideas to be perceived as such,
and hence to come into general use.

You would do well to consider the ongoing development of the Eiffel
language and environment. Currently, there seems to be a minor
revision cycle of ~6 months and a major cycle of ~2 years. Nobody
seems to be complaining ...

-- Steve

James E. Cardow

Oct 4, 1989, 9:09:52 AM

I have to agree with Mr. Wolfe: a two-year update cycle would cause havoc
with the language and the organizations attempting to support the software
already developed or nearing the end of development. The support organizations
for large, real-time software systems (the original target of Ada) would like
nothing more than to have the latest and greatest improvements in their
software and support tools. The problem is, minor changes cause major problems.
Consider the ten- to twenty-year development cycle for large projects.
Compiler vendors (not to mention compilers) enter and exit the market over
that period. The software and the compiler must then be supported by someone
else.

The ten year span is most likely too long. The additional time to effectively
propagate the change and maneuver through the bureaucracy stretches the time to
closer to 15 years. With that time frame, too many technically sound
improvements are ignored. Those of us in support painfully realize the
difference that 15 years of technology change makes.

As far as what changes should be considered, the first step should be an
evaluation of the features of "research" languages. The very features that
make them attractive are probably the changes most desired in other
languages. Inheritance as handled by C++ would certainly seem to be a prime
candidate. There is usually a reason why those features make a language
attractive; not considering and incorporating them is foolish.

Changes to Ada 83 are needed. The change frequency should be reevaluated.
The healthy discussion over which changes to include is outstanding. Too bad
the support community is not providing greater input; they will live with the
results long after the developers move on to newer ideas.

Ted Dunning

Oct 4, 1989, 11:33:50 AM

In article <9...@scaup.cl.cam.ac.uk> s...@cl.cam.ac.uk (Stephen Crawley) writes:


> Bill Wolfe replies:
>
>> Finally, formal verification is a
>> major goal of the software engineering community, and Ada was designed
>> to support it to as great an extent as possible. For example, the
>> use of the termination model of exception handling was (at least in
>> part) motivated by formal verification considerations.
>
> Excuse me while I laugh ...
>
> Verifying ADA 83 is a fundamentally intractable problem for any number
> of reasons. I don't believe anyone has even managed to formally define
> the semantics of ADA 83! Maybe some members of the ADA design team did
> have verification in their minds ... but others didn't, and the others
> won the day.


compare this situation to that of scheme where the formal semantics of
the language _have_ been defined, and they are concise enough to fit on
a couple of pages. in fact, they are so simple and straightforward
that the formal semantics can be used in an undergraduate computer
science class.

even more interesting, these formal semantics were automatically
derived from a running scheme program which provides an executable
model of the semantics. this allows much simpler testing of the
semantics than just writing down the equations and having people stare
at them.

--
t...@nmsu.edu
remember, when extensions and subsets are outlawed,
only outlaws will have extensions or subsets

Ted Dunning

Oct 4, 1989, 4:24:37 PM

In article <13...@blackbird.afit.af.mil> jca...@blackbird.afit.af.mil (James E. Cardow) writes:

> ...
>
> As far as what changes should be considered, the first step should be an
> evaluation of the features of "research" languages. The very features that
> make them attractive are probably the most desired in changes to other
> languages. Inheritance as handled by C++ would certainly seem to be a prime
> candidate. There is usually a reason why those features make the language
> attractive, not considering and incorporating them is foolish.


how do you add the primary feature of scheme which is parsimony to ada
whose salient characteristic is obesity?

in the formal semantics of scheme, the abstract syntax of scheme is
_6_ lines long. and yet this language has considerably _more_ power
than ada in many respects.

how do you ADD this to ada which doesn't even yet have a formal
semantics?

William Thomas Wolfe, 2847

Oct 4, 1989, 9:56:29 PM
From r...@ics.uci.edu (Ronald Guilmette):

> I will ask one more time and hope for a more direct answer. Has it already
> been decided that Ada 83 and Ada 9X will be sufficiently incompatible so
> as to *require* translation? (Hint: this is a yes-or-no question.)

I do not know as a fact that this is the case, but I do recall
reading about plans for automatic 83 => 9X translation. Perhaps
someone having more direct knowledge of the situation could comment
in greater detail.

I would expect that there *would* be sufficient incompatibility,
just on the basis of the experiences with Ada 83 which have been
documented in Ada Letters, etc., which indicate that certain changes
are necessary. I would also expect that if a free 83 => 9X translator
is provided via the Ada Software Repository, the conversion process
would be dominated by the time required to upgrade programmers, rather
than the time required to automatically upgrade existing Ada 83 software.

If you want to get a good idea of what the likely changes will be:

   o  ACM SIGADA Ada Letters for the last 5 years or so
   o  Ada 9X revision requests
   o  proceedings of Tri-Ada and other conferences
   o  the last chapter of Paul Hilfinger's ACM Distinguished
      Dissertation, "Abstraction Mechanisms and Language Design"

are all excellent sources.


Bill Wolfe, wtw...@hubcap.clemson.edu

Michael Peirce

Oct 5, 1989, 8:37:36 PM
In article <9...@scaup.cl.cam.ac.uk> s...@cl.cam.ac.uk (Stephen Crawley) writes:
>
>>> Production language design should be an on-going evolutionary process.
>>> ... A new language version every 2 years sounds about right to me.
>
>> This is too frequent; five years might be reasonable, but not two.
>> I don't think the compiler validation suites, etc., would be able to
>> respond meaningfully to a revision cycle which was THAT frequent.
>
>There is no reason why the revisions should not be pipelined. And
>what is wrong with some people using pre-validated compilers? After
>all that is how much of the rest of the computing industry works at
>the moment. It is often better to use a new, somewhat flakey compiler
>now if it offers significant benefits.
>

Are you serious??? It's better to use a somewhat flakey compiler???

Those of us trying to solve real world problems can't afford to use
a "slightly flakey compiler". When shipping product to make money to
feed the kids, spending days tracking down weird bugs caused by flakey
compilers isn't the way to stay in business!

The ivory tower world makes some terrific contributions, but they can
keep their flakey compilers until such time as they aren't flakey any
more, thank you very much. Personally, I usually skip any compiler that's
not at least at release 2.0. They're just not worth the trouble!


Claris Corp. | Michael R. Peirce
-------------+--------------------------------------
| 5201 Patrick Henry Drive MS-C4
| Box 58168
| Santa Clara, CA 95051-8168
| (408) 987-7319
| AppleLink: peirce1
| Internet: pei...@claris.com
| uucp: {ames,decwrl,apple,sun}!claris!peirce

Scott Simpson

Oct 6, 1989, 1:26:35 PM
In article <10...@claris.com> pei...@claris.com (Michael Peirce) writes:
>In article <9...@scaup.cl.cam.ac.uk> s...@cl.cam.ac.uk (Stephen Crawley) writes:
>>There is no reason why the revisions should not be pipelined. And
>>what is wrong with some people using pre-validated compilers? After
>>all that is how much of the rest of the computing industry works at
>>the moment. It is often better to use a new, somewhat flakey compiler
>>now if it offers significant benefits.
>Are you serious??? It's better to use a somewhat flakey compiler???
>
>Those of us trying to solve real world problems can't afford to use
>a "slightly flakey compiler". When shipping product to make money to
>feed the kids, spending days tracking down weird bugs caused by flakey
>compilers isn't the way to stay in business!

I agree. If you spend more time debugging your tool than creating your
application, you are spending too much time on someone else's product.
This is an easy and dangerous trap to fall into. Interestingly, Barry
Boehm's COCOMO model lists the following constants for VIRT, or virtual
machine volatility (virtual machine volatility corresponds to the
software tool you are using, e.g., compiler, database, etc.):

   Rating (from Intermediate COCOMO):

       Low     Nominal    High    Very High
       0.87    1.00       1.15    1.30

These constants are somewhat high, reflecting the additional time you
must spend debugging the tool you are using rather than building your
application.
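
(A back-of-the-envelope illustration of what that multiplier does to an
estimate; treat every number as an assumption. The nominal effort below is
Intermediate COCOMO, semidetached mode, 100 KDSI, roughly 3.0 * 100**1.12,
or about 521 person-months worked out by hand; check Boehm's "Software
Engineering Economics" before relying on any of it.)

    with Text_IO;
    procedure Virt_Demo is
       package Flt_IO is new Text_IO.Float_IO (Float);
       Nominal_PM : constant Float := 521.0;  -- 3.0 * 100**1.12, by hand
       type Rating is (Low, Nominal, High, Very_High);
       Virt : constant array (Rating) of Float := (0.87, 1.00, 1.15, 1.30);
    begin
       for R in Rating loop
          Text_IO.Put (Rating'Image (R) & " VIRT => ");
          Flt_IO.Put (Nominal_PM * Virt (R), Fore => 4, Aft => 1, Exp => 0);
          Text_IO.Put_Line (" person-months");
       end loop;
    end Virt_Demo;

(That is a swing of a bit over 220 person-months between a stable, familiar
toolset and a volatile one, on one hypothetical 100 KDSI job.)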
Scott Simpson
TRW Space and Defense Sector
usc!trwarcadia!simpson (UUCP)
trwarcadia!sim...@usc.edu (Internet)

Stephen Crawley

Oct 6, 1989, 2:20:51 PM
>> And what is wrong with some people using pre-validated compilers? After
>> all that is how much of the rest of the computing industry works at
>> the moment. It is often better to use a new, somewhat flakey compiler
>> now if it offers significant benefits.

> Are you serious??? It's better to use a somewhat flakey compiler???

In a lot of cases ... in the real world ... given that "somewhat flakey"
means "pre-validated" ... definitely yes.

There are a lot of people in the real world who cannot afford to be
conservative about the compilers they use. If they were, they'd
always be last into the marketplace with their products. This is
another way not to stay in business.

Sure. If you have managed to corner a section of the marketplace, or if
you are doing bespoke software development to your own deadlines, you
may well save time and money by being conservative and using a properly
validated compiler.

Given that you have to choose between three options:

  1) do it now with a pre-validated compiler for a new language, or
  2) do it later with a validated compiler for a new language, or
  3) do it now with a validated compiler for an existing, but clearly
     inadequate language,

there are undoubtedly situations where the COMMERCIALLY CORRECT decision
is to take a risk and go for option 1).

---

> The ivory tower world makes some terrific contributions, but they can
> keep their flakey compilers until such time as they aren't flakey any
> more, thank you very much.

Hah! I'm not talking about toy compilers written as undergraduate projects.
I'm talking about flakey compilers written by commercial outfits, sold
for BIG MONEY. Like the dodgy VS FORTRAN compiler that IBM flogs to its
scientific customers. Or some of the early C compilers for PC's and Macs.
Thank YOU very much!

---

Some food for thought:

Compiler validation is a method of demonstrating that a compiler doesn't
have one of a fixed list of bugs ... not that it doesn't have ANY bugs.

-- Steve

Dick Dunn

Oct 6, 1989, 3:00:37 PM
jca...@blackbird.afit.af.mil (James E. Cardow) writes about the problems
in shortening the Ada language update cycle. His points were good, yet I
was left with the feeling that there was something wrong underneath. I
finally decided that it's this:

> ...Consider the ten to twenty year development cycle for large projects...

If you have a ten-year development cycle for a software project, you're
going to be producing obsolete software! You can't help it. Ten years is
just too long for anything tied to computers--the technology moves too
fast.

You've got to get software up and working, and performing at least some of
the needed functions *soon*. You also need it to be adaptable, so that it
can evolve as needs and technology change.

What I'm getting at is that I think we're trying to address the wrong
problem. Rather than trying to solve "How do we deal with long development
cycles?" we should be solving "How do we shorten the development cycles?"
--
+---------+ Dick Dunn r...@ico.isc.com ico!rcd (303)449-2870
| In this | 4th annual MadHatterDay [10/6/89]:
| style | Madness takes its toll
|__10/6___|

Ronald Guilmette

Oct 6, 1989, 9:27:29 PM
> I do not know as a fact that this is the case, but I do recall
> reading about plans for automatic 83 => 9X translation. Perhaps
> someone having more direct knowledge of the situation could comment
> in greater detail.

If you can find the reference, I'm sure that many here would be interested
to read that same publication.

// rfg

William Thomas Wolfe, 2847

Oct 8, 1989, 12:39:34 PM
From r...@ics.uci.edu (Ronald Guilmette):

>> I do not know as a fact that this is the case, but I do recall
>> reading about plans for automatic 83 => 9X translation. Perhaps
>> someone having more direct knowledge of the situation could comment
>> in greater detail.
>
> If you can find the reference, I'm sure that many here would be interested
> to read that same publication.

Unfortunately, at the time I considered it an interesting tidbit of
information, but did not anticipate ever having to cite it. At any
rate, we must also realize that translation is only one of the two
major avenues of transition; the other is to exploit Ada's ability
to call code written in other languages. If the manufacturers of
Ada 83 compilers decided to sell an upgrade which would permit the
use of pragma Interface to Ada 9X, and the 9X compilers also had the
ability to interface to Ada 83, then the primary purpose of automatic
translation would be to enable the use of the more powerful Ada 9X
by the system's maintainers.
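
(For reference, this is what pragma INTERFACE looks like in Ada 83 today,
calling out to C on a compiler that supports that convention; whether an
83 <-> 9X pairing would work the same way is pure speculation, and the
names below are invented.)

    package Legacy_Filters is
       procedure Smooth (Sample : in Integer);
       pragma Interface (C, Smooth);   -- body supplied outside Ada
    end Legacy_Filters;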


Bill Wolfe, wtw...@hubcap.clemson.edu

James E. Cardow

Oct 9, 1989, 11:26:12 PM
r...@ico.ISC.COM (Dick Dunn) writes:

>> ...Consider the ten to twenty year development cycle for large projects...

>If you have a ten-year development cycle for a software project, you're
>going to be producing obsolete software! You can't help it. Ten years is
>just too long for anything tied to computers--the technology moves too
>fast.

>You've got to get software up and working, and performing at least some of
>the needed functions *soon*. You also need it to be adaptable, so that it
>can evolve as needs and technology change.

I must admit that my comments were made with only my own experience in mind,
that being large DOD-sponsored projects that had developments spanning two
to three computer generations. However, that is the primary Ada environment.
In the military software support world we are for the most part just entering
the support of JOVIAL systems. Having been responsible for "selling" Ada
to the people attempting to prepare for "new" software, I'm convinced that
injecting new technology, especially evolving technology, may very well be
a cure more painful than the disease.

Consider the problem in a real context: system software of 100,000+
lines of code, with supporting software at a 4:1 ratio. Add to that
simulator software that must function exactly like the real thing. Now
add unique hardware, especially processors. If the system were
stagnant and the budget available, the conversion to a new language would
be simple (simpler?). But reality says the money for change is small, and
the user demand for improvements is large. The changes come in modification
of 10 percent of a unit here, 5 percent there. The only real opportunity
is when major systems are affected, but that is rare.

>What I'm getting at is that I think we're trying to address the wrong
>problem. Rather than trying to solve "How do we deal with long development
>cycles?" we should be solving "How do we shorten the development cycles?"
>--

In the years I have spent chasing DoD software I have always worried about
how to get it delivered closer to the expected date; the idea of shorter
never occurred to me. But I'm changing roles now to teach software
engineering and would greatly love to discuss ways to shorten the
development cycle, or ways to inject new technology into old systems. If you
have ideas on the subject within the context of large, complex systems or
know of any work in these areas let me know.

As a side note, Ada can be added to older systems. It takes convincing
people that the benefits over the long run are worth the effort.

Dick Dunn

Oct 12, 1989, 1:09:30 AM
jca...@blackbird.afit.af.mil (James E. Cardow) writes:

James Cardow wrote:
> >> ...Consider the ten to twenty year development cycle for large projects...

I squawked:


> > If you have a ten-year development cycle for a software project, you're
> > going to be producing obsolete software! You can't help it...

I realize that my comment was just an objection--not a suggestion for how
to fix the problem I perceived. However, Cardow seems interested in the
subject; I'm vitally interested; I'd like to bounce some ideas around...
so perhaps it's worth pursuing. (Would this be better moved to
comp.software-eng? I don't know how much Ada relevance it will retain.)

> I must admit that my comments were made with only my own experience in mind,
> that being large DOD sponsored projects that had developments spanning two
> to three computer generations. However, that is the primary Ada environment.

Perhaps, then, there's a fundamental question of whether Ada can remain
suitably stable for these very-long-schedule projects, yet acquire some
ability to adapt more quickly. At first blush, that seems like it might
be difficult. However, it's important to consider it, because if Ada can't
adapt faster and be more responsive than it has, there is a chance that
large DoD projects will be its *only* real domain, denying it success in
the commercial world where you've got to move faster.

(I spend my time in the commercial world and have done so for quite a
while; my only real encounter with the DoD world was a brief but horrible
skirmish with a Minuteman-related project many years ago.)

> Consider the problem in a real context: system software of 100,000+
> lines of code, with supporting software at a 4:1 ratio...

Yes, ouch, although commercial operating systems are in that range. Once
you've got such a system, you've got all the problems that go along with
it. But can you aim to avoid producing systems of that size in future
projects? How big do the systems *really* need to be?

One thing I've noted again and again as I look at complex software and
large software projects is that predictions that "this is gonna be a big
un!" are self-fulfilling. Let me see if I can illustrate.

There's a tendency for the perceived amount of effort required for a
project to be "bimodal" in a funny way. That is, if you look at the number
of people on a project versus the perception of the project staff of
whether it's under-staffed or over-staffed, you are very likely to see
something like the following:
 - very few people: perception is "we need more"
 - just about right: perceptions are mixed as short-term needs vary
 - too many (past first node): "too many people; they're getting in
   the way and getting bored"
 - a whole bunch too many past first node: "not enough people"!
   This is the interesting point--it's where you get so many people
   that you need to ADD people to manage, coordinate, communicate,
   etc. You're so overstaffed that you've got to add people to
   cope with the excess staff so that work can get done. Excess
   staff is a particularly severe problem in early project phases.
 - just about right again (at second node): "ok, but this is sure
   a big project" You've got enough people to handle the project
   AND all the extra people.

Projects could be multi-modal (more than two nodes) but it's hard to
imagine covering that much range in staffing without getting a reality
check. Two examples of where I believe I've seen this bimodal-staffing
phenomenon were something like 4 or 5 people versus about 30, and perhaps
10-15 versus 600-800! The differences are radical--they have to be to get
a qualitative difference between the nodes.

The first point about this is that if you get really wound up for a big
project, instead of making every serious attempt to simplify it to the
bone, you'll staff up for a big project. You'll pass the first node at a
full gallop and enter the region where you're (seemingly) understaffed.

Now think about what happens if you get to the larger size: You *must*
produce a large piece of software. There *will* be more difficulties in
communication among people (and therefore among modules). So you have to
impose a lot more structure. It's harder to assess the effects of changes,
so it's harder to make them. If one function (or ADT implementation or
whatever) isn't quite what you'd like, it may involve too many people and
too much hassle to change it, so the code that uses it gets a little more
complicated and bigger to work around the mismatch. If it's obviously
wrong, it'll get changed...I'm talking about subtler problems. But the
phenomenon feeds on itself: As one piece of code grows to adapt to a
mismatch, it itself becomes harder to change. The software "sets up" too
soon. You try to avoid this, of course, because you can see some of it
coming. So you spend more work on the front-end phases--detailed design,
massive specification, all conceivable attempts to pull things together.
It helps a little, but it also speeds the ossification process. What
you're doing is making sure that when things set up, they end up about
where they're supposed to be. But what you really need is to keep them
from setting up so soon.

Some of you will no doubt think I'm crazy, or hopelessly pessimistic (or
both!:-) You probably have trouble grasping it until you've worked on a
project whose printed specifications comprise a stack of paper taller than
you are.

If you can keep the software fluid and able to change throughout the
development process, you have the side benefit that the software has been
going through changes. Adaptability is already an idea for the software.
It's in people's minds. Some of the un-adaptability will get fixed before
the software goes out the door the first time. (You're sanding off the
rough edges that occur in *any* new product.)

Another problem of the overstaffing/overestimating explained above is that
it becomes harder for any one person to get a comprehensive view, or even a
substantial view. This feeds into both the difficulty of change and the
difficulty of perceiving the need for change.

Following from that to a different observation: the best software is
constructed by the smallest possible number of the best possible individuals.
Each person has greater responsibility and more global view of the project.
Instead of having, perhaps, NObody who can see the whole picture, you might
have two or three who can see it, and who can act as cross-checks on one
another.

There are still projects which are large enough that they need a small army
to implement them...and honestly, I don't have a lot of ideas there. But I
do know that the number of such massive projects is much smaller than is
commonly believed. I also know that once a project gets away from you and
starts to grow, you have real trouble taming it again.
--
Dick Dunn r...@ico.isc.com uucp: {ncar,nbires}!ico!rcd (303)449-2870
...No DOS. UNIX.

Robert Eachus

Oct 12, 1989, 2:16:43 PM
In article <16...@vail.ICO.ISC.COM> r...@ico.ISC.COM (Dick Dunn) writes:

>jca...@blackbird.afit.af.mil (James E. Cardow) writes:
>> ...Consider the ten to twenty year development cycle for large projects...

>If you have a ten-year development cycle for a software project, you're
>going to be producing obsolete software! You can't help it. Ten years is
>just too long for anything tied to computers--the technology moves too
>fast.

In many cases the schedule is determined by something other than
the software, such as the Space Shuttle. (And the only reason that
software wasn't on the critical path was the problems with the tiles,
but I digress.) Or any new commercial jet aircraft like the 7J7, or
military projects like the Advanced Tactical Fighter...

Software schedules for such large, long duration projects will
include many builds and tests before the actual aircraft is flown.
Again, to use the Space Shuttle as an example, the flight software was
extensively tested (and frequently crashed) in various simulators
before the Shuttle was ever flown, and the Shuttle was flown as a
glider several years before it was ever launched from the Cape.

Each software build may have a short schedule, but you need the
software from build 1 to still mean the same thing when build 27 (or
build 103) gets tested for the first time in the air, and when build
34 goes into the first production aircraft.

Robert I. Eachus

with STANDARD_DISCLAIMER;
use STANDARD_DISCLAIMER;
function MESSAGE (TEXT: in CLEVER_IDEAS) return BETTER_IDEAS is...
