
Why don't people like to conduct req./design/code inspections?


sl...@imap1.asu.edu

Mar 23, 2000
Hi,

First of all, this is not a homework assignment; it is something
that I have been investigating lately.

I do not work in industry, but I would like to know what aspects of the
inspection process you do not like (or find useful). I do know
inspections can be quite effective in detecting defects, but many
organizations do not use them for various reasons. I am reading articles
on inspection, but I do not feel that they provide the insight that
those of you who work in software development on a daily basis can
provide.

Thanks in advance,
Stephanie

JRStern

Mar 23, 2000
On 23 Mar 2000 03:29:31 GMT, sl...@imap1.asu.edu wrote:
>I do not work in industry, but I would like to know what aspects of the
>inspection process you do not like (or find useful). I do know
>inspections can be quite effective in detecting defects, but many
>organizations do not use them for various reasons. I am reading articles
>on inspection, but I do not feel that they provide the insight that
>those of you who work in software development on a daily basis can
>provide.

Well, first of all, many organizations do not use anything. They just
throw six programmers in a room and expect them to be keying in code
all day long. Sometimes these same organizations will hold long
department meetings where they *talk* about doing something --
anything -- to establish some kind of defined process, but they never
follow up.

Is there something about inspections that makes them less used than
practically any other aspect of development that doesn't involve
keying and compiling? I doubt that many fewer people do inspections
(do you mean walkthroughs with several people present, or individual
developers sitting and reading their own code? not sure it matters,
just wondering) than do, say, written requirements.

Another question is whether inspections of code, which is what I
assume you mean in one flavor or another, are, as you have stated,
actually effective at detecting defects. It is a "white box",
bottom-up tactic, where generally I favor top-down, "black box"
approaches. Humans do not read code well, even well-structured and
documented code. If there is a *theoretical* reason not to use
inspections, I'd say that would be it.

Joshua Stern
JRS...@gte.net


David Pedlar

Mar 23, 2000
sl...@imap1.asu.edu wrote:
>
> Hi,
>
> First of all, this is not a homework assignment; it is something
> that I have been investigating lately.
>
> I do not work in industry, but I would like to know what aspects of the
> inspection process you do not like (or find useful). I do know
> inspections can be quite effective in detecting defects, but many
> organizations do not use them for various reasons.


Generally, people do not like their work being criticised.
Thus inspections can often be painful.
The problem is made worse by there being no universally accepted
criteria for what makes a good program. Therefore there will always be
scope for argument.

If the inspections limited themselves to looking for actual bugs
rather than finding problems with the style of the work, that
would be more pleasant but perhaps less effective.

I once attended an inspection of a document where again there
was plenty of argument, and the author (victim!) became visibly
stressed.
Part of the problem may have been that the purpose of the
document was not adequately defined beforehand.

To improve the efficiency of code reviews I suggest that
firms first create coding standards that all the programmers
agree on. There also needs to be agreement on the relative
importance that should be given to efficiency as against
readability in the code.

D.Pedlar
d...@ftel.co.uk My views only.

Ken Foskey

Mar 23, 2000
sl...@imap1.asu.edu wrote:
>
> I do not work in industry, but I would like to know what aspects of
> the inspection process you do not like (or find useful). I do know
> inspections can be quite effective in detecting defects, but many
> organizations do not use them for various reasons. I am reading
> articles on inspection, but I do not feel that they provide the
> insight that those of you who work in software development on a
> daily basis can provide.

The theory. Inspections performed at a measured pace that carefully
work over the product (source or document) can remove a lot of
problems before they become major problems.

The practice. Thou shalt do inspections: go into that room and
inspect! The results degrade into a major battle over stylistic issues,
and real problems are not removed. Management declares that it was a
waste of time, and it is abandoned.

The practice 2. Some bright spark reads about it; since they do not
want to waste time, they inspect after testing has taken place. Since
the product is already tested, there is a big barrier to any change
that is not a serious problem.

To make inspection work you need:

1. A checklist.
2. Preparation: document trivial issues before you walk in.
3. A slow, metered pace that targets defects.
4. A cycle to ensure defects are corrected.

I use inspection as a tool extensively. It must be done very early; I
do it immediately after a clean compile. The formal stuff does not
need to be onerous: simply handwrite notes in the meeting and mark up
trivial issues on the listing itself. Use the notes on the next cycle
to ensure that everything is complete.
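
To make point 4 concrete, here is a minimal sketch of the kind of
defect log that drives the correction cycle -- Python, with names I
invented purely for illustration (it is not from any real tool):

from dataclasses import dataclass, field

# Illustrative only: invented names, not a real inspection tool.
@dataclass
class Defect:
    location: str        # file and line (or page) in the work product
    description: str
    severity: str        # "major" or "minor"
    fixed: bool = False

@dataclass
class InspectionLog:
    work_product: str
    defects: list = field(default_factory=list)

    def record(self, location, description, severity="minor"):
        self.defects.append(Defect(location, description, severity))

    def open_defects(self):
        # The items to re-check on the next inspection cycle.
        return [d for d in self.defects if not d.fixed]

log = InspectionLog("billing.c")
log.record("billing.c:120", "loop bound off by one", "major")
print(len(log.open_defects()))   # 1 -- still needs a fix-and-recheck pass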

I have a document on inspection (Word 6 format), with coding
checklists, on my web site if you are interested.

Thanks
Ken Foskey
http://www.zipworld.com.au/~waratah/

Chuck Dillon

Mar 23, 2000

We have done some experimenting with code inspections recently
(our own variation of Fagan's method). One of the primary drivers
for doing them is to try to transfer experience from more practiced
engineers to less experienced ones. As the most senior engineer
in the group I have been involved in all of the inspections.

Since we started doing this I have asked myself how I feel about
the act of inspecting code, and how I feel about it being part of
my job.

At one level I recognize the potential value of inspections, if done
correctly, and I want to support doing them. But at another level I
don't enjoy doing them at all.

What I'm not sure of is what I don't like about them. My best guess
is that it is because much of what I do is automated. I have tools
for building code and debugging it. When you ask me to inspect code
and give me no tools, I feel like I've been asked to read by
candlelight. I get a strong urge to write a perl/awk/sed/anything!
script to run the code through.
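
Something like this quick Python sketch is what I have in mind -- a
crude, illustrative pre-inspection pass over a C file (the patterns
are my own guesses at what a team might flag, not a real tool):

import re, sys

# Mechanical things a reviewer should not have to find by eye.
# The checks themselves are invented examples.
CHECKS = [
    (re.compile(r"\bTODO\b|\bFIXME\b"), "leftover TODO/FIXME marker"),
    (re.compile(r"\bgoto\b"),           "goto -- flag for discussion"),
]

def pre_inspect(path):
    findings = []
    with open(path) as src:
        for lineno, line in enumerate(src, start=1):
            if len(line.rstrip("\n")) > 79:
                findings.append((lineno, "line longer than 79 characters"))
            for pattern, message in CHECKS:
                if pattern.search(line):
                    findings.append((lineno, message))
    return findings

if __name__ == "__main__":
    for lineno, message in pre_inspect(sys.argv[1]):
        print(f"{sys.argv[1]}:{lineno}: {message}")

The point is only that the mechanical part of an inspection can be
scripted, leaving the humans to judge design and intent.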

Of course there are other reasons for concern like the potential for
resentment and other interpersonal conflicts that might arise. But
I'm less concerned about those because I expect professional engineers
to act like grown ups (old fashioned attitude ;-).

On the other hand, if we talk about inspecting documents (which we
haven't done in a Fagan context) it seems natural to me. When I get
a requirements document, what I do with it is inspect it. That is the
norm. I have to inspect it in detail and determine how I can apply
the tools I have to satisfying those requirements. A design document
would be the same. They are prose, and as an engineer my job is to
systematically inspect them.

-- ced

sl...@imap1.asu.edu wrote:
>
> Hi,
>
> First of all, this is not a homework assignment; it is something
> that I have been investigating lately.
>
> I do not work in industry, but I would like to know what aspects of the
> inspection process you do not like (or find useful). I do know
> inspections can be quite effective in detecting defects, but many
> organizations do not use them for various reasons. I am reading articles
> on inspection, but I do not feel that they provide the insight that
> those of you who work in software development on a daily basis can
> provide.
>
> Thanks in advance,
> Stephanie

--
Chuck Dillon
Senior Software Engineer
Genetics Computer Group, a subsidiary of Oxford Molecular

Samuel Adams

Mar 23, 2000
I remember reading an article in the Communications of the ACM last
year about code reviews (inspections). They reviewed research on the
effectiveness of inspections and found that formal reviews were not
that effective BUT that informal -- hey, I'll look that over at lunch --
reviews were very effective. I have also seen that.

Comments?

Bill Kilgore

Mar 23, 2000
In article <38DA45A2.5045331D@aboo_niehs.nih.gov_del>,
Samuel Adams <Bug_adams5@aboo_niehs.nih.gov_del> wrote:
> I remember reading an article in the Communications of the ACM last
> year about code reviews (inspections). They reviewed research on the
> effectiveness of inspections and found that formal reviews were not
> that effective BUT that informal -- hey, I'll look that over at lunch --
> reviews were very effective. I have also seen that.
>
> Comments?
>

I'll take a stab:

Formal inspections are very resource-intensive. Informal inspections are
much less so.

Both provide a very powerful incentive to get the code right the first
time. IMHO, about 90% of the goodness of inspections comes from knowing
beforehand that someone else will be giving your code a thorough review;
the other 10% goodness comes from finding actual bugs.

So if an informal review finds many fewer actual bugs but costs
half as much and provides roughly the same up-front incentive, the end
result is a bigger bang per buck.
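
To put rough numbers on that (the figures below are invented purely
to illustrate the arithmetic, not measured from any project):

# Hypothetical figures, chosen only to make the bang/buck point.
formal_bugs,   formal_cost   = 20, 40.0   # bugs found, person-hours
informal_bugs, informal_cost = 12, 20.0   # fewer bugs, half the cost

print(formal_bugs / formal_cost)      # 0.50 bugs found per hour
print(informal_bugs / informal_cost)  # 0.60 bugs found per hour

The informal review finds fewer bugs in absolute terms but more per
hour spent, which is the bang/buck argument above.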

Another point -- way back when I was learning how to do inspections,
the idea was continuously stressed that formal inspection is a damned
expensive proposition. The ONLY way to make it pay off is to apply the
Deming principles of instrumenting the process (inspections and
inspection results), then feeding the results back to enhance the
process, with the goal of eliminating problem creation upstream rather
than catching the problems downstream. This eventually allows you to
cut way back on the expensive part, the inspections.

Unfortunately, it's an extremely hard sell to get resources devoted to
analyzing code inspection results and developing process changes.
Without that feedback loop, the cost of inspection has to be justified
only against the cost of not finding the problems that the inspection
does find. Thus inspection becomes much easier to justify early in
the development process, for requirements, spec and design documents,
because problems not found at those stages quickly snowball in cost.

Finally, a very persuasive argument can be made that code inspection is
a difficult, one-shot deal for humans (reading by candlelight, as
someone said). Compilers and comprehensive testing are much better at it
than we are, and they can be repeated with a keystroke.


I'm a big believer in:
1) formal inspection of project documents
2) informal review and intensive (repeatable) testing of code

--
Bill Kilgore (DEC-certified inspection moderator)
------------------------------------
Share your life -- be an organ donor
------------------------------------



Michael Edwards

Mar 23, 2000
Having just gone through a second inspection facilitator training, I'll
add:

I don't think anybody, deep down, likes seeing all the blemishes in
their work. This is an area that's critical to control - to make sure
that comments are mature, that the environment is generally supportive,
and that the author knows that it's the work being criticized, not
him/herself.

However, this is the same reason that inspections (formal or not) work
well - you don't like to see blemishes in your own work, so you tend to
subconsciously mitigate or ignore them.

I think some people hate to take time away from "regular" work to do
inspections. It's laborious work, and if the inspection isn't
for something that your team is working on, it can feel like time
is being stolen from you. This is especially true if taking time out
for inspections isn't taken into account by those in charge of
schedules.

Speaking of which, it also seems that the schedulers, if they don't
take inspections into account from the very start, tend to dislike
inspections, as they see them as adding development time to an already
too-tight schedule.

Skipping, reducing, or downgrading formal inspections is an easy way
to shave time. This is the same thinking, however, that would cut
testing time in order to meet a schedule. While sometimes such things
do have to be done, if it's done on the hope that "the
requirements/design/code probably don't have many problems", it is
undoubtedly wrong. Published statistics show that this is the worst
kind of false savings -- being penny wise and pound foolish.

As for formal vs. informal inspections, I think that depends largely on
your organization. If your inspectors prep just as hard, and play off
each other's ideas just as well, in an informal setting, then sure,
formal inspections are just going to waste more time without much of a
benefit (discounting keeping track of metric information).

However, since you don't have any way of knowing this for sure unless
you keep metrics for both informal and formal inspections, it's hard to
say. The only published reports I've heard of indicate that formal
inspections do find more issues, albeit at a higher cost per issue. They
also indicate that the cost difference is more than made up by savings
down the line (except possibly in code inspections).

Of course, I'd assume that most published sources are from
organizations where inspections were well implemented, so who knows
what the returns are when they aren't so well implemented. Also, since
I just went through training again, I'm a little gung-ho on the issue,
if you can't tell... ;-)

--
Mike Edwards - Shoreline, WA, USA

Andrew Gabb

Mar 23, 2000
sl...@imap1.asu.edu wrote:
> I do not work in industry, but I would like to know what aspects of the
> inspection process you do not like (or find useful). I do know
> inspections can be quite effective in detecting defects, but many
> organizations do not use them for various reasons. I am reading articles
> on inspection, but I do not feel that they provide the insight that
> those of you who work in software development on a daily basis can
> provide.

The usual reason that they are not used is that they are not REAL
work, i.e. not design and code, and therefore could not possibly
contribute to the value of the product. Moreover, they get in the
way of REAL work. Most quality improvement measures, including
defined processes, fall into the same category.

Alternatively, many that have tried inspections have run them in a
half-hearted way, without training or even proper briefing for
inspectors, authors, or moderators, leading to personality clashes
and everyone's time being wasted. Badly managed inspections are
probably worse than no inspections.

For organisations that do not understand quality, do not measure
quality (or rather, quality indicators), do not have a commitment to
quality, do not appreciate the cost of defects, and complain instead
about the cost of quality, inspections ARE a waste of time.

If you do introduce inspections, consider using them for most
internal products including plans, analysis, design and test
documents.

Andrew
--
Andrew Gabb
email: ag...@tpgi.com.au Adelaide, South Australia
phone: +61 8 8342-1021
-----

Chris Helck

Mar 23, 2000

We've started to do formal inspections at my company, and while in
theory they're a good idea, in practice they're often a waste of time.
Here are the problems I've seen:

1.) I'm asked to inspect a design or piece of code out of context. Since
I don't really know what the thing is supposed to do I can't make
intelligent comments, I can only guess. It's a lame experience for
everyone. To get the proper context would require lots of extra work and
research (perhaps a whole week).

2.) A lot of the issues that arise are opinion, not fact. I'll say "This
will be hard to maintain in the future", someone else says "No, it's more
object-oriented the way it is!" Who's right?

3.) Even the best inspectors miss obvious things. Two weeks after a good
inspection I'll find a dumb and obvious error that the inspectors and I missed.
It makes me wonder if the process is very effective.

4.) We have big disagreements about what a design artifact looks like: some want
UML diagrams, some want text, some want CRC cards, and some want oral
presentations.

5.) We haven't allocated enough time in the schedule for them. It's easy
to see them as wasting valuable time.

6.) We're collecting defect counts and rates as per PSP. Many people think PSP
is a joke and resent the bean counting.

7.) We're supposed to inspect a design or piece of code when it is done,
but they're never done. A month after an inspection I'll ask someone
about their piece and they'll say "Oh yeah, I had to rewrite it because
of this and that." Makes me feel like the inspection was a waste of time.

I think some of our problems are due to our inexperience. I'm glad to say
I don't see too many ego problems. Most of us are mature enough to admit
we make mistakes and want help finding them. I think some of the problems
arise from the slow, infrequent, and delayed feedback that formal
inspections give. It often takes two weeks to schedule an inspection; if
you're doing rapid development that's just too long.

Personally I like doing reviews in the form of pair programming as
advocated by XP. We've done this at work in a more limited context and it
seems to be more effective and more fun. Unfortunately it's a very hard
idea to sell. The bottom line seems to be that the more people who look
at the design or code, the more bugs they'll find.

Regards,
C. Helck


Jay Miller

Mar 24, 2000
<sl...@imap1.asu.edu> wrote in message news:8bc32r$4jp$1...@news.asu.edu...

> Hi,
>
> First of all, this is not a homework assignment; it is something
> that I have been investigating lately.
>
> I do not work in industry, but I would like to know what aspects of the
> inspection process you do not like (or find useful). I do know
> inspections can be quite effective in detecting defects, but many
> organizations do not use them for various reasons. I am reading articles
> on inspection, but I do not feel that they provide the insight that
> those of you who work in software development on a daily basis can
> provide.
>
> Thanks in advance,
> Stephanie

My company has had great success with inspections over the last couple
of years. We have a good set of guidelines and use individual inspections
instead of group inspections. All steps of the development cycle are
inspected: business analysis, specs, design docs, code, manuals --
everything gets inspected. We have checklists which correspond to the
appropriate stage in development to help catch problems early. Our
inspections cover adherence to standards, style, correctness, and
efficiency.

All of our inspections are done by peers within the group. Simply knowing
that someone else is going to critique your work has improved our work. It
has also kept egos out of the way. Since introducing inspections we have
been better able to meet deadlines and deliver a quality product. The
number of bugs found by our QA team and the length of their testing have
been reduced significantly.

Are inspections my favorite part of my job? Not by a long shot, but with a
streamlined process they don't take too much time and I am much more
confident in the product when it goes to QA or production. That alone makes
it worth the effort.

Jay

Chris Game

Mar 24, 2000
In an earlier post, Ken Foskey said.....

....


> To make inspection work you need:
>
> 1. A checklist.
> 2. Preparation: document trivial issues before you walk in.
> 3. A slow, metered pace that targets defects.
> 4. A cycle to ensure defects are corrected.

....

I'd add:
5. Standards -- for the code or doc being inspected;
6. Agreed/approved source documents;
7. Training (particularly for the moderator -- for process issues -- and
the reader);
8. Records -- both for the product quality trail and for process
improvement.

As the title rather implies there are different problems for different
categories of work-product.

--
===============================================

Chris Game <chri...@bigfoot.com>
===============================================

Ken Foskey

Mar 25, 2000
Chris Helck wrote:
>
> We've started to do formal inspections at my company, and while in
> theory they're a good idea, in practice they're often a waste of
> time. Here are the problems I've seen:
>
> 1.) I'm asked to inspect a design or piece of code out of context.
> Since I don't really know what the thing is supposed to do I can't
> make intelligent comments, I can only guess. It's a lame experience
> for everyone. To get the proper context would require lots of extra
> work and research (perhaps a whole week).

This is an issue; a good checklist will help here, but you really need
to be involved in the project, even if only on a peripheral basis.

>
> 2.) A lot of the issues that arise are opinion, not fact. I'll say
> "This will be hard to maintain in the future", someone else says "No,
> it's more object-oriented the way it is!" Who's right?

This is an issue. A formal inspection focuses on black-and-white
issues. Comments such as the above might be made and curtailed quickly
by the moderator. The stylistic issues tend to flatten out to an agreed
standard over time; some differences are tolerated, some are negotiated
without formal intervention. Isn't it better that these issues are
dealt with openly?

> 3.) Even the best inspectors miss obvious things. Two weeks after a
> good inspection I'll find a dumb and obvious error that the
> inspectors and I missed. It makes me wonder if the process is very
> effective.

Weeks after a program went into production a bug is found. The bug is
obvious and testing should have picked it up. Do we stop testing?

> 4.) We have big disagreements about what a design artifact looks
> like: some want UML diagrams, some want text, some want CRC cards,
> and some want oral presentations.

Stylistic issues again. These issues will resolve themselves over
time.

> 5.) We haven't allocated enough time in the schedule for them. It's
> easy to see them as wasting valuable time.

Studies show that they save time in the long run.

> 6.) We're collecting defect counts and rates as per PSP. Many people
> think PSP is a joke and resent the bean counting.

If the data is used, then it is useful. If you are just collecting it,
then stop collecting.

> 7.) We're supposed to inspect a design or piece of code when it is
> done, but they're never done. A month after an inspection I'll ask
> someone about their piece and they'll say "Oh yeah, I had to rewrite
> it because of this and that." Makes me feel like the inspection was
> a waste of time.

I inspect every change. If it needed major rework then perhaps you
should review what you missed and add it to your checklist.

>
> I think some of our problems are due to our inexperience. I'm glad to
> say I don't see too many ego problems. Most of us are mature enough
> to admit we make mistakes and want help finding them. I think some of
> the problems arise from the slow, infrequent, and delayed feedback
> that formal inspections give. It often takes two weeks to schedule an
> inspection; if you're doing rapid development that's just too long.
>
> Personally I like doing reviews in the form of pair programming as
> advocated by XP. We've done this at work in a more limited context
> and it seems to be more effective and more fun. Unfortunately it's a
> very hard idea to sell. The bottom line seems to be that the more
> people who look at the design or code, the more bugs they'll find.

I treat inspections as a high-priority task. I take less than a day to
turn them around and expect the same in return. If you leave it too
long then people are tempted to test the programs, and then they are
wasting their time.

Thanks for a good summation of the problems with inspection. Hang in
there: once you are used to inspections, you will feel naked without
them.

J.M. Ivler

Mar 26, 2000
David Pedlar <D.Pe...@ftel.co.uk> wrote:
> To improve the efficiency of code reviews I suggest that
> firms first create coding standards that all the programmers
> agree on.

Okay, let's start with the simple fact that a "code review" is not an
inspection. The inspection process is very detailed (when you use a
regular process like Fagan inspection). There are set criteria that must
be in place before an item can be inspected. There are roles that each
person in the inspection process plays. Without these roles,
responsibilities and starting documents in place there is NO inspection.

Code reviews are totally ineffective. Why? Because they focus on whether
the code does what the author intended; they don't ensure the code
matches the design, or that the design met all the requirements, or that
the requirements were complete. In other words, you are validating that
coding styles were followed and that there appear to be no functional
defects in the stated intent. That doesn't ensure a valid piece of
software that will meet the requirements.


J.M. Ivler

Mar 26, 2000
Chuck Dillon <dil...@gcg.com> wrote:
> On the other hand, if we talk about inspecting documents (which we
> haven't done in a Fagan context) it seems natural to me. When I get
> a requirements document, what I do with it is inspect it. That is the
> norm. I have to inspect it in detail and determine how I can apply
> the tools I have to satisfying those requirements. A design document
> would be the same. They are prose, and as an engineer my job is to
> systematically inspect them.

Alas, you look at the design doc and "inspect" it. It came from a
requirements document, but was the requirements document inspected? If
not, then you can rest assured that defects from the requirements made it
into the design. Was the design document inspected against the
requirements document? Have you determined that the design document
didn't make a whole bunch of assumptions that were not in the
requirements? How are you assured that the design is valid and defect
free? Then you get to inspect the QA test plan against the requirements.
You do ensure that the test plan is written to the requirements document,
not to the design or code, right?

I'm really quite glad that you think you do inspection, but chances are
all you are doing is reading the document and validating that it looks
defect free based on what it says, disconnected from the process of
creating it. You aren't doing an inspection. You are doing a document
review, and that is almost as big a waste of time as a code
review. None of this stuff is created in a vacuum, and to review it in
that way makes the entire process nothing more than a feel-good exercise
for those involved.

While I cut your initial part, which talked about automating the process
with code, that's great if all you are doing is looking at coding
style. You can even do that if you are looking for things like
uninitialized variables, etc. But in an inspection you are looking for
DEFECTS, and those are generally typed as major or minor. The ones that
tools will catch are "minor" defects. Major defects are instances where
the programmer added a new feature that wasn't in the design (and it may
have been a defect of the design, but the answer is to go back and fix
the design, not add a potential trouble spot into the code) or didn't
effectively cover all of the design. You inspect CODE against DESIGN, and
that can't really be automated, as one is generally paragraphs of text
and the other is logic flows that represent those paragraphs.
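
As an illustration of the "minor" class that tools can catch, here is a
toy Python sketch (my own, not a real checker) that flags names read in
a function but never assigned, imported, or defined anywhere in the
file -- likely typos or missing initialisers. Catching a missing design
feature this way is hopeless; that remains the inspector's job.

import ast, builtins

def suspect_names(source):
    # A deliberately naive sketch: it ignores closures, star-imports,
    # attributes, comprehension scopes, and plenty more.
    tree = ast.parse(source)
    known = set(dir(builtins))
    for node in ast.walk(tree):
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            known.update(a.asname or a.name.split(".")[0] for a in node.names)
        elif isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
            known.add(node.id)
        elif isinstance(node, (ast.FunctionDef, ast.ClassDef)):
            known.add(node.name)
    report = []
    for func in (n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)):
        known.update(a.arg for a in func.args.args)
        for node in ast.walk(func):
            if (isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load)
                    and node.id not in known):
                report.append((func.name, node.id, node.lineno))
    return report

print(suspect_names("def f(x):\n    return x + totl\n"))
# [('f', 'totl', 2)] -- 'totl' is never assigned anywhere: likely a typo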


J.M. Ivler

Mar 26, 2000
Chris Helck <pp00...@mindspring.com> wrote:
> 1.) I'm asked to inspect a design or piece of code out of context.

Stop right there. You can't "inspect" an item without looking at the
items that generated it. If you are inspecting code and not inspecting
it back against the requirements, then you aren't inspecting it, you're
doing a code review (which is a waste of time). All the rest of your
points fall under this same answer: you are doing code reviews, not
inspections.

Inspections start at the beginning: the requirements. To inspect
something like code that was generated from something like a design doc
that wasn't inspected just validates that you coded to the defects that
you didn't catch to begin with.

> 7.) We're supposed to inspect a design or piece of code when it is done,
> but they're never done. A month after an inspection I'll ask someone
> about their piece and they'll say "Oh yeah, I had to rewrite it because
> of this and that." Makes me feel like the inspection was a waste of time.

Once an item has been inspected, it is DONE. That's it. If there is a
need to recode the item then it MUST go through inspection again.

> I think some of our problems are due to our inexperience.

BINGO! May I suggest you do a very small project with a complete
inspection process. I personally recommend Fagan, as I am a trained
Fagan moderator, but you can pick the one that best suits your needs.
Once you have done a complete project with the process you will
understand the value of doing all projects that way. Once you see the
first round of "maintenance" or "enhancement" to something that has been
through a complete inspection, you will truly see the value of the
process.

Don't believe it? Item 7 above is about rework. Rework happens because
defects weren't caught early enough. If the code passes inspection
against the design, and the design passes inspection against the
requirements, then the chances of rework drop dramatically. The same is
true for maintenance.

jmi
http://www.canitech.com/


J.M. Ivler

Mar 26, 2000
Jay Miller <jay_m...@hotmail.com> wrote:
> instead of group inspections. All steps of the development cycle are
> inspected: business analysis, specs, design docs, code, manuals --
> everything gets inspected.

BINGO! You do it right.

> Since introducing inspections we have
> been better able to meet deadlines and deliver a quality product. The
> number of bugs found by our QA team and the length of their testing
> have been reduced significantly.

Whoa -- defects caught early reduce the cost of QA testing, and the
rework of catching them later in the process. That means that the
development cycle actually DECREASES when inspections are done properly.

Consider the following: a defect is created in the requirements document
and then is "caught" as a bug in the code in QA. To correct this defect
properly there has to be a review of the requirements, which will then
have to be modified. This will necessitate a change in the design that
may affect a number of pieces of code. Each piece of code will have to be
modified to ensure that the design changes were caught, which may create
additional new defects in all the code that had to be changed. Which now
all has to be re-inspected.

WOW! That could get expensive if you only caught a few defects in the
QA test phase. But when was the last time you saw a large project with
just "a few defects"? :-)
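
A crude model of that cascade, as a Python sketch (the multipliers are
the oft-quoted rule of thumb that fix cost grows roughly tenfold per
phase, not data from any particular project):

# Relative cost to fix a defect, by the phase in which it is caught.
# Rule-of-thumb values, assumed purely for illustration.
COST_MULTIPLIER = {"requirements": 1, "design": 10, "code": 100, "qa_test": 1000}

def relative_fix_cost(phase_injected, phase_caught):
    # The later a defect is caught relative to where it was injected,
    # the more artifacts (requirements, design, code, tests) must be
    # reworked and re-inspected.
    return COST_MULTIPLIER[phase_caught] / COST_MULTIPLIER[phase_injected]

print(relative_fix_cost("requirements", "requirements"))  # 1.0
print(relative_fix_cost("requirements", "qa_test"))       # 1000.0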

jmi
http://www.canitech.com/

Chuck Dillon

Mar 27, 2000

If you look at the context of this thread you will see that my comments
were in the context of how I *feel* about inspecting documents written
for human consumption versus code written for compiler consumption. I
was not making any kind of statement about whether or not to do formal
'inspections'.

I don't want to get into an argument about semantics. Since my
parenthetical statement says I am not talking about formal inspections,
I'd hoped it would be clear that my usage of the word 'inspection'
should be interpreted as per your favorite dictionary.

I largely agree with your comments, but they are out of context.
I don't agree with your assertion that reviewing documents is a 'waste
of time'. It has unfortunately been my experience that the folks
involved in developing requirements are reluctant at best to take
responsibility for the task. So it has been my experience that you have
to closely *review* their work and prod them like hell if you want any
kind of a complete document that can be used going forward. If you
don't, it is the exercise of bothering to write a lame requirement that
is a waste of time. In that context I don't think reviews are a waste of
time, and it's asinine to think that the org. is mature enough to use
formal inspections instead.


>
> While I cut your initial part, which talked about automating the process
> with code, that's great if all you are doing is looking at coding
> style. You can even do that if you are looking for things like
> uninitialized variables, etc. But in an inspection you are looking for
> DEFECTS, and those are generally typed as major or minor. The ones that
> tools will catch are "minor" defects. Major defects are instances where
> the programmer added a new feature that wasn't in the design (and it may
> have been a defect of the design, but the answer is to go back and fix
> the design, not add a potential trouble spot into the code) or didn't
> effectively cover all of the design. You inspect CODE against DESIGN, and
> that can't really be automated, as one is generally paragraphs of text
> and the other is logic flows that represent those paragraphs.

Not surprisingly, you are missing the context of the thread. When I am
asked to inspect code I *feel* like I want to automate it. I'm not
recommending that it be done that way.

-- ced

James L. Scheff

Mar 27, 2000
Most managers and organizations will not spend resources (money, time) up
front to reduce costs down the road. Inspections are expensive (several
people, several hours). Theoretically, they will pay for themselves, but
only down the road.

There are automated tools to locate problems in code, both statically and at
run time. Tools like PC-Lint, Code Wizard, BoundsChecker, Purify, etc., but
what percentage of developers use even those tools?

Ken Foskey

Mar 28, 2000
Chuck Dillon wrote:
>
> I largely agree with your comments, but they are out of context.
> I don't agree with your assertion that reviewing documents is a
> 'waste of time'. It has unfortunately been my experience that the
> folks involved in developing requirements are reluctant at best to
> take responsibility for the task. So it has been my experience that
> you have to closely *review* their work and prod them like hell if
> you want any kind of a complete document that can be used going
> forward. If you don't, it is the exercise of bothering to write a
> lame requirement that is a waste of time. In that context I don't
> think reviews are a waste of time, and it's asinine to think that the
> org. is mature enough to use formal inspections instead.

I agree, a review is not a waste of time. In order to make the review
work you have stated that you need to look at it closely; this borders
on inspection without the formality.

There are some (many) definitions for inspections; here are the ones I
work with:

walk through - an informal browse of the code.

review - high level presentation to users and peers not directly
involved.

Inspection - a careful detailed look at the product without paperwork.

Formal inspection - as per above with a lot more focus on extracting
defects, using checklists to ensure you have not missed anything,
using a log to ensure that all issues are resolved and not lost.

Inspections can be very effective; they must be approached in the
right way, however. The 'duty of care', as lawyers put it, must be
foremost in your mind. Most juniors (less than 5 years' experience) do
not fully understand this. A lot of senior people appear to have missed
it as well.

J.M. Ivler

Mar 28, 2000
Ken Foskey <war...@zip.com.au> wrote:
> I agree, a review is not a waste of time.

I hate it when Ken and I disagree...

> There are some (many) definitions for inspections; here are the ones I
> work with:
> walk through - an informal browse of the code.

Use computer-based syntax validators. They are faster and better than a
"walk through".

> review - high level presentation to users and peers not directly
> involved.

For the purpose of? Unconnected to the pre-process items (which should
have been inspected), the review has no real function. If connected to
the items that came before, how does it differ from an
inspection? Isn't the goal of this process to generate a list of
defects? Isn't the goal to use objective criteria to find those
defects? What benefit does this have over an inspection?

> Inspection - a careful detailed look at the product without paperwork.

Checking of more than syntax, form and structure? What is being
validated by this process? That the code does what the comments inside
say it does? The power of an inspection is the power of the process.
That process has specific roles and rules. The roles allow people to
stand outside their normal point of view; this forces them to actually
inspect the item, not just give it a glance-over. The rules are there to
ensure that the process is *not* subjective.

In most inspections that are not run with a well-defined set of rules
and roles, you see the inspections fail as people do a minimal review of
the item to be inspected and then make comments that are based on
subjective criteria. A defect is *NOT* subjective. The measurement
(major/minor) may be subjective in early inspections, as the teams
create the rules for what constitutes a major or minor defect, but what
is a defect is *not* subjective.

> Formal inspection - as per above with a lot more focus on extracting
> defects, using checklists to ensure you have not missed anything,
> using a log to ensure that all issues are resolved and not lost.

Every inspection should be focused on locating defects, logging defects
and grading each defect. Every inspection should also include a session
where defect analysis is performed. Was this defect put in through a
defect in the development process? Defects don't happen in a
vacuum. Defects come from either sloppy work or sloppy processes. The
goal of the inspection is to ensure that both are caught. The goal of
the defect analysis portion of the inspection process is to ensure that
the development process is made better, so there will be less chance of
a defect in the future.

If a company is doing inspection to make sure the code is defect free, and
not doing defect analysis, they are missing the greatest benefit from the
inspection process, that of CANI (Constant And Never-ending
Improvement). If there is no defect analysis then there is no learning
process from our mistakes, and if we fail to learn from them we are bound
to repeat them.

jmi
http://www.canitech.com/
[Yes, the CANI in canitech comes from the above belief system. We all
have processes in our "life" that we follow. Some are more successful
than others. We can learn from the successes, and from the failures
(defects), to create better processes. Inspection is not just a theory
that has to do with software development; it applies to all parts of any
process-based organism, whether that is a person, a company or the
development of a piece of software.]


Matthias Nolle

Mar 28, 2000
I do not like inspections if the inspectors take on the role of the
police. Take the situation where several seniors inspect the work of a
junior. In the end it can happen that the junior simply implements
whatever the seniors tell him to, leaving out his own ideas. This is
neither good for the junior's motivation nor, I think, productive for
the company, since then the seniors might as well have coded everything
themselves.

Matthias

Francis

Mar 29, 2000
I do not know what environment you work in, especially when you say "the
pre-process items (which should have been inspected)". Do you mean that
you really expect that the preceding step of the lifecycle should have
actually been finished?
I have done a lot of code reviewing, this being a normal requirement of
the pharmaceutical industry, which pays a good portion of my income. And
I am working in languages that are mostly quite unlike C or VB (does
Ladder or IEC1131/3 mean anything to the readers of this NG?) and
systems that are quite unlike PCs or, come to that, mainframes.
Anyway, I devote part of my code review to reverse engineering a Software
Design Spec from the code, and most often it is far better than the one the
programmers were using.
You also said "the review has no real function".
I have always found errors during this process, mostly 'minor' (i.e.
cheap to fix, but needing fixing).
I rarely fight with the programmers; more often we share a beer or two.
Francis
www.controldraw.co.uk

J.M. Ivler <iv...@net-quest.com> wrote in message
news:M45E4.8437$Og6.2...@tw12.nn.bcandid.com...

J.M. Ivler

Mar 29, 2000
Francis <nosp...@cwcom.net> wrote:
> I do not know what environment you work in, especially when you say "the
> pre-process items (which should have been inspected)". Do you mean that
> you really expect that the preceding step of the lifecycle should have
> actually been finished?

Inspection (and some would say "formal inspection") requires that each
item be defect free before the next part of the process starts. How can
you be assured that the design of your system is correct if you don't
have complete and defect-free requirement specifications?

Now, some will say that if I get some specification I can start on the
design and when the requirements are completed, then I can validate my
design to the completed and inspected defect free requirements. This
overlap of processes is possible, if one is willing to redo work that had
been designed to a requirement that was then found to be defective and had
to be restructured. In other words, what appears to be a potential cost
savings can in fact be far costlier.

If the process adopted is to develop a defect free requirements
specification, then when the design is done it will have to be done once,
to that defect free specification. The development team can then validate
the design against the requirements specification, as well as standards
that have been deemed appropriate for the design exit criteria. Once the
design has exited defect free, then coding can commence.

While the design process is going on, there is also a team developing a
testing specification from the same defect-free requirements
specification. Since the test plan has been developed to validate and
test the requirements, not the design, the possibility that the product
will work to spec is much higher.

While coding is going on portions of the test plan can be used to validate
that the code works to the requirements specification. Once the code has
been inspected and is defect free the entire inspected test suite is run
against the application to ensure that it performs defect free to spec.

In most shops the test plan is written by the same people that write the
code ("I know how best to test my own software"). This is even true in
ISO9000 shops (document the practice; as long as the process is
documented it doesn't matter if it's good). In the end the test plan
tests that the code operates functionally as it was programmed to do. In
some cases the test plan might actually be written to the design
document (inspected or not). This only validates that the code functions
as designed, not as required.

> Anyway, I devote part of my code review to reverse engineering a Software
> Design Spec from the code, and most often it is far better than the one the
> programmers were using.

But does that mean the code they generated is defect free, and/or that
it meets the requirements? While it's wonderful that you have the talent
and ability to create a design spec from a code component, are you sure
that the code operates as planned, per the requirements spec?

> You also said "the review has no real function".
> I have always found errors during this process, mostly 'minor' (i.e.
> cheap to fix, but needing fixing).

A defect is nothing in some industries. In others it can cause a
multi-million dollar spaceship to disappear. In one of the industries I
worked in it could cause a plane loaded with passengers, like my mom, or
my wife and kids, to go down. Imagine an MD-11 belly tank transferring
fuel to the wet tail and back again. This process happens to ensure that
while the fuel flows, the system maintains the stability of the flight
as well (weight transference). What if there were a simple error in this
code? What if, when a sensor went out, the fuel system started to try to
manage the flow and caused the system to start the pumps one way, over
pump, and then start the pumps the other way, creating an oscillation of
the plane that, like an expanding wave, got worse with each switch of
the flow? One of my greatest nightmares was seeing this reported in the
Risks Digest.

I didn't come to inspection because it made my job easier. It made it
possible for me to sleep easier at night knowing that my family and
families like mine were riding on software that was defect free. And I
stayed with the inspection process after the fact because I saw it save
millions of dollars in software development (even before there was OO it
helped in developing reuse libraries).

jmi
http://www.canitech.com/

Ken Foskey

Mar 29, 2000
"J.M. Ivler" wrote:
>
> Inspection (and some would say "formal inspection") requires that
> each item be defect free before the next part of the process starts.
> How can you be assured that the design of your system is correct if
> you don't have complete and defect-free requirement specifications?
>

<subtle mode>

I am sorry but this is bullshit.

</subtle mode>

1. NO inspection (formal or otherwise) can give you a defect-free
system. It relies on people, who make mistakes the same way the
original documenter or programmer makes mistakes. Requirements are
sometimes so hidden that you cannot dig them out.

2. You may choose to proceed to the next phase despite the known
problems because the deadline or finances or whatever tells you to.

I fully support inspections; they have steered me a true course so
many times, and when they are not done my projects typically have far
more problems. A 95% defect-free production release is a very good
result; 80% is the typical result from non-inspecting projects.
If you want to go to extremes on inspections (e.g. medical), then 99% is
achievable, and expensive.
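
In absolute terms (assuming, purely for illustration, 1000 defects
injected over a project), the Python arithmetic looks like this:

# Invented figure: 1000 defects injected, to show what the rates mean.
injected = 1000
for removal_rate in (0.80, 0.95, 0.99):
    shipped = injected * (1 - removal_rate)
    print(f"{removal_rate:.0%} removal -> {shipped:.0f} defects shipped")

# 80% removal -> 200 defects shipped
# 95% removal -> 50 defects shipped
# 99% removal -> 10 defects shipped

Each step cuts the shipped defects severalfold, and each step costs
more than the last.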

J.M. Ivler

Mar 29, 2000
Ken Foskey <war...@zip.com.au> wrote:
> "J.M. Ivler" wrote:
> >
> > Inspection (and some would say "formal inspection") requires that
> > each item be defect free before the next part of the process starts.
> > How can you be assured that the design of your system is correct if
> > you don't have complete and defect-free requirement specifications?
> >
> <subtle mode>
> I am sorry but this is bullshit.
> </subtle mode>

If you shoot to be good, you will be average. If you shoot to be great you
will be good. If you shoot to be outstanding you will be great. If you
shoot to be perfect you will be outstanding.

Nothing is ever 100%, but if you start by saying that you are willing to
accept 95%, then you have already compromised your ability to ever get
close to 100%. And by shaving a percentage point here or there you can
sooner or later justify that 90% is just as good, because it's so much
better than 80%.

The process isn't perfect. I have held a requirements document that we
had to go back and change even though we thought it was 100% defect
free, because we found defects while in design. But because we had such
a high standard of being defect free, we generated code that I would bet
my life and my family's life on.

> 1. NO inspection (formal or otherwise) can give you a defect-free
> system.

So what percentage of defects do you accept in a life-support system
that you may be hooked up to? How about in an aircraft you're about to
ride across the ocean in?

> I fully support inspections; they have steered me a true course so
> many times, and when they are not done my projects typically have far
> more problems. A 95% defect-free production release is a very good
> result; 80% is the typical result from non-inspecting projects.
> If you want to go to extremes on inspections (e.g. medical), then 99%
> is achievable, and expensive.

<rant>
Why shoot for 95%? Why start with expectations that say that it's okay to
have a 5% failure? When you start an inspection, as a moderator, do you
start by saying "okay folks, I want to get 95% of the defects." or do you
say "we are here to ensure that this item is defect free."? If you aren't
going to give 100%, then how the hell do you expect to get 100%? Pure
luck? It isn't even close to achievable if you don't seek to achieve it!
</rant>

Sorry Ken, we disagree here. I don't accept 95%, and the next time you
get on a plane that was built by Douglas after 1992, remember that not
one of the Fagan moderators believed in 95% defect free as an acceptable
standard for the software that sat on that plane.

jmi
http://www.canitech.com/

Roland Petrasch

Mar 29, 2000
Hi,

concerning your statements:


> Okay, let's start with the simple fact that a "code review" is not an inspection.

> Code reviews are totally ineffective...
I completely disagree. Experience from practice shows that reviews are a
very effective way of detecting defects. Actually, inspections are a
special form of review! An inspection is therefore a "formal review".
Read the book "Software Inspection" by Tom Gilb, Dorothy Graham and
Susannah Finzi. Or take a look at the classic book on testing by
Myers. Balzert (a German author) also describes inspections as "formal
reviews". Where do you get the idea to assert that a review is not an
inspection, despite the fact that the whole world has a different point
of view?

Roland

Roland Petrasch

Mar 29, 2000

Your statement

> Okay, let's start with the simple fact that a "code review" is not an
> inspection. The inspection process is very detailed
implies that reviews are not very detailed. That is not true (IMHO).
Take a look at standards like IEEE Std 1028-1988, the IEEE Standard for
Software Reviews and Audits; there you will find that review processes
are very detailed. Good information is also available here:
http://www.informatik.uni-bremen.de/~uniform/vm97/methods/m-rev.htm

Roland

Chuck Dillon

Mar 29, 2000

"J.M. Ivler" wrote:
>
>...


>
> If you shoot to be good, you will be average. If you shoot to be great
> you will be good. If you shoot to be outstanding you will be great. If
> you shoot to be perfect you will be outstanding.
>
> Nothing is ever 100%, but if you start by saying that you are willing to
> accept 95%, then you have already compromised your ability to ever get
> close to 100%. And by shaving a percentage point here or there you can
> sooner or later justify that 90% is just as good, because it's so much
> better than 80%.

In most of the world, economics plays a large role. An optimal solution
in most situations, from the perspective of the company, is almost
always well short of perfection. If you build a perfect word processor
you won't sell one, because nobody will be able to afford the damn
thing -- unless of course you are willing to sell it at a loss.

The point is that you can't make these broad statements as if they apply
to everybody. They might apply to you. I hope they apply to the folks
building software to control aircraft, space shuttles and life support
systems. But that is a small fraction of the SE world.

J.M. Ivler

Mar 29, 2000
Roland Petrasch <petr...@computer.org> wrote:
> Hi,
>
> concerning your statements:
>
> > Okay, let's start with the simple fact that a "code review" is not an
> > inspection.
> > Code reviews are totally ineffective...
>
> I completely disagree. Experience from practice shows that reviews are a
> very effective way of detecting defects.

Define defect detection. If you are saying that a code review can spot
bad programming practices, bad software development and bad syntax or
logic errors, I won't disagree. But it doesn't come close to allowing
you to create defect-free software. The time spent in review follows the
80/20 rule: you will catch 80% (the gross defects) but will miss the
other 20%. Some of these will show up in testing. If the test plan is
not written by the person creating the code, this number will be greater
than if the coder writes the tests. Fixing a defect caught in test is
more expensive than fixing one caught in code.

But what if the defect caught in test was generated in the initial
requirements? What is the cost of fixing the defect at the test point,
rather than catching and fixing the defect where it originated?

Code reviews are a waste of time because the 20% of the defects that get
through are far too costly.

> Where do you get the idea to assert that a review is not an
> inspection despite the fact that the whole world has a different point
> of view ?

The whole world? I think not. I know that I was trained as a Fagan
moderator, and in that training, way back in '91 I guess, there was even
a slide that said "Inspection != Review". Michael was teaching that
then, and I have yet to hear it taught differently.

J.M. Ivler

Mar 29, 2000
Chuck Dillon <dil...@gcg.com> wrote:
> In most of the world, economics plays a large role. An optimal solution
> in most situations, from the perspective of the company, is almost
> always well short of perfection.

Agreed 100%.

> If you build a perfect word processor
> you won't sell one, because nobody will be able to afford the damn
> thing -- unless of course you are willing to sell it at a loss.

If you were to build a car the way Microsoft builds their operating
system, would you buy and use it? Microsoft is trying to gain share in
the server marketplace, but it can't, because people have Linux servers
that they have had up and running for 200+ days, and no Microsoft
product has that level of quality and reliability. Yes, Microsoft
flourishes, and we all pay for it with the "blue screen of death" and
other fine features of a shit-quality product. Can you get away with
building crap? Yep. Can you become successful with that strategy? Yep.

But watch out or someday someone might just beat your butt with quality
(not everyone wants to pay to drive a quality car, that's why Kia has car
sales in America - some people are always willing to settle for 80%).

> The point is that you can't make these broad statements as if they apply
> to everybody. They might apply to you. I hope they apply to the folks
> building software to control aircraft, space shuttles and life support
> systems. But that is a small fraction of the SE world.

As Microsoft has proved so well.

And how many sites do you return to that give you "404 Not Found" errors?

When given a choice, quality will win, but as you pointed out, it
matters how much that quality costs. The WSJ charges people to access
their web site. Not everyone pays for that. The people who want that
quality in reporting and news do.

jmi
http://www.canitech.com/

Francis

Mar 29, 2000
You said

> Inspection (and some would say "formal inspection") requires that each
> item be defect free before the next part of the process starts. How can
> you be assured that the design of your system is correct if you don't
> have complete and defect-free requirement specifications?
>

Looks hopelessly unrealistic to me; in fact, I thought "defect free" was
recognised as impossible.
And

Keith Collyer

Mar 30, 2000
iv...@net-quest.com (J.M. Ivler) wrote in
<WWbE4.8796$Og6.2...@tw12.nn.bcandid.com>:

>Francis <nosp...@cwcom.net> wrote:
>> I do not know what environment you work in, especially when
>> you say "the pre-process items (which should have been
>> inspected) ". Do you mean that you really expect that the
>> preceding step of the lifecylce should have actually been
>> finished?
>

>Inspection (and some would say "formal inspection") requires
>that each item be defect free before the next part of the
>process start. How can you be assured that the design of your
>system is correct if you don't have complete and defect free
>requirement specifications?

Not necessarily. Read Tom Gilb's Principles of Software
Engineering Management or Software Inspections. Although he
requires that an item be defect free before starting the next
stage, this does not necessarily mean complete. Or, rather, it
only needs to be complete within the bounds of what you are
trying to achieve in the current evolutionary stage.

>Now, some will say that if I get some specification I can
>start on the design and when the requirements are completed,
>then I can validate my design to the completed and inspected
>defect free requirements. This overlap of processes is
>possible, if one is willing to redo work that had been
>designed to a requirement that was then found to be defective
>and had to be restructured. In other words, what appears to
>be a potential cost savings can in fact be far costlier.

But the risks are greatly reduced if sensible bounding is
applied at all points. We have fought for years to get away
from the Waterfall; don't throw all that away! The point is
that you should only proceed on the basis of correctness, but
completeness is not necessary (and, in practice, not often
achievable anyway).

>This is even true in ISO9000 shops (document the practice, as
>long as the process is documented it doesn't matter if it's
>good).

It must not only be documented, but also produce repeatable
results.

Keith Collyer

unread,
Mar 30, 2000, 3:00:00 AM3/30/00
to
nosp...@cwcom.net (Francis) wrote in
<a_aE4.1925$%X3.54946@news1-hme0>:

>Anyway, I devote part of my code review to reverse
>engineering a Software Design Spec from the code, and most
>often it is far better than the one the programmers were
>using.

In what sense is it "better"? Better written, or better
structured, perhaps, but how can you be sure it is a better
match to the requirements or a better way of meeting them? And
that is surely the real test.

Roland Petrasch

unread,
Mar 30, 2000, 3:00:00 AM3/30/00
to

> But what if the defect caught in test was generated in the initial
> requirements? What is the cost of fixing the defect at the test point,
> rather than catching and fixing the defect when it originated?
> Code reviews are a waste of time because the 20% of the defects that get
> through are far too costly.
Oh, god! I think there is no need for me to answer this completely
false statement.

>
> > Where do you get the idea to assert that a review is not an
> > inspection despite the fact that the whole world has a different point
> > of view ?
>
> > The whole world? I think not. I know that I was trained as a Fagan
> Moderator and in that training, way back in what '91 I guess, there was
> even a slide that said "Inspection != Review". Michael was teaching that
> then, and I have yet to hear it to be taught differently.

Then PLEASE: Read the books! Talk to the people! And DON'T GUESS!!!

Michael Edwards

unread,
Mar 30, 2000, 3:00:00 AM3/30/00
to
In article <38E36C12...@computer.org>, Roland Petrasch
<petr...@computer.org> wrote:

>
>> But what if the defect caught in test was generated in the initial
>> requirements? What is the cost of fixing the defect at the test point,
>> rather than catching and fixing the defect when it originated?
>> Code reviews are a waste of time because the 20% of the defects that get
>> through are far too costly.
>Oh, god! I think there is no need for me to answer this completely
>false statement.

What I think he is trying to say is that it's a waste of time, for a
particular bug, to catch it in a code review when you could have caught
it in a requirements review.

I'm not sure just what his last sentence is trying to say, however.

I will mention that my last inspection instructor kept on repeating
that he thought that full-blown code reviews were a waste of time. He
would say "Code reviews are better than no reviews at all, but not by
much."

His idea was that reviews were very cost effective early on in the
process (requirements, specifications). However, having a full review of
code wasn't cost effective in his mind - he preferred the idea of just
doing desk checks (one on one reviews with a peer).

--
Mike Edwards - Shoreline, WA, USA

J.M. Ivler

unread,
Mar 31, 2000, 3:00:00 AM3/31/00
to
Michael Edwards <medward...@jetcity.com> wrote:
> What I think he is trying to say is that it's a waste of time for a
> particular bug, to catch it in a code review, when you could have caught
> it in a requirement review.

Maybe I should hire you as a translator. :-)

> I will mention that my last inspection instructor kept on repeating
> that he thought that full-blown code reviews were a waste of time. He
> would say "Code reviews are better than no reviews at all, but not by
> much."

I agree with him 10,000%

> His idea was that reviews were very cost effective early on in the
> process (requirements, specifications). However, having a full review of
> code wasn't cost effective in his mind - he preferred the idea of just
> doing desk checks (one on one reviews with a peer).

There I disagree a bit. A properly completed inspection of requirements
and design doesn't guarantee working code, but it sure as hell beats any
other method I have ever seen at achieving it. It should be noted that
after an inspection of requirements and design, code seems to go much
faster. The process of code inspection (if you leave the syntax checking
to software that is a lot better at it than a human) goes rather quickly.
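
(Concretely, the entry criterion might be something like "must compile
clean under the compiler's strictest warnings, e.g. gcc -Wall
-pedantic, or pass lint" - the exact tools are just my illustration -
so the meeting never spends a minute on what a tool catches for free.)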

Imagine a test phase that zips along, where the code isn't always going
back to be "fixed" because it wasn't written properly... and where you
never have to wonder whether the code failed or the test plan was bad.

I'll agree that doing an inspection is a pain-in-the-ass. But the cost
of doing one can be recovered many times over if you eliminate defects
in requirements rather than finding them in the field. And please, no BS
about how that only counts with airplanes, life support, etc. Go talk to
some of the people that have to sell the crap you create. It's real
interesting to hear the spin that they put on your unplanned "features".

jmi
http://www.canitech.com/

J.M. Ivler

unread,
Mar 31, 2000, 3:00:00 AM3/31/00
to
Ken Foskey <war...@zip.com.au> wrote:
> I do not disagree with you, you should shoot for perfection with
> these things. Inspectors must understand the law of diminishing
> returns though.

We do; that's why some of the items come back again, because we aren't
perfect and we missed something.

> I totally agree that people must be trained in order to do inspections
> properly. I believe that a common failure is to do the inspections
> too late.

If you didn't "inspect" the requirements, then you haven't done an
inspection. If the requirements were shit, I can promise you that the
resultant software will follow that line just as much as a defect free
requirements spec will generate excellent software if designed and coded
to spec.

> I have 'formal inspections' documented at my place of slavery.

Interesting metaphor. :-)

> The correct procedure is in place and the output is correctly documented
> and the attitude is all wrong. It is just as bad as a 'code review'.
> Formality does not guarantee success, attitude does.

Agreed. Properly trained inspectors. People that understand the roles that
they have been assigned and how to inspect within those roles. A decent set
of well defined entry and exit criteria. All these are needed to make an
inspection work. The real benefit to all this isn't so much "the first
pass through" as it is the process refinement that happens as people learn
what causes the defects and how to avoid them in the future.

jmi
http://www.canitech.com/


Jason Che-han Yip

unread,
Mar 31, 2000, 3:00:00 AM3/31/00
to
"J.M. Ivler" wrote:
<snip>

> resultant software will follow that line just as much as a defect free
> requirements spec will generate excellent software if designed and coded
> to spec.

I'm somewhat wary when I read something like this. What do you mean by
"defect-free requirements spec will generate excellent software if
designed and coded to spec"?

Does this mean the spec will never change? Does this mean that the customer
is absolutely sure that what is written in the abstract spec is actually what
s/he wants the concrete system to do?

Does this mean that design and coding are trivial tasks? What about having to
deal with new technologies?

OTOH, if all you're saying is that excellent software must deliver what the
customer wants, then I don't have any concerns.
--
Graduate Student, SE Lab, University of Calgary,
http://www.sern.enel.ucalgary.ca/yip
----
XP2000 - June 21-23, 2000
eXtreme Programming and Flexible Processes in Software Engineering
http://www.spe.ucalgary.ca/extreme

Ben Kovitz

unread,
Apr 2, 2000, 4:00:00 AM4/2/00
to
Here are a few observations about people's unwillingness to inspect,
review, or even read requirements documents:

Very often they're written by analysts who've done an extraordinary job of
research and attention to detail, but they aren't written in a way that
the customer or programmers can understand. Dutiful adherence to any
rigid methodology in particular leads to "force fits" in which you can't
say simple things in simple ways. For example, writing a description of
data flows between an input "function" and a validation "function" instead
of just saying that data that doesn't meet rules X, Y, and Z is invalid.

When the information isn't expressed in terms that the intended readers
can understand, they seldom speak up. Few people, when presented with 60
use cases that are so vague that no one but the author can tell what they
mean, or weird boilerplate sections about assumptions, dependencies,
inclusions, exclusions, and lots more that doesn't apply to the project at
hand, will say, "I don't understand a word of this." Instead, most people
figure that those sections must have some sort of use, but they don't know
what it is, so it must be someone else's job. So they give the document
an OK, wondering what the whole point of the exercise was. Even customers
often just roll their eyes and sign off on the document, not having a clue
what it says.

The idea that every distinguishable requirement should be a sentence, to
aid traceability, greatly detracts from readability. Information that is
naturally tabular or naturally fits into a diagram has to be written in
the form of tens or even hundreds of "shall" sentences. This obscures
the relationships and underlying simplicity within the requirements--
and harms their comprehensibility. Happily, there's an easier way: you
can just write
(and number) one requirement statement for traceability, and in that
statement just refer to the table or diagram that contains all the messy
details.
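
An invented illustration (not from any real spec) of the single
traceable statement plus table:

    3.4.7  The system shall validate all operator-entered fields
           according to the rules in Table 6.

    Table 6. Field validation rules (excerpt)

    Field         Rule                   On failure
    -----         ----                   ----------
    Account no.   Exactly 10 digits      Reject the entry
    Order date    Not later than today   Reject the entry

One numbered "shall" carries the traceability; the table carries the
messy details.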

Very often, requirements documents contain lots of generic, boilerplate
text that has nothing to do with the specifics of the project. For
example, a lot of requirements templates have big sections that explain
all the desirable attributes of a requirement: atomic, unambiguous,
observable, etc. I've seen requirements documents that practically give
you a beginning course in user-interface design before saying anything
pertinent to the project. Starting with the opening "purpose and scope"
section and spread throughout the document, there's "decoy text" that
doesn't tell the reader anything useful for the project at hand. People
get the feeling that their time is being wasted, so they go to some other
source, like word of mouth, to get the real information.

Very often the requirements document reads like a legal document: aimed at
a hostile reader who is intent on misinterpreting anything he can get away
with, and not at software developers eager to create a useful product.
Consequently the document is so filled with qualifiers and repetitions
that you can't read it without getting a headache.

Very often the requirements document is filled with highly abstract
statements about the software, like "the application shall generate a
graphical representation of facilities data." Without examples, it's hard
for people to understand what that means. A helpful practice is to give a
simple example or even to briefly explain the rationale for the
requirement. That gives the reader a little context within which to see
how the requirement makes sense and how it will play out when the system
goes live. Just think how much clearer the sentence becomes when you
follow it with, "For example, an AutoCAD file showing all the telephone
poles, equipment, and cables superimposed on a map of a neighborhood."

And finally, people often don't trust the requirements document. If it
looks like bureaucratese, then it doesn't matter how good a job the
analyst has done on research, people will assume that it's all going to be
irrelevant when it comes time to do the real programming. The solution to
this is to keep the document focused on real, relevant facts that the
programmers must address when designing the system--things that stimulate
the programmers' imaginations. Then they'll read it.

--
Ben Kovitz <apteryx at chisp dot net>
Author, Practical Software Requirements: A Manual of Content & Style
http://www.amazon.com/exec/obidos/ASIN/1884777597
http://www.manning.com/Kovitz

Ken Foskey

unread,
Apr 3, 2000, 3:00:00 AM4/3/00
to
Ben Kovitz wrote:
>
> And finally, people often don't trust the requirements document. If
> it looks like bureaucratese, then it doesn't matter how good a job
> the analyst has done on research, people will assume that it's all
> going to be irrelevant when it comes time to do the real programming.
> The solution to this is to keep the document focused on real,
> relevant facts that the programmers must address when designing the
> system--things that stimulate the programmers' imaginations. Then
> they'll read it.

On a side trip from this thought: I have seen many templates
containing a list of headings for the poor analyst to fill out. No
instructions, no training, just "write me a spec using this".

I found a requirements template that explains why those mysterious
sections are all in there:

http://www.systemsguild.com/GuildSite/Robs/Template.html

Jyrki Heikkinen

unread,
Apr 4, 2000, 3:00:00 AM4/4/00
to
Keith Collyer wrote <8bva2i$iu5$3...@newsg3.svr.pol.co.uk>...

> Read Tom Gilb's Principles of Software
> Engineering Management or Software Inspections.
> Although he requires that an item be defect free
> before starting the next stage ..

Being defect-free was a goal of "old" inspections.

Nowadays Tom Gilb says: "Representative samples
should provide enough information to decide whether
a document is clean enough to exit at, for example,
'0.2 major defects per page maximum remaining.'"
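
To make the sampling arithmetic concrete (my numbers, not Gilb's): if
you inspect a 5-page sample of a 60-page spec and log 12 major
defects, you have found 2.4 majors per page; if you assume your
checkers catch only about half of what is there, the estimated density
is roughly 4.8 majors per page - nowhere near a 0.2-per-page exit
threshold, so the document cannot exit.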

For an excellent introduction to Gilb Inspections,
see his article Planning to Get the Most Out of
Inspection, published in Software Quality
Professional, March 2000:

http://sqp.asq.org/vol2_issue2/sqp_v2i2_gilb.html

Keith Collyer

unread,
Apr 4, 2000, 3:00:00 AM4/4/00
to
be...@manning.com (Ben Kovitz) wrote in
<benko-02040...@10.0.0.2>:

>Very often they're written by analysts who've done an
>extraordinary job of research and attention to detail, but
>they aren't written in a way that the customer or programmers
>can understand.

This is often because the analysts try to make one document do
two jobs, document the users' problems, and document what the
system must do. Not surprisingly, trying to be all things to
all men means you fail.

>Dutiful adherence to any rigid methodology in particular
>leads to "force fits" in which you can't say simple things in
>simple ways. For example, writing a description of data
>flows between an input "function" and a validation "function"
>instead of just saying that data that doesn't meet rules X,
>Y, and Z is invalid.

I say: be pragmatic, not dogmatic!


Mr. Poologic

unread,
Apr 5, 2000, 3:00:00 AM4/5/00
to
In article <38D9D2...@ftel.co.uk>,
David Pedlar <D.Pe...@ftel.co.uk> wrote:

> sl...@imap1.asu.edu wrote:
> >
> > Hi,
> >
> > First of all this is not a homework assignment, but this is
> > something that I have been investigating lately..
> >
> > I do not work in industry, but would like to know what aspects of
> > the inspection process you do not like (or find useful). I do know
> > inspections can be quite effective in detecting defects, but many
> > organizations do not use them for various reasons.
>

We have rules to mitigate these problems.

> Generally people do not like their work being criticised.
> Thus inspections can often be painful.

We don't allow personal references: no "your code" or "you do X here",
and no sarcasm. We also encourage compliments and respect.

Getting compliments on your work turns this around.

> The problem is made worse by there being no universally accepted
> criteria of what makes a good program. Therefore there will always be
> scope for arguments.
>

We have a rule that if it is not an objective bug or a violation
of an explicit standard, then the programmer or producer of the
document decides whether it needs to be addressed. Any standards
are agreed upon in advance by everyone. If there is a grey
area, we don't make it a standard. We do have guidelines that
are not standards, but the programmer has sole authority to
interpret guidelines.

Basically, the members of a review team have no authority whatsoever
to impose anything. They are merely an information resource. If
a reviewer wants to comment on something other than a bug or a
standards violation, then the comment will be logged and the
programmer has a perfect right to basically ignore the comment.

With these conventions there is no basis for arguments. When there is
a grey area as to whether something is a bug, the comment is logged and
the programmer sorts it out later.

The review team leader decides if the document needs to be
re-reviewed, but this is largely a matter of counting the
total number of bugs and comparing it with a threshold.
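
A minimal sketch of that decision rule in Python, with the numbers
invented purely for illustration:

    # Toy version of the re-review decision: the team leader
    # compares the logged defect density against a threshold.
    # Both numbers are invented; calibrate from your own reviews.
    MAJORS_PER_PAGE_LIMIT = 0.5

    def needs_rereview(major_defects, pages,
                       limit=MAJORS_PER_PAGE_LIMIT):
        """True if the major-defect density exceeds the limit."""
        return (major_defects / pages) > limit

    # e.g. 7 majors logged against a 10-page design document:
    print(needs_rereview(7, 10))  # True: 0.7/page > 0.5/page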

In some cases, a programmer may be required to consult with his
development team leader or project manager on an issue. But he is
never, never required to reach a consensus with a mere review team
member on an issue.


>
> If the inspections limited themselves to looking for actual bugs
> rather than finding problems with the style of the work, that
> would be more pleasant but perhaps less effective.
>
> I once attended an inspection of a document where again there
> was plenty of argument, and the author (victim!) became visibly
> stressed.
> Part of the problem may be that the purpose of the
> document was not adequately defined before-hand.
>
> To improve the efficiency of code reviews I suggest that
> firms first create coding standards that all the programmers
> agree on. There also needs to be agreement on the relative
> importance that should be given to efficiency as against
> readability in the code.
>
> D.Pedlar
> d...@ftel.co.uk My views only.
>

--
http://members.aol.com/tadamsmar/poologic.htm
Learn how to win your college hoops tourney
office pool


Sent via Deja.com http://www.deja.com/
Before you buy.

Ken Foskey

unread,
Apr 7, 2000, 3:00:00 AM4/7/00
to
Jyrki Heikkinen wrote:
>
> For an excellent introduction to Gilb Inspections,
> see his article Planning to Get the Most Out of
> Inspection, published in Software Quality
> Professional, March 2000:
>
> http://sqp.asq.org/vol2_issue2/sqp_v2i2_gilb.html

Great article, well worth a read. It has already changed some
Inspection practices at work, and I only gave it to the QA man this
morning.

Five stars....

Ed Vojcak

unread,
Apr 7, 2000, 3:00:00 AM4/7/00
to

Check out:

http://www.kaner.com/imposs.htm

For an education on Software Inspection.

Ed Vojcak PE CQE

In article <8bc32r$4jp$1...@news.asu.edu>,


sl...@imap1.asu.edu wrote:
> Hi,
>
> First of all this is not a homework assignment, but this is something
> that I have been investigating lately..
>
> I do not work in industry, but would like to know what aspects of the
> inspection process you do not like (or find useful). I do know
> inspections can be quite effective in detecting defects, but many
> organizations do not use them for various reasons. I am reading
> articles on inspection, but I do not feel that they provide me the
> insight that I want that those of you who work in software development
> on a daily basis can provide.
>
> Thanks in advance,
> Stephanie

Ben Kovitz

unread,
Apr 7, 2000, 3:00:00 AM4/7/00
to
In article <8cfqsu$76d$1...@nnrp1.deja.com>, Mr. Poologic
<pool...@my-deja.com> wrote:

>Basically, the members of a review team have no authority whatsoever
>to impose anything. They are merely an information resource.

I think this is a great policy.

There's something disturbing going on in organizations where the
review team acts as a "gateway" to going on to the next stage, except
when the review team is the users or their representatives (the people
who are supplying the money, in any event). The implicit attitude
toward developers is that they're children who need constant scolding
from parent figures. Without someone to go over their work with a
white glove, the implicit theory goes, the developers would goof off
and do as little work as possible.

Perhaps this is another important factor in why people don't like to
review documents. If the analyst or developer is writing with the sole
aim of *not having any problems* in order to get past the review gateway,
he'll likely make the document as boring and conservative as possible,
dutifully adhering to every rule and inventing as little as possible, in
order to provide as small a target for criticism as possible.

I think the policy that you describe above is wonderful, because it
encourages developers to actively seek out areas of uncertainty in order
to get more eyes to look at them. Put something rough on the table in the
*hope* that others will see the inevitable problems early. Early stages
of development then become more like Karl Popper's view of science:
finding problems and criticizing being the main way that theories--or in
this case, specifications--become adapted to the world.

--
Ben Kovitz <bkovitz at uswest dot net>

Ben Kovitz

unread,
Apr 7, 2000, 3:00:00 AM4/7/00
to
In article <8cd726$mr3$2...@newsg3.svr.pol.co.uk>, kei...@computer.org (Keith
Collyer) wrote:

>be...@manning.com (Ben Kovitz) wrote in
><benko-02040...@10.0.0.2>:
>
>>Very often they're written by analysts who've done an
>>extraordinary job of research and attention to detail, but
>>they aren't written in a way that the customer or programmers
>>can understand.
>
>This is often because the analysts try to make one document do
>two jobs, document the users' problems, and document what the
>system must do. Not surprisingly, trying to be all things to
>all men means you fail.

Definitely! Requirements documents in particular seem prone to the "all
things to all people" trap. When people view the requirements document as
a CYA document for both the customer and developer, they start wanting the
document to include every possible decision to be made during development:
schedules, staffing, test plans (!), lists of features to *not* implement
(!), traceability matrices, programming languages and tools, tutorials on
how to write requirements, tutorials on how to evaluate requirements,
internal data flows and functions, quality-review questionnaires, and
more. The things that the customer actually wants to happen, the problem
domain, the thing to be produced--these things get lost within all the
extraneous text, or, amazingly, not described at all.

>>Dutiful adherence to any rigid methodology in particular
>>leads to "force fits" in which you can't say simple things in
>>simple ways. For example, writing a description of data
>>flows between an input "function" and a validation "function"
>>instead of just saying that data that doesn't meet rules X,
>>Y, and Z is invalid.
>
>I say: be pragmatic, not dogmatic!

Definitely (again)! Something to look at, though, is just how easy it is
to fall into dogmatism. I'm not sure, but I think there are two main
reasons that people become dogmatists: the need for a common framework in
terms of which to state expectations, and a desire for rigor. Both of
these are quite legitimate concerns.

When everyone at the company understands that projects consist of
such-and-such phases, that such-and-such document provides such-and-such
information, etc., they have a huge head start on any new project. They
can dispense with the stage of understanding how everything is supposed to
fit together, because they already know. All they need to attend to are
the details. Also, people can avoid arguing about what's the best way to
write the spec, and just make use of the standard.

Deliberately constrictive forms of expression, like the data flows and
functions of structured analysis, provide a systematic way to achieve
rigor. For example, a principle of structured analysis is that the inputs
and outputs of a data-flow diagram must "balance". It's very easy to
systematically check that every output from a diagram derives from an
input. Without the fairly inflexible framework of data flows and
functions, you'd have to invent a new way to detect mistakes on every
project--and your new way might not work very well, whereas structured
analysis has years of experience behind it. Use cases are great because
they're so simple and flexible, but they're terribly prone to goofs, gaps,
and internal contradictions.
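
To show how mechanical that balancing check can be, here is a toy
sketch in Python (my own minimal representation of a data-flow
diagram, not any tool's format):

    # Each process maps a set of input flows to a set of outputs.
    # Simplified balance rule: every flow consumed must be produced
    # somewhere (or enter from outside), and every flow produced
    # should be consumed somewhere (or leave the system).
    def check_balance(processes, external_inputs, external_outputs):
        produced = set(external_inputs)
        consumed = set(external_outputs)
        for inputs, outputs in processes.values():
            consumed.update(inputs)
            produced.update(outputs)
        return consumed - produced, produced - consumed

    processes = {
        "validate": ({"raw order"}, {"valid order", "error report"}),
        "price":    ({"valid order", "price list"}, {"priced order"}),
    }
    dangling, unused = check_balance(
        processes,
        external_inputs={"raw order"},
        external_outputs={"priced order"},
    )
    print(dangling)  # {'price list'}   - consumed, never produced
    print(unused)    # {'error report'} - produced, never consumed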

I think trouble sets in, in the form of obscurity and hypercomplexity,
when people apply their time-tested, familiar, rigorous methods to
problems where they just don't apply. For example, it would be silly to
spec out a video game as a set of use cases, or a scripting language as a
set of data flows and functions. There needs to be some point (or many
points, or even continuously if you take the XP approach) at which people
consciously address the question of what sort of problem they're working
with and evaluate different methods of describing and solving it.

Any other ideas on how people get tempted into dogmatism?

Andy Dent

unread,
Apr 9, 2000, 3:00:00 AM4/9/00
to
In article <WWbE4.8796$Og6.2...@tw12.nn.bcandid.com>, J.M. Ivler
<iv...@net-quest.com> wrote:

>Inspection (and some would say "formal inspection") requires that each
>item be defect free before the next part of the process start. How can you
>be assured that the design of your system is correct if you don't have
>complete and defect free requirement specifications?
>

>Now, some will say that if I get some specification I can start on the
>design and when the requirements are completed, then I can validate my
>design to the completed and inspected defect free requirements. This
>overlap of processes is possible, if one is willing to redo work that had
>been designed to a requirement that was then found to be defective and had
>to be restructured. In other words, what appears to be a potential cost
>savings can in fact be far costlier.

There's one simple shift in perspective to help clarify a lot of situations.

I suspect coding is often started because (no particular order or significance):
1) it's assumed the lessons learned will be useful even if re-design occurs
2) the developers assume enough commonality in circumstance that much of
what they do will be applicable
3) without coding a test case the developers feel they don't know enough
about the situation to design a robust solution.

One approach I've found works well, particularly with junior developers,
is to explicitly designate a coding experiment.

As an experiment, it has a stated aim - usually teaching us enough
about the problem domain or the behaviour of the environment to be able
to design a robust solution.

Part of an experiment is designing in testability and measurement.

An experiment is easily thought of as having well-defined boundaries and
thus is less likely to evolve into production code the way a "prototype"
may.
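
What the designation might look like on paper (wording entirely
invented):

    EXPERIMENT: queue throughput on target hardware
    AIM:        learn whether the message queue sustains 50 msg/sec
                with under 1 second latency
    MEASURE:    timestamps logged at enqueue and dequeue
    BOUNDARY:   throwaway code; the results feed the design, the
                code itself does not ship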

Management like to know WHY things are being done. Being able to say "we
have the following risk areas... and have experiments going to determine
how much real risk is there" is the kind of answer that reassures them an
engineering process is taking place.

--

Andy Dent BSc MACS AACM, Software Designer, A.D. Software, Western Australia
OOFILE - Database, Reports, Graphs, GUI for c++ on Mac, Unix & Windows
PP2MFC - PowerPlant->MFC portability
http://www.oofile.com.au/
