
The problem COBOL programmers have moving to objects


Warren Zeigler

unread,
Dec 17, 2002, 1:04:42 PM12/17/02
to
It has been rumored for some time: COBOL programmers have more problems than
others moving to object-oriented programming.

Is it a rumor? An insult?

Some have even been told that they are too old, or that their minds are "too
full."

What is the truth?

While teaching Understanding Objects, I have seen this problem. More than
3/4 of the students who came from COBOL had serious problems in
class.

I have also spent time with these students, helping them and analyzing the
problem. The cause is interesting: it has NOTHING to do with objects.

Keep in mind that, while moving to objects, we are also moving to the newer
languages and programming techniques in a field that is new and has made
rapid progress. In other words, when programmers move to objects, they also
have to update other skills. The older the prior language, the more skills
that must be updated.

COBOL syntax and coding style emphasized simple statements and readability.
Subsequent procedural languages have more complex statements, often placing
functions within functions, which makes individual statements harder to
read. To the COBOL programmer, the syntax and coding styles of C and other
languages fall somewhere between terse and obfuscated.

Combine this with the fact that most object training has the students
reading many code examples.

The end result is that COBOL programmers take longer to comprehend the
complex syntax. It is simply a matter of practice.

If this were all, it would not be a big problem. The problem is that, while
the teacher and the other students are moving from syntax to concept, the
COBOL programmer is still on the syntax and doesn't follow the rest of the
explanation. This results in the student missing many critical explanations.

The fix is simple:
1) The instructor needs to identify COBOL programmers and constantly use
them as indicators of instruction speed when code is being read.
2) Inform the students of the problem: that working w/ complex syntax is
something they need to practice. Also, like any intelligent person, if
they are aware of this kind of problem they can compensate w/ questions
and/or other study until they have enough experience w/ more complex
syntax.

Are there other problems for all programmers moving to objects? See
www.UnderstandingObjects.com/problems.htm for more examples.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Topmind

unread,
Dec 17, 2002, 2:58:10 PM12/17/02
to
> It has been rumored for some time: COBOL programmers have more problems
> than others moving to object-oriented programming.
>
> Is it a rumor? Insult?

Didn't this topic rage last year? Why bring it up
again? Do you have *new* information on it?

What about having them take courses in C first?

>
> Are there other problems for all programmers moving to objects? See
> www.UnderstandingObjects.com/problems.htm for more examples.


"Abstractions: Abstractions are the glue in object oriented software
development, connecting code to the objects it represents. "

OO has no monopoly on "abstraction".

"Data Structures and Collections: A side effect of the move to objects
is the need to easily handle multiple items. Most procedural languages
did this through the use of arrays."

The trend now is to use relational techniques. Relational
structures are more scalable than bags, sets, queues, etc.
I would recommend they also take some relational
theory and SQL courses. Bags, sets, queues, etc. are
rather arbitrary distinctions IMO. No wonder they
are confused. Why give different names to similar
things that vary only on minor orthogonal features or limitations?
Why not identify them by their limitations (only
one key field, etc.)? Smalltalk's taxonomy-happy collection
thinking F'ed up the collections world IMO.

Further, COBOL appears to be in as much demand as any
other language these days. COBOL software appears to
be slightly outliving COBOL programmers. Thus, the
need for COBOLers is likely to stay reasonable
in the near term.

I suspect that many COBOLers think that it is their
lack of skills in newer stuff that is holding them
back, when in fact it is the tech recession overall that
is the problem. Some Java programmers are having
a hard time also. There are still too many
programmers in the workforce relative to the
weak demand. Programmers are mostly in demand
when companies are expanding, changing, and
investing. Much of that kind of activity
halts during slow times, and only baseline
maintenance is required. Unlike inventory,
software does not dissipate over time. Thus,
most jobs are in making or selling product,
not newfangled management tools.

In summary, I recommend COBOLers take a course
in C, relational theory, and SQL from a community
college or something. Those courses will probably
be cheaper than yours, since they are part of a more
general curriculum, and they reduce the
need to pay for specialized training when
they really need the basics at first.

After they are comfortable
in those, then dig into the tangled,
inconsistent methodology-du-jour world of objects.

You probably don't want to tell them that
because it may cost you students, but that
is my recommendation.


>
> --
> Warren Zeigler

-T-

Wolfgang Formann

unread,
Dec 17, 2002, 4:15:04 PM12/17/02
to

Warren,

When I wrote my first program (in school), it was a kind of assembler language.
Actually it was an Olivetti P602 with really extremely limited data & address
space. I think a washing machine you buy today has a more powerful chip inside :-)

Okay, then I learned Algol, Fortran and finally Basic, all in school. After that
I programmed my own TI-59 and later TRS-80 at home, my first commercial job
was to port a Basic program to CP/M and later to DOS 1.0.

[[ Which just means that I am simply old :-( ]]

Then came the army, and later my first 'real' job. That's when I had to learn
COBOL and at about the same time K&R C.

So, maybe I was not the 'typical' COBOL programmer, since I had to use COBOL
in one week/month and C in another.

Happily I have not used COBOL for quite a long time; it's now all C++, and a
little bit of good but old ANSI-C.

The biggest differences between COBOL and C/C++ are (turn back your clock
about 15 years):
o) local variables
o) ... parameters given to subroutines aka functions
o) the internal representation of data (int/double/... vs. PICTURE clause)
o) dynamic memory
o) the preprocessor
o) source files can be compiled independently and linked later
o) ...

The concept of local variables, or to be more specific the scope of variables,
is a very, very different concept in the two languages. In COBOL there is *one*
DATA DIVISION where you declare *all* the variables you need in the
program. This is so different from C (or C++), where variables can have totally
different scope (global, file-global, local (including register), class-local
with both flavors class-member and class-static, and then there are namespaces too).

I think that this totally different concept needs at least one reboot of
a COBOL programmer's brain :-)

Parameters to subroutines. Now consider a COBOL statement like
PERFORM foo.
No parameters; compare it to a C call like foo(17, 34); This is the second
reboot a COBOL programmer needs. Again a big difference, although it might be
similar to calls into some external libraries.

Next, in COBOL you define variables with a picture clause; internally they
are represented in some BCD format. You have to define the length in digits,
and if the value can go negative, this has to be declared too. Compare this
with a simple 'int' or 'long' declaration. Next reboot :-) Yes, there is a
binary format too, but one who uses that binary format should not use COBOL;
maybe FORTRAN would fit better.

Then dynamic memory. In the days I programmed, there was no way to dynamically
allocate memory, just no way :-( I had to use some external file to write a
variable amount of data, a kind of program-owned swap space. Next reboot!

Then the preprocessor ... A special beast of burden! The #include statement
is the only one which has some sibling statement in COBOL, but I do not know
of a COBOL equivalent of #if and the like.

The last thing I remember was that one source gave one binary. In C/C++ you
usually have multiple sources giving one binary.


And at the end of the story: in COBOL you can use statements checking
overflow/underflow/division by zero/... which are not part of the C/C++ language.

Similarly, the built-in ISAM files, the report generator, and such goodies of
COBOL.


This is my understanding of COBOL; okay, it is 15 years old. Hopefully the
language has evolved; I have not used it in all that time.


Of all the languages I know, I think COBOL has almost the maximum of
differences compared to C/C++ (okay, let's ignore Lisp). So I can really
understand that COBOL programmers need to learn more different concepts
than people used to working with other languages.

I think that a teacher who has to teach C/C++ to a COBOL person needs
to know at least the basic concepts of COBOL, simply because that is the
only way to know how to explain the differences. You need to know that
a variable in C/C++ has a scope, which does not exist in COBOL. You need
to know that there is a picture clause in COBOL where you can define a
4-digit positive numeric variable; you can even redefine the space used
by this variable, so that you can very simply access its second digit;
you can't do that as easily in C/C++. And similarly with the rest
of all those differences.


Like a person who knows only PASCAL: such a guy will start a program with:
#define begin {
#define end }
I am sure you have seen this :-) And a typical PASCAL programmer will
always write something like:
char bar[4] = "ABCD";
... it will take months until the \0 at the end of each C string becomes
'typical style'.

And similarly with other languages ...

The last problem, I 'guess': COBOL programmers are usually not young guys;
I expect them to be at least 40 or even older (I mean 'typical', not all!).


Just a brain dump :-)
Wolfgang

Warren Zeigler

unread,
Dec 17, 2002, 5:44:55 PM12/17/02
to
"Topmind" <top...@technologist.com> wrote in message
news:MPG.18692256...@news.earthlink.net...

> What about they take courses in C first?

That would also help, but they would need to use it long enough to get used
to the compound statements.

> In summary, I recommend COBOLers take a course
> in C, relational theory, and SQL from a community
> college or something....

If they are independent, they can do what they want. If they are part of a
group moving to an object language...

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 17, 2002, 5:52:35 PM12/17/02
to
"Wolfgang Formann" <w.insert_my_l...@arcor.de> wrote in message
news:3DFF93D8...@arcor.de...

> Then came the army, and later my first 'real' job. That's when I had to
> learn COBOL and at about the same time K&R C.
>
> So, maybe I was not the 'typical' COBOL programmer, since I had to use
> COBOL in one week/month and C in another.

Correct. It is not a situation of being "contaminated" by COBOL, but one of
not having practice with something like C. I have taught students with
similar backgrounds, and they have had no problem w/ the syntax.

> I think that a teacher who has to teach C/C++ to a COBOL person needs
> to know at least the basic concepts of COBOL...

That would be nice, but does not happen often.

Thanks for the rest of your comments.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Neil Butterworth

unread,
Dec 17, 2002, 6:30:50 PM12/17/02
to

"Warren Zeigler" <warren...@attbi.com> wrote in message
news:TUNL9.4690$qF3.59@sccrnsc04...

> "Wolfgang Formann" <w.insert_my_l...@arcor.de> wrote in message
> news:3DFF93D8...@arcor.de...

> > I think that a teacher who has to teach C/C++ to a COBOL person needs
> > to know at least the basic concepts of COBOL...
>
> That would be nice, but does not happen often.

I don't see why not - you can learn the basics of COBOL in an afternoon.
I've only written one COBOL program in my professional career (to unpack
an ISAM database into ASCII files), but it took me literally a couple of hours
of reading a COBOL introductory text followed by a couple of hours of coding
(and swearing - writing your first COBOL program will indelibly print the
correct spelling of the word ENVIRONMENT on your brain). And I agree with
Wolfgang that understanding their mindset is important. I remember the first
COBOL guy I taught C to - I was telling the class about malloc() and he
stuck his hand up and asked "But what is this 'memory' stuff? And why would
I ever want to 'allocate' any of it?"

Anyway, take it from me that the difficulties of teaching COBOL programmers
C++ are as nothing to the horrors of teaching VAX system administrators UNIX
:-)

NeilB


Topmind

unread,
Dec 18, 2002, 12:59:46 PM12/18/02
to
> "Topmind" <top...@technologist.com> wrote in message
> news:MPG.18692256...@news.earthlink.net...
>
> > What about they take courses in C first?
>
> That would also help, but they would need to use it long enough to get used
> to the compound statements.

If that is what it takes, that is what it takes.

>
> > In summary, I recommend COBOLers take a course
> > in C, relational theory, and SQL from a community
> > college or something....
>
> If they are independent, they can do what they want. If they are part of a
> group moving to an object language...

True. I suppose you deal with many cases where an IT
department is moving to say Java or .NET, and there is
a fixed deadline.

Frankly, the COBOLers should have been learning such
stuff on their own. The days when employers paid for
training are pretty much over, and the IT field
changes often (for good or bad).

Maybe Pascal with some Delphi would be a good
option for them. Pascal has more formality
than the C family, which may appeal to
COBOLers more.

JavaScript may also be a good starting point.
It is easier to explain what is going on
behind the scenes than Java, because of the
dictionary-like nature of its objects.

>
> --
> Warren Zeigler
> wzei...@UnderstandingObjects.com
>

-T-

H. S. Lahman

unread,
Dec 18, 2002, 1:19:01 PM12/18/02
to
Responding to Zeigler...

I would suggest another fix: teach OO concepts first and then OOPL
syntax. This is especially true if the target OOPL is one like C++.


*************
There is nothing wrong with me that could
not be cured by a capful of Drano.

H. S. Lahman
h...@pathfindersol.com
Pathfinder Solutions -- We Make UML Work
http://www.pathfindersol.com
(888)-OOA-PATH


Warren Zeigler

unread,
Dec 18, 2002, 1:23:03 PM12/18/02
to
"H. S. Lahman" <vze2...@verizon.net> wrote in message
news:3E00BC4E...@verizon.net...

> I would suggest another fix: teach OO concepts first and then OOPL
> syntax. This is especially true if the target OOPL is one like C++.

I am one step ahead of you.

My class is 4 1/2 days. Code is not brought in until the third day, but
eventually it is still brought in.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Topmind

unread,
Dec 19, 2002, 3:20:39 AM12/19/02
to
>
> I would suggest another fix: teach OO concepts first and then OOPL
> syntax. This is especially true if the target OOPL is one like C++.

But it is tough to teach something without a representation.

UML is just yet another representation. Whether it is better
than code or not is another Holy War here.

>
>
> *************
> There is nothing wrong with me that could
> not be cured by a capful of Drano.
>
> H. S. Lahman

-T-

Warren Zeigler

unread,
Dec 19, 2002, 1:48:37 PM12/19/02
to
"Topmind" <top...@technologist.com> wrote in message
news:MPG.186b21c22...@news.earthlink.net...
> H. S. Lahman

> > I would suggest another fix: teach OO concepts first and then OOPL
> > syntax. This is especially true if the target OOPL is one like C++.
>
> But it is tough to teach something without a representation.
>
> UML is just yet another representation. Whether it is better
> than code or not is another Holy War here.

I teach 2 days w/o ANY code, then 2 1/2 days with a lot of code from several
languages.

The first representation I use is real-world objects. This has a few
benefits:
1) It brings in the student's prior experience
2) Since objects started w/ simulation, almost all essential concepts can be
taught this way
3) It works well when teaching simple analysis and design (use cases) and
UML

--
Warren Zeigler
wzei...@UnderstandingObjects.com


JXStern

unread,
Dec 19, 2002, 2:59:46 PM12/19/02
to
On Tue, 17 Dec 2002 18:04:42 GMT, "Warren Zeigler"
<warren...@attbi.com> wrote:
>It has been rumored for some time: COBOL programmers have more problems than
>others moving to object-oriented programming.
...

>Are there other problems for all programmers moving to objects? See
>www.UnderstandingObjects.com/problems.htm for more examples.

Thank you for spamming the group once again.

J.

Wolfgang Formann

unread,
Dec 19, 2002, 3:58:55 PM12/19/02
to

JXStern wrote:
> On Tue, 17 Dec 2002 18:04:42 GMT, "Warren Zeigler"
> <warren...@attbi.com> wrote:
>
>>It has been rumored for some time: COBOL programmers have more problems than
>>others moving to object-oriented programming.
>>

> ....


>
>>Are there other problems for all programmers moving to objects? See
>>www.UnderstandingObjects.com/problems.htm for more examples.
>>
>
> Thank you for spamming the group once again.
>

J., would you please be so kind and tell me (or us) why you think this
is spamming?

Okay, Warren did post the same question in (at least) two newsgroups,
but why is this spamming?

Wondering?

Wolfgang

David Cattarin

unread,
Dec 19, 2002, 3:16:30 PM12/19/02
to
top...@technologist.com (Topmind) wrote in message news:<MPG.186b21c22...@news.earthlink.net>...

> >
> > I would suggest another fix: teach OO concepts first and then OOPL
> > syntax. This is especially true if the target OOPL is one like C++.
>
> But it is tough to teach something without a representation.
>
> UML is just yet another representation. Whether it is better
> than code or not is another Holy War here.

A diagram is worth a thousand lines of code. ;)
Dave

Universe

unread,
Dec 19, 2002, 4:23:10 PM12/19/02
to
David Cattarin wrote:

> top...@technologist.com (Topmind) wrote in message news:<MPG.186b21c22...@news.earthlink.net>...
>> >
>> > I would suggest another fix: teach OO concepts first and then OOPL
>> > syntax. This is especially true if the target OOPL is one like C++.
>>
>> But it is tough to teach something without a representation.
>>
>> UML is just yet another representation. Whether it is better
>> than code or not is another Holy War here.

For those ignorant of the *facts* discovered, summarized, and accepted
within most of practicing Cognitive Psychology:

> A diagram is worth a thousand lines of code. ;)

Ain't that an *objective* "kick in the head" to the *nabobs* of
negative OO criticism. :-}

Elliott
--
http://www.radix.net/~universe ~*~ Enjoy! ~*~
Hail OO Modelling! * Hail the Wireless Web!
@Elliott 2002 my comments ~ newsgroups+bitnet OK

JXStern

unread,
Dec 19, 2002, 4:25:42 PM12/19/02
to
On Thu, 19 Dec 2002 21:58:55 +0100, Wolfgang Formann
<winsert_my_l...@arcor.de> wrote:
>J., would please be so kind and tell me (or us) why you think this
>is spamming?
>
>Okay, Warren did post in (at least) two newsgroups the same question,
>but why is this spamming?

It is promoting his commercial interest, and I believe you will find
the same opinions expressed by him periodically on the group. It is
the commercial aspect that constitutes spamming. Omit the mention of
his classes, and it would just be mindless repetition, but not worth
objecting to.

J.

Topmind

unread,
Dec 19, 2002, 5:24:28 PM12/19/02
to
> David Cattarin wrote:
>
> > top...@technologist.com (Topmind) wrote in message news:<MPG.186b21c22...@news.earthlink.net>...
> >> >
> >> > I would suggest another fix: teach OO concepts first and then OOPL
> >> > syntax. This is especially true if the target OOPL is one like C++.
> >>
> >> But it is tough to teach something without a representation.
> >>
> >> UML is just yet another representation. Whether it is better
> >> than code or not is another Holy War here.
>
> For those ignorant of the *facts* discovered, summarized within and
> accepted by most practicng Cognitive Psychology.

"Facts" and "Psychology" tend to be distant cousins.
The closest psychology has to "facts" is statistical
relationships between symptoms. Beyond that, it is
mostly about competing models put forth by various
practicioners.

>
> > A diagram is worth a thousand lines of code. ;)

I don't know if that is the case with UML. I doubt
runnable UML is concise. Whether programming with
graphics or code is better is probably a subjective
thing.

>
> Ain't that an *objective* "kick in the head to the *nabobs* of
> negative OO criticism. :-}
>
> Elliott

-T-

Warren Zeigler

unread,
Dec 19, 2002, 10:56:07 PM12/19/02
to
"Topmind" <top...@technologist.com> wrote in message
news:MPG.186be7a6d...@news.earthlink.net...

> "Facts" and "Psychology" tend to be distant cousins.

I don't know. There seem to be so many facts that they can change them every
year!

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 19, 2002, 10:57:56 PM12/19/02
to
"JXStern" <JXSternC...@gte.net> wrote in message
news:j8e40v46rgr023d7j...@4ax.com...

> Omit the mention of
> his classes, and it would just be mindless repetition, but not worth
> objecting to.

Really? Please show the post where the issue of COBOL programmers having
problems due to compound statements is ever mentioned. Please.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Topmind

unread,
Dec 20, 2002, 12:21:38 AM12/20/02
to
> "JXStern" <JXSternC...@gte.net> wrote in message
> news:j8e40v46rgr023d7j...@4ax.com...
> > Omit the mention of
> > his classes, and it would just be mindless repetition, but not worth
> > objecting to.
>
> Really? Please show the post where the issue of COBOL programmers having
> problems due to compound statements is ever mentioned. Please.
>

It did appear to me like spam on your part also. I don't
know if it actually was, but it *looked* that way, to
be frank.

Topmind

unread,
Dec 20, 2002, 12:30:08 AM12/20/02
to

IMO, it looks like spamming because he brings up a problem,
then gradually provides a solution, which happens to be his
own school.

Now I can't read minds, so I don't know what his true
motivation was. But the sequence above looks suspicious
to a reader familiar with what he does for a living.

If he knows the solution (his own courses), then
why ask what the problem is?

>
> Wolfgang
>

-T-

Warren Zeigler

unread,
Dec 20, 2002, 9:19:18 AM12/20/02
to
"Topmind" <top...@technologist.com> wrote in message
news:MPG.186c4948c...@news.earthlink.net...

> It did appear to me like a spam on your part also. I don't
> know if it actually was, but it *looked* that way, to
> be frank.


I can see why, BUT -

While I have discussed the fact that problems exist before, this is the
first time I have discussed one of the reasons for the problems openly.

This, and the other reasons, are very valuable. Anyone can solve the
problems once they are fully identified. It is identifying the problems that
is so hard. (IOW: Anyone can answer a question, it's knowing what questions
to ask that is so valuable.)

I decided to show this one. Yes, people can take it and use it. I hope that
they will see that the class has real value w/ the other problems discussed,
but whether they buy or not, this post can help COBOL programmers.

Why this one? I have identified roughly 10 problems (there is some overlap),
and this is the easiest to solve for people not taking the class.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 20, 2002, 9:21:04 AM12/20/02
to
"Topmind" <top...@technologist.com> wrote in message
news:MPG.186c4b0a9...@news.earthlink.net...

> If he knows the solution (his own courses), then
> why ask what the problem is?

Because I am talking about a solution to a particular problem - a problem
some do not know about. If I don't first state the problem, the solution
would sound like so much noise.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


H. S. Lahman

unread,
Dec 20, 2002, 11:17:49 AM12/20/02
to
Responding to Zeigler...

>>I would suggest another fix: teach OO concepts first and then OOPL
>>syntax. This is especially true if the target OOPL is one like C++.
>
>
> I am one ahead of you.
>
> My class is 4 1/2 days. Code is not brought in until the third day, but
> eventually it is still brought in.

Aha! I don't think the problem you are having is with COBOL developers
per se. I think your main problem is budgetary.

I see no way to teach what OOA/D is about in 4-1/2 days, much less
provide enough exercises and identify procedural notions that are bad
habits in OO development. There have been entire books written on just
how to do Class Diagrams properly. Trying to distill down that sort of
complexity in a few days, when the students are ingrained with an
approach that is completely different, is not going to work, regardless
of the language the students use.

Warren Zeigler

unread,
Dec 20, 2002, 3:41:24 PM12/20/02
to
"H. S. Lahman" <vze2...@verizon.net> wrote in message
news:3E0342E7...@verizon.net...

> I see no way to teach what OOA/D is about in 4-1/2 days

This is NOT an OOA/D class. It gives enough early concepts for someone to
join a development team.

You CANNOT just teach coding w/o the principles behind OO, and expect
the programmer to do well. This is one of the common mistakes in OO
training, even for a new or intermediate coder.


--
Warren Zeigler
wzei...@UnderstandingObjects.com


Uncle Bob (Robert C. Martin)

unread,
Dec 20, 2002, 4:52:46 PM12/20/02
to
"Warren Zeigler" <warren...@attbi.com> might (or might not) have
written this on (or about) Tue, 17 Dec 2002 18:04:42 GMT, :

>I also have spend time with these students, helping them and analyzing the
>problem. The cause is interesting. It has NOTHING to do with objects.

The sheer number of concepts that a COBOL programmer has to learn in
order to come up to speed in a C++, Java, or C# environment is
daunting. It includes:

- Dynamic memory management
- Recursion
- Interrupts
- Threads
- Functions
- Local variables
- References
- Block structure
- Stack Frames

And much more.
Robert C. Martin | "Uncle Bob"
Object Mentor Inc.| unclebob @ objectmentor . com
PO Box 5757 | Tel: (800) 338-6716
565 Lakeview Pkwy | Fax: (847) 573-1658 | www.objectmentor.com
Suite 135 | | www.XProgramming.com
Vernon Hills, IL, | Training and Mentoring | www.junit.org
60061 | OO, XP, Java, C++, Python |

"...a member of my local hiking club is a
nudist. I once asked him if he ever hiked in the nude. He responded
that he was a nudist, he wasn't nuts."
-- Daniel Parker

Uncle Bob (Robert C. Martin)

unread,
Dec 20, 2002, 5:00:18 PM12/20/02
to
Wolfgang Formann <w.insert_my_l...@arcor.de> might (or might
not) have written this on (or about) Tue, 17 Dec 2002 22:15:04 +0100,
:

>I think that a teacher who has to teach C/C++ to a COBOL person needs
>to know at least the basic concepts of COBOL

Yes, that certainly helps. But, as you point out, there's a lot of
ground to cover just to bring the COBOLer up to the point where they
can begin to understand what an object is.

I have noted that some shops are making the transition to Java by
hiring new Java programmers and "promoting" their COBOL programmers to
architects. They put these "architects" through a week long course on
Objects and UML, and then expect them to create object oriented
designs that the Java programmers will follow.... ;-(

Uncle Bob (Robert C. Martin)

unread,
Dec 20, 2002, 5:04:03 PM12/20/02
to
"Warren Zeigler" <warren...@attbi.com> might (or might not) have
written this on (or about) Wed, 18 Dec 2002 18:23:03 GMT, :

>"H. S. Lahman" <vze2...@verizon.net> wrote in message
>news:3E00BC4E...@verizon.net...
>> I would suggest another fix: teach OO concepts first and then OOPL
>> syntax. This is especially true if the target OOPL is one like C++.
>
>I am one ahead of you.
>
>My class is 4 1/2 days. Code is not brought in until the third day, but
>eventually it is still brought in.

I take exactly the opposite approach. I bring in code as early as
possible. It's amazing how easy it is to explain a bit of syntax when
you can show it executing on screen, make a change, and show the
difference in execution. It's also amazing how easy it is to explain
the concepts of OO, when you can express them in an executing example.

Uncle Bob (Robert C. Martin)

unread,
Dec 20, 2002, 5:05:00 PM12/20/02
to
dit...@ix.netcom.com (David Cattarin) might (or might not) have
written this on (or about) 19 Dec 2002 12:16:30 -0800, :

>A diagram is worth a thousand lines of code. ;)

Until you need something that executes.

Warren Zeigler

unread,
Dec 20, 2002, 5:04:46 PM12/20/02
to
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
message news:2i470vo4tcc0jehok...@4ax.com...

> Yes, that certainly helps. But, as you point out, there's a lot of
> ground to cover just to bring the COBOLer up to the point where they
> can begin to understand what an object is.
>
> I have noted that some shops are making the transition to Java by
> hiring new Java programmers and "promoting" their COBOL programmers to
> architects. They put these "architects" through a week long course on
> Objects and UML, and then expect them to create object oriented
> designs that the Java programmers will follow.... ;-(

...until it comes time to schedule the training, then many say "Can you do
this in 2 or 3 days instead of a week?"

It would be funny, but for all of the pain it causes. (Developers care about
their work, and when they are kept from it in any way, there is serious
distress of one kind or another.)

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 20, 2002, 5:14:44 PM12/20/02
to
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
message news:io470v4fteh2r21b6...@4ax.com...

> I take exactly the opposite approach. I bring in code as early as
> possible. It's amazing how easy it is to explain a bit of syntax when
> you can show it executing on screen, make a change, and show the
> difference in execution. It's also amazing how easy it is to explain
> the concepts of OO, when you can express them in an executing example.

You are probably just teaching one language.

Actually, I consider staying out of code when first introducing concepts
VERY important. We do not think in the abstractions code represents. I have
found from my experience as a teacher and developer that when concepts are
taught in code, the student retains only a poor to fair knowledge. OO has
many concepts built one on another. Without a clear understanding of the base
concepts, the others are poorly learned.

Learning outside of code also establishes a good foundation for use cases,
UML, and other pre-coding skills.

After these are initially taught, learning how to code becomes a good
reinforcement of the concepts learned.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 20, 2002, 5:17:13 PM12/20/02
to
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
message news:p2470vo9nc955u7s3...@4ax.com...

> The sheer number of concepts that a COBOL programmer has to learn in
> order to come up to speed in a C++, Java, or C# environment is
> daunting. It includes:
>
> - Dynamic memory management
> - Recursion
> - Interrupts
> - Threads
> - Functions
> - Local variables
> - References
> - Block structure
> - Stack Frames
>
> And much more.

And these are mostly taught w/ the language. Then you have associations,
collaborations and other higher-level concepts.
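To make the list concrete, here is a small hypothetical Java sketch (an editor's illustration, not from any post in the thread) that packs several of the listed concepts into a few lines: functions, local variables, recursion, references, block structure, and implicitly stack frames.

```java
// Hypothetical illustration only: several of the concepts listed above
// appearing together in a few lines of Java.
public class Concepts {

    // A function (method) with a local variable and a recursive call.
    static int factorial(int n) {
        int result = (n <= 1) ? 1 : n * factorial(n - 1); // local variable; recursion
        return result;
    }

    public static void main(String[] args) {
        int[] numbers = {3, 4, 5};  // 'numbers' is a reference to an array on the heap
        for (int n : numbers) {     // block structure: the loop body is its own scope
            System.out.println(n + "! = " + factorial(n));
        }
    }
}
```

Each recursive call to `factorial` gets its own stack frame with its own copy of `n` and `result` -- the kind of mechanism classic COBOL's static WORKING-STORAGE never required a programmer to think about.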

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Shayne Wissler

unread,
Dec 20, 2002, 5:39:24 PM12/20/02
to
What a sorry state our industry is in when someone can get away with
teaching people to fling around abstractions without knowing their meaning.

You don't teach children mathematics by giving them a grand tour of the
concepts of mathematics; you start with counting and arithmetic. Likewise,
you should not start someone new to programming with OO design, UML
diagrams, and woozy metaphors about the "real world" as seen through the
distorted lens of the "teacher".

The only way to learn how to architect is to first learn how to design; the
only way to learn how to design is to first learn how to program; and the
only way to learn that is to learn a particular language.

Attempting to reverse this order yields incompetence in all areas.


Shayne Wissler

JXStern

unread,
Dec 20, 2002, 7:28:54 PM12/20/02
to
On Fri, 20 Dec 2002 15:52:46 -0600, "Uncle Bob (Robert C. Martin)"
<u.n.c.l...@objectmentor.com> wrote:
>The sheer number of concepts that a COBOL programmer has to learn in
>order to come up to speed in a C++, Java, or C# environment is
>daunting. It includes:
>
> - Dynamic memory management
> - Recursion
> - Interrupts
> - Threads
> - Functions
> - Local variables
> - References
> - Block structure
> - Stack Frames

Nonsense.

Half of these are in Cobol already, whether the programmer knows about
them or not.

The other half are not necessary to many/most OO applications.

J.

H. S. Lahman

unread,
Dec 21, 2002, 10:44:47 AM12/21/02
to
Responding to Zeigler...

>>I see no way to teach what OOA/D is about in 4-1/2 days
>
>
> This is NOT an OOA/D class. It gives enough early concepts for someone to
> join a development team.
>
> You CANNOT just teach coding w/o the principles behind OO, and expect
> the programmer to do well. This is one of the common mistakes in OO
> training, even for a new or intermediate coder.

My point (ex typos) was that the problem is that the time allotted to
training is insufficient. I don't think it is possible to teach enough
about OOA/D to "do well" at OOP in 4-1/2 days, much less 3 days -- even
if the students had no contrarian biases.

David Cattarin

unread,
Dec 21, 2002, 7:06:15 PM12/21/02
to
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in message news:<iv470vcjfmfng0nv0...@4ax.com>...

> dit...@ix.netcom.com (David Cattarin) might (or might not) have
> written this on (or about) 19 Dec 2002 12:16:30 -0800, :
>
> >A diagram is worth a thousand lines of code. ;)
>
> Until you need something that executes.

Of course! It certainly isn't a replacement. Diagrams, UML or
otherwise, are the best way to document a system at a high level.
After all, you honestly wouldn't present an analysis to a customer
using thousands of lines of code, would you?

Dave

Warren Zeigler

unread,
Dec 21, 2002, 8:01:10 PM12/21/02
to
"H. S. Lahman" <vze2...@verizon.net> wrote in message
news:3E048CAC...@verizon.net...

> My point (ex typos) was that the problem is that the time allotted to
> training is insufficient. I don't think it is possible to teach enough
> about OOA/D to "do well" at OOP in 4-1/2 days, much less 3 days -- even
> if the students had no contrarian biases.

"Do well" is a relative phrase.

My class is designed to be taught along w/ a language and a process. Yes,
even so, there are several topics where 4-1/2 days is not enough. My goal
is to give the best basic skills possible in the time available and point
to other areas to study.

Frankly, most developers starting w/ objects w/o a mentor get lost and
spend a lot of time clueless, not even knowing what questions to ask. I get
them to the point that a reasonably intelligent person knows what to do or
where to go for information. They also get a basic orientation to objects,
and are taught the basic concepts required to work with design patterns.
(For example: many creational patterns require the concept of "static" or
"shared," but this is often not taught.)
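As a concrete illustration of that last point (a hypothetical editor's sketch, not material from the course): the Singleton creational pattern is built directly on a "static"/"shared" member, and is hard to follow without that concept.

```java
// Hypothetical sketch: the Singleton pattern hinges on a static
// ("shared") member -- one instance shared by the whole program.
public class Configuration {
    private static Configuration instance;  // shared across all callers

    private Configuration() { }             // private constructor: no outside "new"

    public static Configuration getInstance() {
        if (instance == null) {
            instance = new Configuration(); // created lazily, exactly once
        }
        return instance;
    }

    public static void main(String[] args) {
        // Both calls return the same shared object.
        System.out.println(getInstance() == getInstance()); // prints "true"
    }
}
```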

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 21, 2002, 8:07:48 PM12/21/02
to
"Shayne Wissler" <thal...@yahoo.com> wrote in message
news:w_MM9.289013$GR5....@rwcrnsc51.ops.asp.att.net...

> What a sorry state our industry is in when someone can get away with
> teaching people to fling around abstractions without knowing their
> meaning.

You are talking out of your nose. I have had comp sci grads equate my
lecture on abstractions with a fast version of a 400-level university
class. Others have said that, after attending other classes, my lesson on
abstractions FINALLY ties together the "whys" of what Rational and others
teach.

> You don't teach children mathematics by giving them a grand tour of the
> concepts of mathematics; you start with counting and arithmetic. Likewise,
> you should not start someone new to programming with OO design, UML
> diagrams, and woozy metaphors about the "real world" as seen through the
> distorted lens of the "teacher".

"Distorted" describes your view of OO if you do not start w/ the real
world. OO was invented for simulation, and most of the concepts are
simple when taught from that perspective. (After discussing polymorphism, a
recent student wrote in an evaluation that I make a complex topic simple.
This student had been to a few language classes and worked w/ objects.)
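In that simulation spirit, a short hypothetical sketch (an editor's illustration, not the course's material) of polymorphism: each simulated entity answers the same message in its own way.

```java
// Hypothetical simulation-flavored sketch of polymorphism.
abstract class Vehicle {
    abstract double distanceAfter(double hours); // same message for every vehicle
}

class Car extends Vehicle {
    double distanceAfter(double hours) { return 60.0 * hours; } // 60 mph
}

class Bicycle extends Vehicle {
    double distanceAfter(double hours) { return 12.0 * hours; } // 12 mph
}

public class Simulation {
    public static void main(String[] args) {
        Vehicle[] traffic = { new Car(), new Bicycle() };
        for (Vehicle v : traffic) {
            // Polymorphism: the same call dispatches to each concrete class.
            System.out.println(v.distanceAfter(2.0));
        }
    }
}
```

The simulation loop never asks what kind of vehicle it is holding; that is the whole point.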

> Attempting to reverse this order yields incompetence in all areas.
The proof is in the pudding, and I get repeat business.

The simple fact is, you and I have disagreed about a few things before, and
now you attack in ignorance. So much for integrity.

--
Warren Zeigler
wzei...@UnderstandingObjects.com

Warren Zeigler

unread,
Dec 21, 2002, 8:10:07 PM12/21/02
to
"JXStern" <JXSternC...@gte.net> wrote in message
news:scd70vgfk9aej832b...@4ax.com...

> Nonsense.
>
> Half of these are in Cobol already, whether the programmer knows about
> them or not.
>
> The other half are not necessary to many/most OO applications.

I stumbled on that one myself, until I went from looking at the language to
talking to programmers. Note that even you said "whether the programmer
knows about them or not." The issue is not what is possible in COBOL. The
issue is the mindset and experience of the student.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 21, 2002, 8:14:36 PM12/21/02
to
"Topmind" <top...@technologist.com> wrote in message
news:MPG.186c4b0a9...@news.earthlink.net...
> IMO, it looks like spamming because he brings up a problem,
> then gradually provides a solution, which happens to be his
> own school.

To clear this up: Why do I spend time in the newsgroups?

While I must admit that I wouldn't mind more business, newsgroups have given
$0, and that is likely to stay the same.

So, Why? This is EXCELLENT experience in a few areas, including:
* Checking my perspective on concepts
* Explaining some concepts
* Seeing where people are having problems, and their perspectives

As for the COBOL issue posting, I not only think that it will help others,
but I also had some EXCELLENT feedback on related issues, giving me a few
references that are now on my must-study list.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Shayne Wissler

unread,
Dec 21, 2002, 8:26:27 PM12/21/02
to
Warren Zeigler wrote:

>> You don't teach children mathematics by giving them a grand tour of the
>> concepts of mathematics; you start with counting and arithmetic.
>> Likewise, you should not start someone new to programming with OO design,
>> UML diagrams, and woozy metaphors about the "real world" as seen through
>> the distorted lens of the "teacher".
>
> Distorted is a description of your view of OO if you do not start w/ the
> real world. OO was invented for simulation, and most of the concepts are
> simple when taught from that perspective.

Unsurprisingly, your second sentence contradicts your first. *I'm* the one
who advocates the real world; you're the one conflating software
engineering with modeling and simulation.


Shayne Wissler

Uncle Bob (Robert C. Martin)

unread,
Dec 23, 2002, 8:41:37 AM12/23/02
to
"Warren Zeigler" <warren...@attbi.com> might (or might not) have
written this on (or about) Fri, 20 Dec 2002 22:14:44 GMT, :

>"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
>message news:io470v4fteh2r21b6...@4ax.com...
>
>> I take exactly the opposite approach. I bring in code as early as
>> possible. It's amazing how easy it is to explain a bit of syntax when
>> you can show it executing on screen, make a change, and show the
>> difference in execution. It's also amazing how easy it is to explain
>> the concepts of OO, when you can express them in an executing example.
>
>You are probably just teaching one language.
>
>Actually, I consider staying out of code when first introducing concepts
>VERY important. We do not think in the abstractions code represents. I have
>found from my experience as a teacher and developer that when concepts are
>taught in code, the student only retains a poor to fair knowledge.

This is exactly the opposite of my experience. (Which just tells me
that there's no one true path.) In my experience code crystallizes
understanding. We can wave our hands about objects and relationships
and abstraction, and we can make it all sound warm and fuzzy. (I'm
not saying that this is what *you* do). However, until you show some
code, nobody really understands anything. Once you show some code,
everybody goes: "Oh! That's what you meant!"


>
>Learning outside of code also establishes a good foundation for use cases,
>UML, and other pre-coding skills.

I much prefer that UML be learned as an adjunct to code. UML that is
divorced from code is pretty useless IMHO.

Uncle Bob (Robert C. Martin)

unread,
Dec 23, 2002, 8:46:04 AM12/23/02
to
dit...@ix.netcom.com (David Cattarin) might (or might not) have
written this on (or about) 21 Dec 2002 16:06:15 -0800, :

I doubt I'd use much UML either. Customers aren't usually interested
in technical documentation. Indeed, the best presentation I can make
to a customer is a demonstration of some part of the system.

Uncle Bob (Robert C. Martin)

unread,
Dec 23, 2002, 8:46:49 AM12/23/02
to
"Warren Zeigler" <warren...@attbi.com> might (or might not) have
written this on (or about) Sun, 22 Dec 2002 01:14:36 GMT, :

>While I must admit that I wouldn't mind more business, newsgroups have given
>$0, and that is likely to stay the same.

Another one of our opposites.

JXStern

unread,
Dec 23, 2002, 1:57:34 PM12/23/02
to
On Sun, 22 Dec 2002 01:10:07 GMT, "Warren Zeigler"
<warren...@attbi.com> wrote:
>> Nonsense.
>>
>> Half of these are in Cobol already, whether the programmer knows about
>> them or not.
>>
>> The other half are not necessary to many/most OO applications.
>
>I stumbled on that one myself, until I went from looking at the language to
>talking to programmers. Note that even you said "whether the programmer
>knows about them or not." The issue is not what is possible in COBOL. The
>issue is the mindset and experience of the student.

Any Cobol programmer who created even CICS/3270 applications probably
knows more about structuring a GUI than most OO practitioners.

Any Cobol programmer who built almost any kind of back-end code
probably knows more about persistence and storage (and probably
database) than most OO practitioners.

I can just imagine such people showing up at an OO class and being
taught an orthogonal bag of compsci technologies and being told these
are going to be the key to building good OO applications.

Actually, I can do more than imagine, I've taught such people and such
classes, and been the Cobol programmer pre-1985, and been the OO
programmer pretty much since (when I'm not being the database
developer instead). The real issue is that early programming
languages went to some lengths, in both theory and practice, to
separate applications development from such systems issues. It was
never entirely a successful effort, of course, and that battle
continues today in language and tool design. But thirty years ago, it
was at least a minor triumph of tool design that Cobol programmers
didn't have to know all this stuff. It's a big trip in the wayback
machine to tell them they have to know it now.

J.

PS - this is not to deny, really, that for a Cobol programmer to hack
OO code in Java-like languages *is* likely to involve all of this
stuff eventually, nor that most of it will indeed be new to them. But
there is a huge irony in trying to present this stuff to them as new.

David Cattarin

unread,
Dec 23, 2002, 2:50:17 PM12/23/02
to
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in message news:<mq4e0vock401or6fd...@4ax.com>...

> dit...@ix.netcom.com (David Cattarin) might (or might not) have
> written this on (or about) 21 Dec 2002 16:06:15 -0800, :
>
> >"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in message news:<iv470vcjfmfng0nv0...@4ax.com>...
> >> dit...@ix.netcom.com (David Cattarin) might (or might not) have
> >> written this on (or about) 19 Dec 2002 12:16:30 -0800, :
> >>
> >> >A diagram is worth a thousand lines of code. ;)
> >>
> >> Until you need something that executes.
> >
> >Of course! It certainly isn't a replacement. Diagrams, UML or
> >otherwise, are the best way to document a system at a high level.
> >After all, you honestly wouldn't present an analysis to a customer
> >using thousands of lines of code, would you?
>
> I doubt I'd use much UML either.

The types of diagrams that I choose depend on the audience. Good
diagrams, however, make the communication process easier. At times,
UML can be clumsy.

> Customers aren't usually interested
> in technical documentation.

Keyword: Usually.

> Indeed, the best presentation I can make
> to a customer is a demonstration of some part of the system.

Without doubt, nothing gives the customer a nice warm-fuzzy feeling
like a demo that is successful, especially when it is on equipment
they control.

Customers are temperamental beasts and you have to cater to their
level of understanding. When dealing with a tech-savvy group, UML
documentation and demos can work well; with others, demos are
good, but you still need some additional documentation for
explanation.

In any case, almost all documentation delivered to a customer includes
different kinds of diagrams (not counting screen shots) that help to
explain how the system works.

Dave

Universe

unread,
Dec 23, 2002, 3:21:53 PM12/23/02
to
David Cattarin wrote:

> "Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in message news:<mq4e0vock401or6fd...@4ax.com>...

>> dit...@ix.netcom.com (David Cattarin) might (or might not) have
>>

>> >"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in message news:<iv470vcjfmfng0nv0...@4ax.com>...

>> >> dit...@ix.netcom.com (David Cattarin) might (or might not) have
>> >>

>> >> >A diagram is worth a thousand lines of code. ;)
>> >>
>> >> Until you need something that executes.
>> >
>> >Of course! It certainly isn't a replacement. Diagrams, UML or
>> >otherwise, are the best way to document a system at a high level.
>> >After all, you honestly wouldn't present an analysis to a customer
>> >using thousands of lines of code, would you?
>>
>> I doubt I'd use much UML either.

> The types of diagrams that I choose depend on the audience. Good
> diagrams, however, make the communication process easier. At times,
> UML can be clumsy.

>> Customers aren't usually interested
>> in technical documentation.

He'll *never* get it. Another penultimate example of "pragmasoft"
c..pola and its tired butt, oh so lame mutterings and ways of doing
things. Jeez Louise.

Showing customers an overall high-level conceptual view of a system - its
major domain-related elements, how they interact, and how they work to
fulfill use case requirements - is "technical" in the compsci, sw-eng
sense only minimally.

With good object modelling notation, such diagrams are typically a very
effective and often highly intuitive way to communicate and dialog with
the customers.

Elliott
--
http://www.radix.net/~universe ~*~ Enjoy! ~*~
Hail OO Modelling! * Hail the Wireless Web!
@Elliott 2002 my comments ~ newsgroups+bitnet OK

Topmind

unread,
Dec 23, 2002, 8:02:08 PM12/23/02
to


I find it depends on the person. Some relate to diagrams
(and differently to different kinds of diagrams), others
to screen shots, others to report samples, etc. One person
was not "getting" something until I started using
laundry analogies. Time for ULL (Unified Laundry Language)?


>
> Elliott
> --
> http://www.radix.net/~universe ~*~ Enjoy! ~*~

-T-

Warren Zeigler

unread,
Dec 23, 2002, 11:16:27 PM12/23/02
to
"Topmind" <top...@technologist.com> wrote in message
news:MPG.18715244a...@news.earthlink.net...

> I find it depends on the person. Some relate to diagrams
> (and differently to different kinds of diagrams), others
> to screen shots, others to report samples, etc. One person
> was not "getting" something until I started using
> laundry analogies. Time for ULL (Unified Laundry Language)?

Well put. Some people are auditory learners, some visual (diagrams), some
learn best in other ways. UML is one form of communication - possibly the best for
high-level discussions for those who use it, but it is just one of many
tools for communication.

Use what works best for the situation.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 23, 2002, 11:23:32 PM12/23/02
to
"JXStern" <JXSternC...@gte.net> wrote in message
news:time0v02g06g63haa...@4ax.com...

> Any Cobol programmer who created even CICS/3270 applications probably
> knows more about structuring a GUI than most OO practitioners.

Why do you say this? Business math has little to do with event-driven
programming.

> The real issue is that early programming
> languages went to some lengths, in both theory and practice, to
> separate applications development from such systems issues. It was
> never entirely a successful effort, of course, and that battle
> continues today in language and tool design. But thirty years ago, it
> was at least a minor triumph of tool design that Cobol programmers
> didn't have to know all this stuff. It's a big trip in the wayback
> machine to tell them they have to know it now.

You have the essence of the difference, but I disagree that it is a step
back. The question is: What was the language developed for? COBOL was for
business financials. Java, C++ and .NET are targeted at a MUCH broader
scope of application. It is easy to isolate from the machine when you have
a clear, narrow target, like COBOL had.

> But there is a huge irony in trying to present this stuff to them as new.

I have taught many COBOL programmers - along with others - in the process of
evolving the class. I have dropped more than half of the original class -
and added that much again, and more - according to pragmatic demand. I can
tell you that most COBOL programmers did not learn, or forgot, several basic
issues.
--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

unread,
Dec 23, 2002, 11:29:02 PM12/23/02
to
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
message
> This is exactly the opposite of my experience. (Which just tells me
> that there's no one true path.) In my experience code crystallizes
> understanding. We can wave our hands about object and relationships
> and abstraction, and we can make it all sound warm and fuzzy. (I'm
> not saying that this is what *you* do). However, until you show some
> code, nobody really understands anything. Once you show some code,
> everybody goes: "Oh! That's what you meant!"

I would say that working w/ code SEEMS to crystallize understanding. The
student can quickly do exercises, but they LOSE:
1) The connection to the real world, which hampers analysis and design
2) The ability to take the concept to other languages
3) The ability to see variations on a theme, since they see a narrow
implementation of a broad concept, instead of the broad concept
4) From #3, the ability to easily use it in a creative situation, since
learning from code is more cookbook learning, instead of learning the "why"
and letting the student work from there to many possible implementations.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Shayne Wissler

unread,
Dec 23, 2002, 11:34:47 PM12/23/02
to

"Warren Zeigler" <warren...@attbi.com> wrote in message
news:ioRN9.486029$NH2.33830@sccrnsc01...

> "Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
> message
> > This is exactly the opposite of my experience. (Which just tells me
> > that there's no one true path.) In my experience code crystallizes
> > understanding. We can wave our hands about object and relationships
> > and abstraction, and we can make it all sound warm and fuzzy. (I'm
> > not saying that this is what *you* do). However, until you show some
> > code, nobody really understands anything. Once you show some code,
> > everybody goes: "Oh! That's what you meant!"
>
> I would say that working w/ code SEEMS to crystallize understanding. The
> student can quickly do exercises, but they LOSE:
> 1) The connection to the real world, which hampers analysis and design

Imagine that. Knowing how to program makes you lose contact with the real
world.

> 2) The ability to take the concept to other languages

Knowing how to express the concept in one language hampers your ability to
express it in others? So I guess the ideal designer would forever be
language ignorant.

> 3) The ability to see variations on a theme, since they see a narrow
> implementation of a broad concept, instead of the broad concept

What a high opinion you have of your students.

> 4) From #4, the ability to easily use it in a creative situation, since
> learning from code is more cookbook learning, instead of learning the
> "why" and letting the student work from there to many possible
> implementations.

Ditto.

Warren Zeigler

unread,
Dec 23, 2002, 11:35:51 PM12/23/02
to
"Shayne Wissler" <thal...@yahoo.com> wrote in message
news:7x8N9.59042$qF3.4949@sccrnsc04...

You missed the point. While discussing the real world, diagrams of the real
world are introduced. Simulations are the source of the comp sci concepts
for OO, and we follow this line, just like most science texts follow the
origin of an idea down to the equations that resulted.

NO science just starts w/ equations. Look at college texts. It starts with
context and progresses from there. To say that not starting w/ code is not
software engineering is to say somehow that software is different from all
other engineering, science or art. Sorry, but to learn FIRST from code
distorts and hampers the science, forcing each student to reverse-engineer.
Most do not get it right, and it is stupid to try to have everyone re-invent
the wheel through reverse engineering. This is not education. It is
laziness. (Here is my code, a little demonstrating this and a little
demonstrating that. Go figure out the principles yourself.)

--
Warren Zeigler
wzei...@UnderstandingObjects.com


JXStern

unread,
Dec 24, 2002, 1:22:50 PM12/24/02
to
On Tue, 24 Dec 2002 04:23:32 GMT, "Warren Zeigler"
<warren...@attbi.com> wrote:
>> Any Cobol programmer who created even CICS/3270 applications probably
>> knows more about structuring a GUI than most OO practitioners.
>
>Why do you say this? Business math has little to do with event-driven
>programming.

Are you familiar with what CICS is or what a 3270 is? Hey, are you
familiar with what a GUI is? Event-driven programming is a means to
an end, y'know, and that end is already familiar to a whole lot of
Cobol programmers. Strictly speaking, there might well be tools in
the Java space that make GUI design an automagic process so that they
wouldn't have to know anything about events.

>You have the essence of the difference, but I disagree that it is a step
>back. The question is: What was the language developed for? COBOL was for
>business financials. Java, C++ and .NET are targeted at a MUCH broader scope
>of application. It is easy to isolate from the machine when you have a
>clear, narrow target, like COBOL had.

Ah yes, but what if I want to do business financials in Java etc? As
probably most of your Cobol-to-OO people will?

>I have taught many COBOL programmers - along with others - in the process of
>evolving the class. I have dropped more than half of the origional class -
>and added that much again - and more, according to pragmatic demand. I can
>tell you that most COBOL programmers did not learn or forgot several basic
>issues.

Sure, the average practitioner hasn't cracked a book or taken a class
in years and years, will have forgotten tons of stuff. At least, that
was true twenty years ago, back in Cobol days. The gotta-do-new-stuff
that you see in modern programmers was present, but at a much lower
level, IIRC. The industry was changing much more slowly, so there
wasn't the panic of running-as-fast-as-you-can-just-to-stay-in-the-
same-place. There wasn't even any huge pride in knowing a thousand
obscure tips and techniques, but there was a much larger interest in
analysis, design, and, dare I say it, completion and deployment. I
presume these old cultural trends are still mostly present in
present-day Cobol'ers. These kinds of cultural things are probably
greater problems in having them move to objects, than is any single
technical matter, or maybe even all the technical matters combined.

J.

Warren Zeigler

unread,
Dec 24, 2002, 2:23:24 PM12/24/02
to
"JXStern" <JXSternC...@gte.net> wrote in message
news:dv8h0v8n0bfrpdjhj...@4ax.com...

> Are you familiar with what CICS is or what a 3270 is? Hey, are you
> familiar with what a GUI is? Event-driven programming is a means to
> an end, y'know, and that end is already familiar to a whole lot of
> Cobol programmers. Strictly speaking, there might well be tools in
> the Java space that make GUI design an automagic process so that they
> wouldn't have to know anything about events.


I was an MVS, VTAM and NCP system programmer in an IMS shop.


> Ah yes, but what if I want to do business financials in Java etc? As
> probably most of your Cobol-to-OO people will?

No problem, but you pay a price for the flexibility of Java. The more
flexible, the more complex.

> Sure, the average practitioner hasn't cracked a book or taken a class
> in years and years, will have forgotten tons of stuff. At least, that
> was true twenty years ago, back in Cobol days. The gotta-do-new-stuff
> that you see in modern programmers was present, but at a much lower
> level, IIRC. The industry was changing much more slowly, so there
> wasn't the panic of running-as-fast-as-you-can-just-to-stay-in-the-
> same-place. There wasn't even any huge pride in knowing a thousand
> obscure tips and techniques, but there was a much larger interest in
> analysis, design, and, dare I say it, completion and deployment. I
> presume these old cultural trends are still mostly present in
> present-day Cobol'ers. These kinds of cultural things are probably
> greater problems in having them move to objects, than is any single
> technical matter, or maybe even all the technical matters combined.

Yes, and this is the important point - there are a lot of these people out
there, valuable because of the experience they have: domain, analysis, and
many of the demands of coding. This is the type that often has problems w/
the transition to OO.


Warren Zeigler

unread,
Dec 24, 2002, 2:32:21 PM12/24/02
to
"Shayne Wissler" <thal...@yahoo.com> wrote in message
news:HtRN9.440676$%m4.1...@rwcrnsc52.ops.asp.att.net...

>
> "Warren Zeigler" <warren...@attbi.com> wrote in message
> > 1) The connection to the real world, which hampers analysis and design
>
> Imagine that. Knowing how to program makes you lose contact with the real
> world.

There is a big difference between understanding a real-world problem and
having the skills to translate this to a quality analysis, design, or
implementation.

>
> > 2) The ability to take the concept to other languages
>
> Knowing how to express the concept in one language hampers your ability to
> express it in others? So I guess the ideal designer would ever be language
> ignorant.

If the concept is only understood in a language context, that context:
1) Is often a subset of the full concept
2) Alters the understanding w/ language constructs

> > 3) The ability to see variations on a theme, since they see a narrow
> > implementation of a broad concept, instead of the broad concept
>
> What a high opinion you have of your students.

They are students because they want to learn. To only give a narrow set of
concepts would be poor teaching, not an insult to students.

> > > 4) From #4, the ability to easily use it in a creative situation, since
> > > learning from code is more cookbook learning, instead of learning the
> > > "why" and letting the student work from there to many possible
> > > implementations.
>
> Ditto.

In this day, the teacher is paid to do the best possible to get the student
ready for productive work, not to send them to the bench with a whole bunch
of problems to work out. You seem to advocate poor teaching, then show a
lack of understanding of education. Are you a trainer? Where are you coming
from, with all of these comments, showing such ignorance of an area where
you are insulting others?

JXStern

unread,
Dec 24, 2002, 2:56:57 PM12/24/02
to
On Tue, 24 Dec 2002 19:23:24 GMT, "Warren Zeigler"
<warren...@attbi.com> wrote:

>"JXStern" <JXSternC...@gte.net> wrote in message
>news:dv8h0v8n0bfrpdjhj...@4ax.com...
>> Are you familiar with what CICS is or what a 3270 is? Hey, are you
>> familiar with what a GUI is? Event-driven programming is a means to
>> an end, y'know, and that end is already familiar to a whole lot of
>> Cobol programmers. Strictly speaking, there might well be tools in
>> the Java space that make GUI design an automagic process so that they
>> wouldn't have to know anything about events.
>
>I was an MVS, VTAM and NCP system programmer in an IMS shop.

I see. Were you ever an applications developer in Cobol or OO?

>> Ah yes, but what if I want to do business financials in Java etc? As
>> probably most of your Cobol-to-OO people will?
>
>No problem, but you pay a price for the flexability of Java. The more
>flexible, the more complex.

Not sure that follows.

And if it does, I'm not sure flexibility is a virtue.

I mean, if I loved flexible and complex, wouldn't I program in raw
bits?

>Yes, and this is the important point - there are a lot of these people out
>there, valuable because of the experience they have: domain, analaysis, and
>many of the demands of coding. This is the type that often has problems w/
>the transition to OO.

Well yeah, if they can see that the virtues of OO as they are so often
preached are completely at odds with everything they know, then it
must seem that they are being taught to do things the hard way. Add
to that the natural resistance to change, and you've got a handful,
alrighty.

J.

Shayne Wissler

Dec 24, 2002, 3:05:17 PM

"Warren Zeigler" <warren...@attbi.com> wrote in message
news:9D2O9.449001$P31.153548@rwcrnsc53...

> "Shayne Wissler" <thal...@yahoo.com> wrote in message
> news:HtRN9.440676$%m4.1...@rwcrnsc52.ops.asp.att.net...
> >
> > "Warren Zeigler" <warren...@attbi.com> wrote in message
> > > 1) The connection to the real world, which hampers analysis and design
> >
> > Imagine that. Knowing how to program makes you lose contact with the
> > real world.
>
> There is a big difference between understanding a real-world problem and
> having the skills to translate this to a quality analysis, design, or
> implementation.

By what definition of analysis?

> > > 2) The ability to take the concept to other languages
> >
> > Knowing how to express the concept in one language hampers your ability
> > to express it in others? So I guess the ideal designer would ever be
> > language ignorant.
>
> If the concept is only understood in a language context

Strawman. No one is proposing only understanding with a language context.

> > > 3) The ability to see variations on a theme, since they see a narrow
> > > implementation of a broad concept, instead of the broad concept
> >
> > What a high opinion you have of your students.
>
> They are students because they want to learn. To only give a narrow set of
> concepts would be poor teaching, not an insult to students.

Another strawman. No one is proposing a restriction on how broad of concepts
to teach.

> > > 4) From #4, the ability to easily use it in a creative situation, since
> > > learning from code is more cookbook learning, instead of learning the "why"
> > > and letting the student work from there to many possible implementations.
> >
> > Ditto.
>
> In this day, the teacher is paid to do the best possible to get the student
> ready for productive work, not to send them to the bench with a whole bunch
> of problems to work out.

Strawman.

> You seem to advocate poor teaching, then show a
> lack of understanding of education.

Correction: your strawman advocates poor teaching.

> Are you a trainer? Where are you coming
> from, with all of these comments, showing such ignorance of an area where
> you are insulting others?

Ad hominem.

Warren Zeigler

Dec 24, 2002, 5:50:22 PM
"JXStern" <JXSternC...@gte.net> wrote in message
news:lheh0vstmjt9bf79e...@4ax.com...

> I see. Were you ever an applications developer in Cobol or OO?

6 years w/ C++, one w/ Delphi and now a few w/ Java. No COBOL.

--
Warren Zeigler
wzei...@UnderstandingObjects.com

Topmind

Dec 24, 2002, 6:32:02 PM
> >
> > I would say that working w/ code SEEMS to crystallize understanding. The
> > student can quickly do exercises, but they LOSE:
> > 1) The connection to the real world, which hampers analysis and design
>
> Imagine that. Knowing how to program makes you lose contact with the real
> world.
>

That was my experience WRT dating :-)

-T-

Karl Kiesel

Dec 27, 2002, 5:33:03 AM

Wolfgang Formann wrote:

> This is my understanding of COBOL, okay it is 15 years old, hopefully the
> language has evolved, I did not use it for that time.

Yes, COBOL has evolved: a new standard was approved recently! You can
still download the final draft of the new standard at
http://www.incits.org/tc_home/j4.htm
for free (the official document costs some $300, but the only differences
are in numbering and copyright).
To get a first impression of the language constructs defined for OO
support in COBOL, I think it is best to start with the Concepts section,
E.17, on page 768.

regards K. Kiesel

Ron

Dec 27, 2002, 12:22:40 PM
Well, you know...there is such a thing as Object Oriented
COBOL. Which, by the way, in my opinion is implemented in
a much more straightforward, easily understood, direct
fashion than the deliberately arcane, obfuscated C++ and
JAVA. Why don't you teach OO concepts to us COBOL
programmers USING COBOL?! Then we WON'T HAVE major
problems understanding syntax.

The other thing is most classes such as this start using
OO terms without giving precise definitions of what they
mean. I have never yet read an actual, clear-cut definition
of what an Object really is. When you start throwing out
terms like class, method, object, inheritance
and so on without clear explanations it just confuses us.
We cannot get our mind around these terms because the terms
are not defined except as in examples. What IS a class?
What IS an object? What IS a method? We automatically
attempt to relate these terms to what we are used to - such
as: class = subroutine, method = paragraph, inheritance =
passing parms to the subroutine. No wonder we are lost.
Just saying things like "..then you instantiate the object"
and showing an example in some bizarre code throws COBOL
programmers for a loop. Oh you mean..."...then you call the
subroutine"? Well not really. COBOL programmers in general are
very capable of 'getting it'. In my opinion it's rarely taught
in a manner that we understand.
My suggestions are:

1. Tell us WHY THIS IS BETTER. If we understand that, we'll be
converts from the get-go. Why is it better to 'INVOKE'
an object than it is to 'CALL' a subroutine? What is the
advantage to us of these concepts?
2. Tell us WHAT ALL THESE TERMS MEAN. Give clear, precise
definitions of these things. Don't just show us examples
and expect us to get it. Then....
3. Show us IN COBOL how to do it. Use COBOL examples and
COBOL syntax. Then we won't have to fight syntax.
4. Then....Help us move on to other languages.
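For what it's worth, the mapping Ron asks for fits in a dozen lines. A minimal C++ sketch (the `Account` type and all its names are invented for illustration, and the COBOL analogies in the comments are rough, not exact):

```cpp
// A *class* is a type definition: data plus the code that works on it.
// (Closest COBOL analogy: a sub-program's WORKING-STORAGE plus its
// paragraphs, packaged as one unit -- but a class is a *type*, not a
// single callable thing.)
class Account {
public:
    explicit Account(long openingCents) : balanceCents(openingCents) {}

    // A *method* is a procedure that belongs to the class and runs
    // against one particular object's data. (Rough analogy: a
    // paragraph, except it always knows "which record" it works on.)
    void deposit(long cents) { balanceCents += cents; }

    long balance() const { return balanceCents; }

private:
    // Each *object* gets its own private copy of this data.
    long balanceCents;
};

// To *instantiate* is to create one object -- one live copy of the
// class's data. Unlike CALLing a subroutine, you can have many at once:
inline long demo() {
    Account checking(10000);   // one object
    Account savings(50000);    // a second, independent object
    checking.deposit(2500);    // roughly "INVOKE checking 'deposit'..."
    return checking.balance() + savings.balance();
}
```

The key difference from `CALL`: each object carries its own working storage, so two `Account`s can be live at the same time without one sub-program's data area serving both.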

Warren Zeigler

Dec 27, 2002, 4:39:39 PM
THANKS! I have downloaded it. OO COBOL is probably the most used OO language
that I don't have code examples for in my manual. That will change now.

--
Warren Zeigler
wzei...@UnderstandingObjects.com

"Karl Kiesel" <Karl....@fujitsu-siemens.com> wrote in message
news:3E0C2C5F...@fujitsu-siemens.com...

Warren Zeigler

Dec 27, 2002, 4:41:11 PM
I am listening, and am going to add OO COBOL examples.

As for terminology, I agree. If you were one of my students, you would not
have that problem.

--
Warren Zeigler
wzei...@UnderstandingObjects.com

"Ron" <No...@swbell.net> wrote in message
news:aui291$ij1$1...@ngspool-d02.news.aol.com...

Richard

Dec 27, 2002, 4:53:12 PM
"Warren Zeigler" <warren...@attbi.com> wrote

> It has been rumored for some time: COBOL programmers have more problems than
> others moving to object-oriented programming.

One major reason for this is that much of 'OO Programming' solves
problems that Cobol programmers don't have.

Business programming mainly involves dealing with one thing at a time.
To process a series of sales orders we process the next order header,
read the customer, then process each order line in turn. Much of the
'encapsulation' is related to being able to deal with several similar
objects simultaneously in an easy manner - problems dealt with in
Cobol normally don't need that.

Compared to C, C++ allows the creation of user types that can be
handled as if they were built in types. To implement, for example, a
fixed point decimal money type in C would require that there be a set
of functions to manipulate it. There would be functions to add two
together, to multiply by an int, to convert to string etc. For a
similar type with 4 decimals there would be a completely new set of
functions.

Classes with overloaded functions 'solves' all those problems in C.
But these are problems that are already solved in Cobol for all types
of Business applications. There is no need for 'overloading' because
the Cobol types already allow for this.
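To make the money-type point concrete, here is a bare-bones sketch of the C++ side (two decimal places only, no rounding, no mixed-scale or negative-amount handling -- a toy, not a production type):

```cpp
#include <string>
#include <cstdio>

// In C this would be money_add(a, b), money_mul_int(a, n),
// money_to_string(a) -- and a second, complete set of functions for a
// 4-decimal type. The C++ class lets the type behave like a built-in.
class Money2 {                      // fixed point, 2 decimal places
public:
    explicit Money2(long cents) : cents_(cents) {}

    // Overloaded operators: the "new set of functions", but usable
    // with ordinary arithmetic syntax.
    Money2 operator+(Money2 rhs) const { return Money2(cents_ + rhs.cents_); }
    Money2 operator*(int n) const      { return Money2(cents_ * n); }

    std::string str() const {
        char buf[32];
        std::snprintf(buf, sizeof buf, "%ld.%02ld", cents_ / 100, cents_ % 100);
        return buf;
    }

    long cents() const { return cents_; }

private:
    long cents_;
};
```

Which is exactly Richard's point: `PIC 9(7)V99` gives a COBOL programmer all of this for free, so the overloading machinery solves a problem they never had.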

Object creation and destruction take care of all (or most) of the
problems that come with C's low level memory management, most of which
is a direct result of C's process model. For example if there is a
malloc() done in a function then this must be free()ed on exit to
avoid a leak. C++ solves that with destructors, Java solves it with
garbage collection, Cobol never had the problem, sub-programs may be
CALLed and CANCELed and the memory is managed without leaks or wild
pointers.
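The destructor point can be shown in a few lines (the `Buffer` type is invented for illustration):

```cpp
#include <cstddef>

// C version: every early "return" path must remember to free(p).
// C++ version: the destructor runs automatically on every exit path,
// much as CANCEL releases a COBOL sub-program's storage.
class Buffer {
public:
    explicit Buffer(std::size_t n) : data_(new char[n]), size_(n) {}
    ~Buffer() { delete[] data_; }          // runs on every exit path

    // Housekeeping: forbid copies so no accidental double delete.
    Buffer(const Buffer&) = delete;
    Buffer& operator=(const Buffer&) = delete;

    std::size_t size() const { return size_; }

private:
    char* data_;
    std::size_t size_;
};

inline std::size_t useBuffer(bool bailEarly) {
    Buffer b(1024);
    if (bailEarly) return 0;   // no leak: ~Buffer() still runs here
    return b.size();           // ...and here
}
```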

While C programmers can see that C++ and Java solve the problems that
they face everyday, Cobol programmers simply don't have those
problems.

> While teaching Understanding Objects, I have seen this problem. More than
> 3/4 of the students that were coming from COBOL had serious problems in
> class.

Because they kept asking "What the f**k is the use of that ?". The
'solutions' that you present just do not equate to any 'problems' that
Cobol programmers have.

> The end result is that COBOL programmers take longer to comprehend the
> complex syntax. It is simply a matter of practice.

First you have to convince them that there _is_ a problem by teaching
them C and Pascal, and the applications that these languages are more
often used for, then you have to teach them the solution which is C++
or Java.

It isn't that C syntax is 'too complex' for their brains, it is just
that it is completely pointless until you show them the problems that
_other_ languages have.

Warren Zeigler

Dec 27, 2002, 9:41:59 PM
"Richard" <rip...@Azonic.co.nz> wrote in message
news:217e491a.02122...@posting.google.com...

> "Warren Zeigler" <warren...@attbi.com> wrote
>
> > It has been rumored for some time: COBOL programmers have more problems than
> > others moving to object-oriented programming.
>
> One major reason for this is that much of 'OO Programming' solves
> problems that Cobol programmers don't have.

Interesting proposition.

> Business programming mainly involves dealing with one thing at a time.
> To process a series of sales orders we process the next order header,
> read the customer, then process each order line in turn. Much of the
> 'encapsulation' is related to being able to deal with several similar
> objects simultaneously in an easy manner - problems dealt with in
> Cobol normally don't need that.

Really?

I agree that one of the core issues of OO is the handling of multiplicity.
Your example is a perfect example of multiplicity: line items in an order. I
assume that you are saying "each order line in turn" as if, for each line,
you are using the database index, reading the line, processing, then going
to the database for the next line. This does not eliminate multiplicity, it
just has the database handle it.
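To put "the program, rather than the database, handling multiplicity" in concrete terms, a sketch (the `OrderLine` type is invented; C++ used for brevity):

```cpp
#include <vector>

// Instead of re-reading each line from the database by index, the
// program can hold all of an order's lines in memory at once and
// work on the collection directly.
struct OrderLine {
    int  quantity;
    long unitPriceCents;
};

inline long orderTotalCents(const std::vector<OrderLine>& lines) {
    long total = 0;
    for (const OrderLine& line : lines)
        total += line.quantity * line.unitPriceCents;
    return total;
}
```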

> Compared to C, C++ allows the creation of user types that can be
> handled as if they were built in types. To implement, for example, a
> fixed point decimal money type in C would require that there be a set
> of functions to manipulate it. There would be functions to add two
> together, to multiply by an int, to convert to string etc. For a
> similar type with 4 decimals there would be a completely new set of
> functions.
>
> Classes with overloaded functions 'solves' all those problems in C.
> But these are problems that are already solved in Cobol for all types
> of Business applications. There is no need for 'overloading' because
> the Cobol types already allow for this.
>
> Object creation and destruction take care of all (or most) of the
> problems that come with C's low level memory management, most of which
> is a direct result of C's process model. For example if there is a
> malloc() done in a function then this must be free()ed on exit to
> avoid a leak. C++ solves that with destructors, Java solves it with
> garbage collection, Cobol never had the problem, sub-programs may be
> CALLed and CANCELed and the memory is managed without leaks or wild
> pointers.
>
> While C programmers can see that C++ and Java solve the problems that
> they face everyday, Cobol programmers simply don't have those
> problems.

Because COBOL programmers let the database handle multiplicity, even in
cases where this is much slower. In fact, the problem exists, just in a
different form: Database growth (if the rows are not deleted) and
fragmentation over time.

> > While teaching Understanding Objects, I have seen this problem. More than
> > 3/4 of the students that were coming from COBOL had serious problems in
> > class.
>
> Because they kept asking "What the f**k is the use of that ?". The
> 'solutions' that you present just do not equate to any 'problems' that
> Cobol programmers have.
>
> > The end result is that COBOL programmers take longer to comprehend the
> > complex syntax. It is simply a matter of practice.
>
> First you have to convince them that there _is_ a problem by teaching
> them C and Pascal, and the applications that these languages are more
> often used for, then you have to teach them the solution which is C++
> or Java.

That is not my problem. I supply training to businesses that have made this
determination and are already moving to objects.

> It isn't that C syntax is 'too complex' for their brains, it is just
> that it is completely pointless until you show them the problems that
> _other_ languages have.

I never said "for their brains." It's just something that EVERYONE has to
practice a little. It is not a bad learning curve, but it exists, and if it
is ignored, it interferes w/ other training. It's just a matter of taking it
into account.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Topmind

Dec 27, 2002, 11:12:24 PM
> > > It has been rumored for some time: COBOL programmers have more problems than
> > > others moving to object-oriented programming.
> >
> > One major reason for this is that much of 'OO Programming' solves
> > problems that Cobol programmers don't have.
>
> Interesting proposition.
>
> > Business programming mainly involves dealing with one thing at a time.
> > To process a series of sales orders we process the next order header,
> > read the customer, then process each order line in turn. Much of the
> > 'encapsulation' is related to being able to deal with several similar
> > objects simultaneously in an easy manner - problems dealt with in
> > Cobol normally don't need that.
>
> Really?
>
> I agree that one of the core issues of OO is the handling of multiplicity.
> Your example is a perfect example of multiplicity: line items in an order. I
> assume that you are saying "each order line in turn" as if, for each line,
> you are using the database index, reading the line, processing, then going
> to the database for the next line. This does not eliminate multiplicity, it
> just has the database handle it.

Kaching!

Databases are slow? That is also what some say about OOP.
Fragmentation is usually *not* a problem
that app developers have to worry about. Often the DB takes
care of it itself using automatic periodic maintenance. And,
even if it does not do it all by itself, it is usually the
worry of the DBA, not the app developer.

>
> > > While teaching Understanding Objects, I have seen this problem. More than
> > > 3/4 of the students that were coming from COBOL had serious problems in
> > > class.
> >
> > Because they kept asking "What the f**k is the use of that ?". The
> > 'solutions' that you present just do not equate to any 'problems' that
> > Cobol programmers have.
> >
> > > The end result is that COBOL programmers take longer to comprehend the
> > > complex syntax. It is simply a matter of practice.
> >
> > First you have to convince them that there _is_ a problem by teaching
> > them C and Pascal, and the applications that these languages are more
> > often used for, then you have to teach them the solution which is C++
> > or Java.
>
> That is not my problem. I supply training to businesses that have made this
> determination and are already moving to objects.

What about recommending that the COBOLers sharpen their
database skills? Otherwise they may expect objects to boost
them where it may not make much difference.

>
> > It isn't that C syntax is 'too complex' for their brains, it is just
> > that it is completely pointless until you show them the problems that
> > _other_ languages have.
>
> I never said "for their brains." It's just something that EVERYONE has to
> practice a little. It is not a bad learning curve, but it exists, and if it
> is ignored, it interferes w/ other training. It's just a matter of taking it
> into account.


However, the issue may not really be the benefits or downfalls
of COBOL, but rather just understanding where the market is.
The market is sometimes stupid. It over-chases fads then
over backs-off. (For the sake of argument, assume it may be
a fad.) But, those fads are a form of inter-developer
communication and vendors weave a lot of products around those
fads. Thus, one is locked in for good or bad. It is a form
of "when in Rome, do what the Romans are doing".

That way students or the purchasers are less likely to
be let down if benefits don't pour forth.

That way you are less likely to have sassy, pissed
students that are reminiscent of me.

I am not saying criticize OO in front of clients
and students, just say that looking
for benefits is a moot issue at this point.
The industry wrapped too much around it already.
English may not be the ideal language, but it
is the industry standard for biz, for good or
bad.

>
> --
> Warren Zeigler
> wzei...@UnderstandingObjects.com
>

-T-

Peter E. C. Dashwood

Dec 28, 2002, 7:32:20 AM
Having spent the last 37 years writing COBOL and currently using OO
COBOL along with Java and all the Web Stuff, I find this thread
extremely insightful. The original poster put his finger on a number
of problems which I have also found when teaching OO concepts to COBOL
people.

I recommend that they learn Java (because it is DESIGNED around OO and
it is useful in today's environment) as a means to grasp OO concepts.

More comments below...

Karl Kiesel <Karl....@fujitsu-siemens.com> wrote in message news:<3E0C2C5F...@fujitsu-siemens.com>...

The OO section of this standard has not been approved.

Certainly, COBOL has evolved. For some years now both Fujitsu and
MicroFocus have provided their own implementations of OO COBOL
compilers. (The Fujitsu one is very close to the proposed standard...)

For over 2 years now I have been using OO COBOL to create ActiveX COM
components and, more recently, wrapping these in SOAP to create Web
Services. I have had no problem in deploying these components to the
desktop under Windows and to the Web, neither do they have problems
co-existing with components developed in other languages.

If you are interested in this, I wrote an article on it last year.
Here's the link: http://66.166.131.10/Archives/V3/V30201.asp

It is a long way from Batch processing on IBM mainframes <G>.

The main problems with OO COBOL are that it is too verbose, and it was
"bolted on" to COBOL (although they did an excellent job of this...).
It is a bewildering experience to try and pick up the syntax and the
OO concepts at the same time.

However, people who already understand OO will have less problem with
it; hence, I recommend Java first.

The COBOL community has not rushed to embrace OO COBOL and support for
it is being withdrawn in some quarters. As it is becoming clearer that
the future of programming lies with Objects, this can only augur
ill for the future of COBOL.

If you would like to see some sample OO COBOL code, with an
explanation, here's another link:

http://www.aspalliance.com/aldotnet/examples/coboldotnet.aspx

Pete.

Rick S.

Dec 28, 2002, 12:26:06 PM
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote

> The sheer number of concepts that a COBOL programmer has to learn in
> order to come up to speed in a C++, Java, or C# environment is
> daunting. It includes:
>
> - Dynamic memory management
> - Recursion
> - Interrupts
> - Threads
> - Functions
> - Local variables
> - References
> - Block structure
> - Stack Frames

Oddly too, I have been working as a COBOL programmer since January of
1989. I did manage to escape from it for about 18 months around 97-98,
for a while then I was working with C, but I had always wanted to work
in Europe. So with the Year2000 bug, and the Euro, I was able to move
to France in May of 1998, but it required moving back to working with
mainframes and COBOL.

I can't really complain. Getting a different perspective on life, et
cetera. In February I should have my French "green card".

But when I think about what I would like to do next, it isn't exactly object
oriented programming. In fact, part of me would like to escape from
programming entirely.

Outside of pointers, in C/C++, I don't think there's anything really that
hard in your list above. The hardest thing is sitting in a chair for
about 40 hours a week, in front of a screen. Although between June
of 2000 and last September I was working on a set of applications
whose business logic was very complicated. Sometimes even understanding
what the system does is complicated.

Even C/C++ pointers aren't hard in the textbook sample programs, but once
you start dealing with them daily, it's no fun, I would say.

I realize this is mostly a technical discussion, and I apologize for these
non-technical comments, but just the same, I wonder about all of this. In
the 90s I had an interest in object oriented programming, and I wanted
very much to do it, but now it's hard for me to get super excited about
the idea of earning money via programming. I still do it, but I just
am not really excited about it. And my point is, maybe there are a lot
of COBOL programmers like that.

Some days "outdoor work" sounds so nice. No stuffy office air, et cetera.

Sorry for these quasi-off topic comments. Also, I am not really sure
about all of this. It's just my opinion as of this evening. Maybe I
am just on the edge of catching my "second wind" in terms of programming,
et cetera.

Oh well ... rambling here... sorry.

Warren Zeigler

Dec 28, 2002, 1:04:05 PM
"Peter E. C. Dashwood" <dash...@enternet.co.nz> wrote in message
news:b3638c46.02122...@posting.google.com...

> If you are interested in this, I wrote an article on it last year.
> Here's the link: http://66.166.131.10/Archives/V3/V30201.asp

A few comments on your article:

I started another thread on the topic of "what is an object." In your
article, you talk about "instances of small, efficient, modules." This is an
excellent definition, and very close to what I teach.

In your history, you call Java Beans possibly the first component. Sun went
to Borland for help in developing Java Beans because of the success of
Delphi's VCL (Visual Component Library). Borland's components were the first
to be written in the same language that they ran in. (Visual BASIC added
this later - I think in 5). Delphi's internal name at Borland was VBK -
Visual BASIC Killer. The VCL was an answer to MS's Visual Basic 16 bit
components - written in C then C++, running in Visual BASIC.

> I recommend that they learn Java (because it is DESIGNED around OO and
> it is useful in today's environment) as a means to grasp OO concepts.

I now recommend Java, C# or Visual Basic.NET, depending on the programmer's
prior experience and current development environment.

> The main problems with OO COBOL are that it is too verbose, and it was
> "bolted on" to COBOL (although they did an excellent job of this...).
> It is a bewildering experience to try and pick up the syntax and the
> OO concepts at the same time.

Could you expand on this a little?
--
Warren Zeigler
wzei...@UnderstandingObjects.com

Warren Zeigler

Dec 28, 2002, 1:10:33 PM
"Rick S." <rick...@hotpop.com> wrote in message
news:21790eb9.02122...@posting.google.com...

> But when think about what I would like to next, it isn't exactly object
> oriented programming. In fact, part of me would like to escape from
> programming entirely.
>
> Outside of pointers, in C/C++, I don't think there's anything really that
> hard in your list above. The hardest thing is sitting in a chair for
> about 40 hours a week, in front of a screen.

> I realize this is mostly a technical discussion, and I apologize for these
> non-technical comments, but just the same, I wonder about all of this.

While your comments were not technical, as an "educational thread," student
motivation is a primary topic.

There are many poorly motivated students, including those:
1) Tired (like you)
2) Discouraged by C++ (a system language and overly complex for application
development)
3) In class because they were sent, and not really interested
4) Have heard that the transition is difficult, and don't have confidence
5) Are not ready/willing to go through a series of learning curves again

Learning takes effort. If the student is not ready, other issues become
irrelevant.

Warren Zeigler
wzei...@UnderstandingObjects.com


Richard

Dec 28, 2002, 2:59:27 PM
"Warren Zeigler" <warren...@attbi.com> wrote

> I agree that one of the core issues of OO is the handling of multiplicity.
> Your example is a perfect example of multiplicity: line items in an order. I
> assume that you are saying "each order line in turn" as if, for each line,
> you are using the database index, reading the line, processing, then going
> to the database for the next line. This does not eliminate multiplicity, it
> just has the database handle it.

Exactly. Business applications are based around holding data in
databases - there is little need for dynamic creation of memory
objects when these are required to be permanent records on disk
anyway.

> Because COBOL programmers let the database handle multiplicity, even in
> cases where this is much slower.

While storing data on disk may be 'slower' than in memory it has the
advantage of sharing this data and making it permanent. Business
applications may be operated by tens to hundreds of users, data items
need to be stored in a consistent manner in case recovery is needed.

Arguing that with OO it would be better to store all the order lines
as objects in memory while processing just completely misses the whole
point of business applications. As order lines are created they need
to allocate stock and this is recorded on the product master record
which everyone else must access in order to see current stock
availability. For many reasons it is necessary to see the lines that
make up the allocated stock (by some other user). This can only be
done if the order line is written to the database as it is entered and
when the stock is allocated to it.

In most cases OO solutions of creating data in memory just isn't
appropriate - it is a solution that Business Applications can't use to
problems Cobol programmers don't have.

> In fact, the problem exists, just in a
> different form: Database growth (if the rows are not deleted) and
> fragmentation over time.

Business Applications typically keep all their data for some years. I
have clients who have not done culls for several years. In some cases
this is a legal requirement - Invoices must be kept reproducible for
seven years.

> I never said "for their brains." It's just something that EVERYONE has to
> practice a little. It is not a bad learning curve, but it exists, and if it
> is ignored, it interferes w/ other training. It's just a matter of taking it
> into account.

Most Cobol programmers understand that decomplexification is a good
thing. C and C++ code tends to be over complex, not just because it
is combined by nesting function calls into expresions and parameters,
but also because of side effects (which Cobol avoids), short
circuiting, precedence and association.

C and C++ programmers must train themselves to follow rules and avoid
traps that the language allows. For example in C assignment is an
operator. This allows assignments within logical expressions.
However, one has to take into account precedence and short circuiting
when doing this otherwise results will not be as required. Cobol
simply does not allow these problems to arise.
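The trap described above is easy to show in a sketch (function names invented for illustration):

```cpp
// In C and C++, "=" is an operator that yields a value, so a dropped
// "=" in a comparison still compiles: it assigns, then tests the
// assigned value for truth. COBOL's "IF X = Y" cannot go wrong this way.
inline int misplacedAssignment(int y) {
    int x = 0;
    if (x = y)        // meant "x == y": assigns y to x, tests y != 0
        return x;     // taken whenever y is non-zero
    return -1;
}

inline int intendedComparison(int x, int y) {
    return (x == y) ? x : -1;   // what the author actually meant
}
```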

C may be more complex but Cobol is more reliable. OO may solve some
of C's problems but C++'s solution set does not map even close to
Cobol's problem set.

Cobol programmers don't 'get' OO because you fail to show them
anything that is useful to their needs. It is not because they can't
understand complex expressions, but because you fail to understand
their complex (business) environment.

Warren Zeigler

Dec 28, 2002, 3:34:27 PM
Comments inline

--
Warren Zeigler
wzei...@UnderstandingObjects.com


"Richard" <rip...@Azonic.co.nz> wrote in message
news:217e491a.02122...@posting.google.com...

> "Warren Zeigler" <warren...@attbi.com> wrote
>
> > I agree that one of the core issues of OO is the handling of multiplicity.
> > Your example is a perfect example of multiplicity: line items in an order. I
> > assume that you are saying "each order line in turn" as if, for each line,
> > you are using the database index, reading the line, processing, then going
> > to the database for the next line. This does not eliminate multiplicity, it
> > just has the database handle it.
>
> Exactly. Business applications are based around holding data in
> databases - there is little need for dynamic creation of memory
> objects when these are required to be permanent records on disk
> anyway.

SOME things need to be on disk. That doesn't mean that EVERY time a
programmer needs to handle multiplicity that placing it on disk is best.
That is a VERY unreasonable restriction.

>
> > Because COBOL programmers let the database handle multiplicity, even in
> > cases where this is much slower.
>
> While storing data on disk may be 'slower' than in memory it has the
> advantage of sharing this data and making it permanent. Business
> applications may be operated by tens to hundreds of users, data items
> need to be stored in a consistent manner in case recovery is needed.

Sometimes, not all of the time.

> Arguing that with OO it would be better to store all the order lines
> as objects in memory while processing just completely misses the whole
> point of business applications. As order lines are created they need
> to allocate stock and this is recorded on the product master record
> which everyone else must access in order to see current stock
> availability. For many reasons it is necessary to see the lines that
> make up the allocated stock (by some other user). This can only be
> done if the order line is written to the database as it is entered and
> when the stock is allocated to it.
>
> In most cases OO solutions of creating data in memory just isn't
> appropriate - it is a solution that Business Applications can't use to
> problems Cobol programmers don't have.

Some or most, but not all.

> > In fact, the problem exists, just in a
> > different form: Database growth (if the rows are not deleted) and
> > fragmentation over time.
>
> Business Applications typically keep all their data for some years. I
> have clients who have not done culls for several years. In some cases
> this is a legal requirement - Invoices must be kept reproducable for
> seven years.

And other applications are purged daily. Don't take the extreme and try to
make a rule out of it, then use it to deny flexibility!

> > I never said "for their brains." It's just something that EVERYONE has to
> > practice a little. It is not a bad learning curve, but it exists, and if it
> > is ignored, it interferes w/ other training. It's just a matter of taking it
> > into account.
>
> Most Cobol programmers understand that decomplexification is a good
> thing. C and C++ code tends to be over complex, not just because it
> is combined by nesting function calls into expresions and parameters,
> but also because of side effects (which Cobol avoids), short
> circuiting, precedence and association.
>
> C and C++ programmers must train themselves to follow rules and avoid
> traps that the language allows. For example in C assignment is an
> operator. This allows assignments within logical expressions.
> However, one has to take into account precedence and short circuiting
> when doing this otherwise results will not be as required. Cobol
> simply does not allow these problems to arise.
>
> C may be more complex but Cobol is more reliable. OO may solve some
> of C's problems but C++'s solution set does not map even close to
> Cobol's problem set.
>
> Cobol programmers don't 'get' OO because you fail to show them
> anything that is useful to their needs.

A broad generalization from one who has not tried to educate large numbers
of people. It adds up to bad Monday-morning quarterbacking, then publishing
bad advice. It's almost embarrassing to read someone say these things.

JXStern

Dec 28, 2002, 8:42:32 PM
On 28 Dec 2002 11:59:27 -0800, rip...@Azonic.co.nz (Richard) wrote:
>Exactly. Business applications are based around holding data in
>databases - there is little need for dynamic creation of memory
>objects when these are required to be permanent records on disk
>anyway.
>
>> Because COBOL programmers let the database handle multiplicity, even in
>> cases where this is much slower.

Since any OO program has to first read the data out of the database,
too, I don't see that this accusation of "slower" makes any sense.

J.

Peter E. C. Dashwood

Dec 28, 2002, 8:42:38 PM
Warren Zeigler <warren...@attbi.com> wrote in message
news:pIlP9.145441$qF3.10336@sccrnsc04...

> "Peter E. C. Dashwood" <dash...@enternet.co.nz> wrote in message
> news:b3638c46.02122...@posting.google.com...
>
> > If you are interested in this, I wrote an article on it last year.
> > Here's the link: http://66.166.131.10/Archives/V3/V30201.asp
>
> A few comments on your article:
>
> I started another thread on the topic of "what is an object." In your
> article, you talk about "instances of small, efficient, modules." This is an
> excellent definition, and very close to what I teach.
>
> In your history, you call Java Beans possibly the first component. Sun went
> to Borland for help in developing Java Beans because of the success of
> Delphi's VCL (Visual Component Library). Borland's components were the first
> to be written in the same language that they ran in. (Visual BASIC added
> this later - I think in 5). Delphi's internal name at Borland was VBK -
> Visual BASIC Killer. The VCL was an answer to MS's Visual Basic 16 bit
> components - written in C then C++, running in Visual BASIC.
>
Thanks for that Warren. The article is necessarily coloured by my own
experience which is very cursory when it comes to Delphi. I wasn't aware of
the Sun/Borland connection. I wasn't even sure if Delphi supported
components or OO... all I really know is that the first implementations of
it generated Pascal and the BDE is a constant headache on a Corporate
network <G>. Anyway, I am now better informed...


> > I recommend that they learn Java (because it is DESIGNED around OO and
> > it is useful in today's environment) as a means to grasp OO concepts.
>
> I now recommend Java, C# or Visual Basic.NET, depending on the programmer's
> prior experience and current development environment.
>

Again, I based this on my own experience. I have taught many people COBOL
(and various Assemblers) over the years but getting to grips with OO seems
daunting to most COBOL programmers (self included when I first started
nearly 3 years ago...). I struggled with the new syntax for OO COBOL for
some weeks (and I am not normally a slow learner). In the end I decided to
take a break and learn Java. (When I say "learn" I mean "Teach myself"... I
find this way I can learn at my own pace and it works best for me. The books
I used were Rogers Cadenhead's "Java 2 in 24 hours" and Sybex's 1000 page
"Java 2 Complete". Both absolutely excellent start points...). Within 10
days I had it. Wrote some Java applets and the odd Class, satisfied myself I
could handle it. Wrote a few beans...great stuff. Went back to OO COBOL and
sailed through it. The difference was that I now had the "hooks" to hang the
syntax on... without an underlying understanding of OO concepts like
inheritance, polymorphism, instantiation, it is just an overwhelming
experience.

The whole theme of this thread is about the difficulties faced by COBOL
people coming to OO. I believe the problem is that a complex and verbose
syntax must be dealt with at the same time as a bunch of new concepts.
Separate these two things and COBOL people can learn OO just as well as
anybody else.

Java has a compact syntax and so the concepts can be examined without
distraction.

Getting Wil Price's excellent book on "Elements of Object Oriented COBOL"
certainly helps...

Adding Web based skills like XML, JavaScript, etc. seems to have afforded me
all the tools I need at the moment. I don't anticipate learning C# at this
stage and I have picked up a "reading knowledge" of VB from the teams I
manage.

> > The main problems with OO COBOL are that it is too verbose, and it was
> > "bolted on" to COBOL (although they did an excellent job of this...).
> > It is a bewildering experience to try and pick up the syntax and the
> > OO concepts at the same time.
>
> Could you expand on this a little?

Have a look at the syntax of OO COBOL. It is daunting when it talks of
REPOSITORY, FACTORY METHODS, CLASSes, OBJECTs, if you have no idea what
these things mean. Furthermore, syntax that is familiar to a COBOL
programmer changes in the OO environment. Methods are INVOKED (rather than
CALLED), the concept of instantiating an Object and getting a pointer
(Object Reference) to it is foreign to most COBOL people who are not used to
dealing with low level things like address pointers... It is a strange and
confusing new world to many. Then there is a whole new design and
development methodology to go with it. It is small wonder that Fortress
COBOL has not exactly rushed to embrace it...

And there is still argument about this very syntax. A standard that should
have been produced in 1997 has finally been delivered some weeks ago. This
standard has taken 17 years to produce and it STILL hasn't finalised OO
COBOL... (Don't start me on this...it is a VERY sore point to me <G>). If it
wasn't for some farsighted vendors like Fujitsu and MicroFocus we still
wouldn't be able to write OO COBOL.

I think the points raised in this thread are all valid and I have been very
interested to read them.

I was referred to this NG by someone in comp.lang.cobol, and have been very
interested to see some of what is being posted here. I intend to "pop in"
here more in future <G>

Pete.


Peter E. C. Dashwood

Dec 28, 2002, 8:54:34 PM

Ron <No...@swbell.net> wrote in message
news:aui291$ij1$1...@ngspool-d02.news.aol.com...
I read this after I had replied to Warren Zeigler further up this thread. It
looks to me Ron like you have had exactly the same experience I did when I
first encountered OO. I endorse 100% what you have posted EXCEPT the first
bit about OO COBOL syntax being clear. The problem with learning OO concepts
through use of OO COBOL syntax is that it takes too long... You CAN do it
and Wil Price does a fantastic job of doing it in his book "Elements of
Object Oriented COBOL", but you don't really get the same clarity that you
do when you learn Java.

I guess it comes down to individuals. I completely agree that trying to
explain what an Object is by means of a Java code example, might as well be
Swahili to the average COBOL programmer.

It is pretty academic anyway, in my opinion. I believe COBOL will cease to
be used as a serious development language within the next 10 years and even
having OO available is too little too late to save it now. There would have
to be a tremendous upsurge in the use of OO COBOL to have a chance of saving
it. Such upsurge is nowhere in sight <G>.

Peter E. C. Dashwood

Dec 28, 2002, 9:11:39 PM
Richard <rip...@Azonic.co.nz> wrote in message
news:217e491a.02122...@posting.google.com...

All very valid points, Richard and well worth pointing out. (I had forgotten
some of these myself <G>).

However, it would be wrong to say that COBOL programmers don't have any
problems, and it would also be wrong to say that what stops them embracing
OO is a lack of understanding of low level problems that are addressed by
OO.

You don't build OO systems because they handle memory allocation better...
(well, you shouldn't...unless that is what is really dear to your heart
<G>).

I think also we should exclude Batch Processing from the discussion. There
is no doubt in my mind that this is best carried out by procedural
processing and it is tiresome to see OO simulating it by repetitive
instantiations.

I agree completely that COBOL people are capable of learning OO and I agree
that it depends on how it is presented to them. I can't agree that there is
no need for it because COBOL doesn't have the problems of other languages,
and the reason other languages use OO is to solve these problems (although I
don't disagree with this as a fair statement...<G>)

There are many reasons why you would want to implement an OO approach.
Re-use and extension of Classes, development of components which can be
plugged into different environments, these are all reasons I use OO.

Warren Zeigler

Dec 28, 2002, 11:41:06 PM
"Peter E. C. Dashwood" <dash...@nospam.enternet.co.nz> wrote in message
news:3e0e5...@Usenet.com...

> Adding Web based skills like XML, JavaScript, etc. seems to have afforded me
> all the tools I need at the moment. I don't anticipate learning C# at this
> stage and I have picked up a "reading knowledge" of VB from the teams I
> manage.

FYI: Java and C# are very close. The bigger difference is the Java
platform -vs- .NET .

> Have a look at the syntax of OO COBOL. It is daunting when it talks of
> REPOSITORY, FACTORY METHODS, CLASSes, OBJECTs, if you have no idea what
> these things mean. Furthermore, syntax that is familiar to a COBOL
> programmer changes in the OO environment. Methods are INVOKED (rather than
> CALLED), the concept of instantiating an Object and getting a pointer
> (Object Reference) to it is foreign to most COBOL people who are not used to
> dealing with low level things like address pointers... It is a strange and
> confusing new world to many. Then there is a whole new design and
> development methodology to go with it. It is small wonder that Fortress
> COBOL has not exactly rushed to embrace it...

Comment after comment brings us back to the concept of instantiation. This
is more of a key to understanding objects than most people realize. It is
the only "detail" concept I teach in my first lesson, which otherwise covers
the broader picture of the paradigm. I find that many students have problems
with it even with a few explanations, and if they do not get past this, they
would have problems with everything else. (Take a look at another thread I
started on the definition of an object. I feel that this concept is key and
must be included in any quality definition of an object.)

> I think the points raised in this thread are all valid and I have been very
> interested to read them.

I agree. For some, it is a sensitive topic. Some others have just insulted,
but those who have been through this and/or help others through the
transition know that there are problems, and they are not solved unless they
are discussed.

> I was referred to this NG by someone in comp.lang.cobol, and have been very
> interested to see some of what is being posted here. I intend to "pop in"
> here more in future <G>

I have not been to this newsgroup yet, but I probably should.

--
Warren Zeigler
wzei...@UnderstandingObjects.com
"Peter E. C. Dashwood" <dash...@nospam.enternet.co.nz> wrote in message
news:3e0e5...@Usenet.com...

Warren Zeigler

Dec 28, 2002, 11:52:00 PM
"JXStern" <JXSternC...@gte.net> wrote in message
news:dkks0v87sdg49r3cd...@4ax.com...

> Since any OO program has to first read the data out of the database,
> too, I don't see that this accusation of "slower" makes any sense.

Reading the data from the database once is fine. Having to do several
things with the data and winding up reading from the database several times
unnecessarily is not.

Since we are on the topic, as an Oracle trainer:
1) Reading objects from the database (even through object/relational views)
is faster than reading through joins, both on the database end and
communications. The communications are faster because the database can send
ALL objects of a composite at once, quickly loading into the client, instead
of having to deal w/ several index reads then dealing w/ the cursor and
associated network overhead. (This is straight from the Oracle 8i and 9i
server documentation.)
2) Reading into a collection of objects on the client is much cleaner and
better for the server due to locking and "repeatable read" issues.

In short, there are several reasons why, when working with conventional or
object/relational data, using object collections on the client is
easier, faster, and a lighter load on the server.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

Dec 28, 2002, 11:56:22 PM
"Peter E. C. Dashwood" <dash...@nospam.enternet.co.nz> wrote in message
news:3e0e5...@Usenet.com...
> You don't build OO systems because they handle memory allocation better...
> (well, you shouldn't...unless that is what is really dear to your heart
> <G>).

Memory allocation is the tool. The benefits are lifetime and multiplicity,
plus the creation of a building block for all other OO techniques. (Think of
sales discussions: features -vs- benefits. We implement features. We use
benefits.)

> I think also we should exclude Batch Processing from the discussion. There
> is no doubt in my mind that this is best carried out by procedural
> processing and it is tiresome to see OO simulating it by repetitive
> instantiations.

? - Every time you run a procedural program you are instantiating it and one
set of its variables.

> I agree completely that COBOL people are capable of learning OO and I agree
> that it depends on how it is presented to them.

Exactly.

Warren Zeigler
wzei...@UnderstandingObjects.com


Uncle Bob (Robert C. Martin)

Dec 29, 2002, 12:19:57 PM
"Warren Zeigler" <warren...@attbi.com> might (or might not) have
written this on (or about) Tue, 24 Dec 2002 04:29:02 GMT, :

>"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
>message
>> This is exactly the opposite of my experience. (Which just tells me
>> that there's no one true path.) In my experience code crystallizes
>> understanding. We can wave our hands about object and relationships
>> and abstraction, and we can make it all sound warm and fuzzy. (I'm
>> not saying that this is what *you* do). However, until you show some
>> code, nobody really understands anything. Once you show some code,
>> everybody goes: "Oh! That's what you meant!"


>
>I would say that working w/ code SEEMS to crystallize understanding. The
>student can quickly do exercises, but they LOSE:
>1) The connection to the real world, which hampers analysis and design

I'd rather they *not* get the connection to the real world, since I
consider that to be a heinous crime perpetrated upon OO.

>2) The ability to take the concept to other languages

Actually, once you understand it in one language, moving it to another
is trivial.

>3) The ability to see variations on a theme, since they see a narrow
>implementation of a broad concept, instead of the broad concept

I quite disagree. Once you see something in the concrete, the
abstraction becomes possible.

>4) From #4, the ability to easily use it in a creative situation, since
>learning from code is more cookbook learning, instead of learning the "why"
>and letting the student work from there to many possible implementations.

Again, I disagree. But this disagreement is particularly telling. The
*why* of OO is all about the code, and not about anything else. In my
view, OO is not about the real world, not about building huge semantic
models, not about far flung architectures; rather it is about managing
the dependencies between modules at the code level.

That is how OO started, and that is all OO has ever really become.
All the rest is fluff IMHO, and a huge waste of time and effort.

Robert C. Martin | "Uncle Bob"
Object Mentor Inc.| unclebob @ objectmentor . com
PO Box 5757 | Tel: (800) 338-6716
565 Lakeview Pkwy | Fax: (847) 573-1658 | www.objectmentor.com
Suite 135 | | www.XProgramming.com
Vernon Hills, IL, | Training and Mentoring | www.junit.org
60061 | OO, XP, Java, C++, Python |

"...a member of my local hiking club is a
nudist. I once asked him if he ever hiked in the nude. He responded
that he was a nudist, he wasn't nuts."
-- Daniel Parker

Richard

Dec 29, 2002, 2:16:10 PM
"Peter E. C. Dashwood" <dash...@nospam.enternet.co.nz> wrote

> However, it would be wrong to say that COBOL programmers don't have any
> problems,

Cobol certainly does have problems, but these are not in any way
addressed by the features that C++ adds to C, or by Java. The complaint
that "Cobol programmers don't get it" will remain until OO trainers
learn what Cobol applications are about and present solutions that
relate to business orientated problems.

> You don't build OO systems because they handle memory allocation better...

Actually, that is a significant part of the trend from C to C++, Java,
and C#, and a major differentiation between these.

> I can't agree that there is
> no need for it because COBOL doesn't have the problems of other languages,
> and the reason other languages use OO is to solve these problems (although I
> don't disagree with this as a fair statement...<G>)

Cobol doesn't have the _same_ problems as other languages (especially
C and Pascal), thus it needs different solutions. There is some
commonality of problem/solution where interaction with external
systems is involved.

Either the OO trainer needs to understand Cobol usage in order to
present meaningful solution, or he needs to teach the problem set
presented by C so that the OO solutions can be seen as useful.

It seems that the latter has been discovered, but the former is
dismissed.

Richard

Dec 29, 2002, 2:35:29 PM
"Warren Zeigler" <warren...@attbi.com> wrote

> SOME things need to be on disk. That doesn't mean that EVERY time a
> programmer needs to handle multiplicity that placing it on disk is best.
> That is a VERY unreasonable restriction.

That was not said. Cobol also has a very flexible table handling
system that has done quite well as dealing with internal multiplicity
for many decades. This includes such things as the SEARCH verb that
avoids the need to know grubby internal details.



> Sometimes, not all of the time.

In cases where permanent shared storage is not required, tables can be
used. Given that Business Applications do not normally include such
things as simulation then tables mostly provide a sufficient tool.



> And other applications are purged daily. Don't take the extreme and try to
> make a rule out of it, then use it to deny flexibility!

Other _Business_ Applications ?



> A broad generalization from one who has not tried to educate large numbers
> of people.

I will erase that from my CV immediately because you _must_ be right.

> It adds up to bad Monday morning quarterbacking, then publishing
> bad advice. It's almost embarrasing to read someone say these things.

It must be very hard to switch from teaching into learning mode when
you already have all the answers.

Richard

Dec 29, 2002, 2:48:25 PM
"Warren Zeigler" <warren...@attbi.com> wrote

> Memory allocation is the tool.

Memory allocation is _a_ tool. One that requires a large amount of
programmer management to get it right in C, and is often poorly done
in C programs.

In Cobol it is not required, where it is done (eg CALL/CANCEL) it is
fully automatic and of no concern to the programmers.

> The benefits are lifetime and multiplicity,

Which can be had by other tools.

> plus the creation of a building block for all other OO techniques.

Exactly. If you fail to show Cobol programmers why they need to care
about this at all then you have not established a foundation of why OO
exists.

> (Think of
> sales discussions: features -vs- benefits. We implement features. We use
> benefits.)

and the benefit of a refrigerator to an Eskimo is ?

> ? - Every time you run a procedural program you are instantiating it and one
> set of its variables.

Yes, but only once for each run and are using that instance for all
transactions in that run.

> > I agree completely that COBOL people are capable of learning OO and I
> agree
> > that it depends on how it is presented to them.
>
> Exactly.

Sooooo, with your:

"While teaching Understanding Objects, I have seen this problem. More
than
3/4 of the students that were coming from COBOL had serious problems
in
class."

you agree that your presentation is at fault ?

Richard

Dec 29, 2002, 3:04:38 PM
"Warren Zeigler" <warren...@attbi.com> wrote

> In short, there are several reasons why, when working with conventional or
> object/relational data, that using object collections on the client is
> easier, faster, and a lighter load on the server.

You are not used to multiuser systems are you. If a collection is
read into a client for potentially updating this then either they must
be locked so that other clients may not update them in the meantime or
the items must be re-read and locked individually before updating
(with consequent issues if the data has changed).

Also, of course, the collection will only be successful if no other
client has any of this locked at the time of reading.

While it is certainly true that for a single user it is best to read a
collection for the reasons you state, as soon as there are multiple
users then a finer granularity usually results in fewer interlocks and
better overall system response.

In any case Cobol has had its own built-in database system (based on
ISAM files) since the 60s and the programmers have developed
mechanisms that they know work best.

Richard

Dec 29, 2002, 3:25:19 PM
"Warren Zeigler" <warren...@attbi.com> wrote

> > data items
> > need to be stored in a consistent manner in case recovery is needed.
>
> Sometimes, not all of the time.

Do you think that it only needs to be consistent 'sometimes' ?

Warren Zeigler

Dec 29, 2002, 6:50:28 PM
"Richard" <rip...@Azonic.co.nz> wrote in message
news:217e491a.02122...@posting.google.com...
> You are not used to multiuser systems are you. If a collection is
> read into a client for potentially updating this then either they must
> be locked so that other clients may not update them in the meantime or
> the items must be re-read and locked individually before updating
> (with consequent issues if the data has changed).


I AM used to multi-user systems. There are MANY times where you need a
consistent read rather than a lock that holds records until you write something new.

My purpose here is not to convert the world from COBOL. You are evidently
not in a shop that is forcing you to move for one reason or another. I do
not know enough about your environment to know why this is a concern to you.
Let's not get into a fight. Some people are staying in COBOL, but moving to
OO COBOL.
--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

Dec 29, 2002, 6:57:57 PM
"Richard" <rip...@Azonic.co.nz> wrote in message
news:217e491a.02122...@posting.google.com...

> Memory allocation is _a_ tool. One that requires a large amount of
> programmer management to get it right in C, and is often poorly done
> in C programs.

That's why C++ was invented instead of using C for OO programming. (It is
possible. The first few versions of C++ generated C, which was then
compiled.)

> In Cobol it is not required, where it is done (eg CALL/CANCEL) it is
> fully automatic and of no concern to the programmers.

No. COBOL, like all other procedural languages, allocates memory for you for
all variables. The lifetime in most of these languages is the same as the
runtime, and the multiplicity is restricted to 1. OO just makes that more
flexible, and the details of the memory management are hidden behind NEW and
either DELETE or garbage collection.


> > The benifits are lifetime and multiplicity,
> Which can be had by other tools.

Yes, but not as easily, and w/ objects you then can build with other
capabilities. Raw objects are floor 1 of capability of a large building,
which you have to take into account if you are debating OO versus anything
else.

> > > I agree completely that COBOL people are capable of learning OO and I agree
> > > that it depends on how it is presented to them.
> >
> > Exactly.
>
> Sooooo, with your:
>
> "While teaching Understanding Objects, I have seen this problem. More
> than
> 3/4 of the students that were coming from COBOL had serious problems
> in
> class."
>
> you agree that your presentation is at fault ?

No. In my class I overcame the problems by compensating for them. THAT is
the point of this whole thread - how to overcome a specific problem:
transitioning people who are legacy COBOL programmers w/o other experience
to objects, motivations aside.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

Dec 29, 2002, 7:01:30 PM
"Richard" <rip...@Azonic.co.nz> wrote in message
news:217e491a.02122...@posting.google.com...
> "Peter E. C. Dashwood" <dash...@nospam.enternet.co.nz> wrote
>
> > However, it would be wrong to say that COBOL programmers don't have any
> > problems,
>
> Cobol certainly does have problems, but these are not in any way
> adressed by the features that C++ adds to C, or by Java. The complaint
> that "Cobol programmers don't get it" will remain until OO trainers
> learn what Cobol applications are about and present solutions that
> relate to business orientated problems.

That is not what a trainer is paid for. If you or your boss hires a trainer,
you had better be well past that point. You are mixing training problems
with the decision to move. One issue at a time, please!

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Warren Zeigler

Dec 29, 2002, 7:15:44 PM
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
message > Again, I disagree. But this disagreement is particularly telling.

The
> *why* of OO is all about the code, and not about anything else. In my
> view, OO is not about the real world, not about building huge semantic
> models, not about far flung architectures; rather it is about managing
> the dependencies between modules at the code level.

> That is how OO started, and that is all OO has ever really become.
> All the rest is fluff IMHO, and a huge waste of time and effort.

No, Robert.

C++ and most of OO started with simulating the real world. The very concept
of polymorphism comes from classifications of the real world, along with
such terms as object, class and inheritance.

The etymology of the very concepts is from Simula and other ways to code
real-world problems.

Robert - you once wrote a very extensive thread about how you can "see"
things, and how you then put them together. A few people can do this, and
they rarely agree. To the other 99.9% of us, this is like 1+1=2, thus E=mc^2.

EVEN IF this were not true, as it has been said for many years, every
computer program simulates the real world in one way or another. ALSO,
programs are written to solve real-world problems. Whether you like it or
not, the code has something to do with real-world requirements. To deny this
is to deny a commonality in all code that many people use to get from one to
the other.

As far as OO goes, as stated above, the concepts come from the real world,
and the programmers live in the real world. We think by association, and
denying this tie is to deny the best teaching tool possible: the prior
experience of the student.

This is one of the reasons why computers are difficult for many people: They
are often taught as something new, not similar to anything that the student
knows. This makes the student like a blind person, having to start all over
in learning the smallest detail. This is terrible education.

Here is one of the keys to why the problems in learning objects have not
been solved: Most of the people doing the training went through years of
experience that the student does not have. The teachers just teach the
advanced concepts, and the student is left to "rediscover the OO laws of nature."

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Peter E. C. Dashwood

Dec 29, 2002, 9:43:23 PM

Warren Zeigler <warren...@attbi.com> wrote in message
news:8TLP9.494736$WL3.128345@rwcrnsc54...

> "Richard" <rip...@Azonic.co.nz> wrote in message
> news:217e491a.02122...@posting.google.com...
>
> My purpose here is not to convert the world from COBOL. You are evidently
> not in a shop that is forcing you to move for one reason or another. I do
> not know enough about your environment to know why this is a concern to you.
> Let's not get into a fight.

Awwwww...fight is good! A passionate exchange of ideas between a couple of
well matched adversaries HAS to be entertaining... Could it be that you felt
you weren't likely to win...<G>?

I thought Richard's points were excellent. He was definitely ahead on points
when you called it off...<G>.

> Some people are staying in COBOL, but moving to
> OO COBOL.

Sadly for COBOL, Warren, it is too few, too late.

I started writing COBOL in 1967 and made a good living with it for about 25
years. Travelled all over the World, met some fascinating characters and saw
some amazing places. It makes me very sad to see this language dying.
Nowadays, I write it for fun (and some small profit) and I expect to do so
as long as I live. But my living is now made in consultancy and Management.

But we have to accept reality. The Market Place has decided that procedural
languages are no longer relevant. A move to OO would have extended the life
of COBOL and at least assured it of being used for writing Business
Components (I still believe it is the best tool for the job when the job is
commercial data processing). Sadly, it just hasn't happened.

You are right. Some people ARE moving to OO COBOL (self included) but I
don't think the ones who are not are wrong...it is now too late to make any
difference. Richard has provided some very good reasons for not using OO
COBOL and, although he hasn't persuaded me to give it up <G>, I can see that
many people will stay with what they are comfortable with.

Warren Zeigler

Dec 29, 2002, 10:39:42 PM
"Peter E. C. Dashwood" <dash...@nospam.enternet.co.nz> wrote in message
news:3e0fb...@Usenet.com...

>
> Warren Zeigler <warren...@attbi.com> wrote in message
> news:8TLP9.494736$WL3.128345@rwcrnsc54...
> > "Richard" <rip...@Azonic.co.nz> wrote in message
> > news:217e491a.02122...@posting.google.com...
> >
> > My purpose here is not to convert the world from COBOL. You are evidently
> > not in a shop that is forcing you to move for one reason or another. I do
> > not know enough about your environment to know why this is a concern to you.
> > Let's not get into a fight.
>
> Awwwww...fight is good! A passionate exchange of ideas between a couple of
> well matched adversaries HAS to be entertaining... Could it be that you felt
> you weren't likely to win...<G>?
>
> I thought Richard's points were excellent. He was definitely ahead on points
> when you called it off...<G>.

I have a whole presentation on the advantages of objects. It starts w/
complexity management, components, runtime flexibility and the like.

I have also learned that some people get really proficient in a language, and
arguing points with them is not only futile, but sometimes wrong, when a
person has been able to be creative in a particular language for many years.

That aside, I am trying to make a point about education here, and get
opinions. If he wants to argue why, he should move to a different thread.
Personally, I am not an evangelist; I just enable people when they
themselves decide to make the change.

--
Warren Zeigler
wzei...@UnderstandingObjects.com


Topmind

unread,
Dec 30, 2002, 3:40:12 AM12/30/02
to

>
> I have a whole presentation on the advantages of objects. It starts w/
> complexity management, components, runtime flexibility and the like.

If it is like the stuff on your website, then it is pumping
bullsh8t into those poor COBOLer's.

Complexity management, my 8ss!

-T-
oop.ismad.com

S Perryman

unread,
Dec 30, 2002, 7:52:09 AM12/30/02
to
"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
message news:2dbu0vk0vfv4dbdtg...@4ax.com...

> "Warren Zeigler" <warren...@attbi.com> might (or might not) have
> written this on (or about) Tue, 24 Dec 2002 04:29:02 GMT, :

>>4) From #4, the ability to easily use it in a creative situation, since
>>learning from code is more cookbook learning, instead of learning the
>>"why" and letting the student work from there to many possible
>>implementations.

> Again, I disagree. But this disagreement is particularly telling. The
> *why* of OO is all about the code, and not about anything else. In my
> view, OO is not about the real world, not about building huge semantic
> models, not about far flung architectures; rather it is about managing
> the dependencies between modules at the code level.

> That is how OO started, and that is all OO has ever really become.
> All the rest is fluff IMHO, and a huge waste of time and effort.

The big lie of OO, often touted by Robert Martin on comp.object.

Unfortunately for him, it is one that is not backed by anything the creators
of OO, the Simula group at the NCC (Nygaard, Dahl etc.), have ever
said or written. That in itself speaks volumes IMHO.


Regards,
Steven Perryman

JXStern

unread,
Dec 30, 2002, 11:12:37 AM12/30/02
to
On Sun, 29 Dec 2002 04:52:00 GMT, "Warren Zeigler"
<warren...@attbi.com> wrote:
>> Since any OO program has to first read the data out of the database,
>> too, I don't see that this accusation of "slower" makes any sense.
>
>Reading the data from the database once is fine. Having to do several
>things with the data and winding up reading from the database several
>times unnecessarily is not.

Does not follow.

Databases cache recently used pages in memory, too.

You can gain performance by giving up various ACID properties, but
generally you end up deciding that's not acceptable after all.

There are some in-memory databases that will load up EVERYTHING into
RAM and can give better performance than a database which can't be
certain that a page is cached, mostly due to shorter critical sections
deep in the code, and using more hash tables and fewer b-trees, if I
understand correctly what they're doing.

That said, I suppose in many cases one can, at immense cost, build
some middleware (presumably object-oriented) that optimizes total
system architecture for the most common cases in a particular
application, and thus improve performance over even a good database
implementation. But generally you can get 80% of that ultimate
performance with 20% of the effort by just letting the database do
what it does.

>Since we are on the topic, as an Oracle trainer:
>1) Reading objects from the database (even through object/relational views)
>is faster than reading through joins, both on the database end and
>communications. The communications are faster because the database can send
>ALL objects of a composite at once, quickly loading into the client, instead
>of having to deal w/ several index reads then dealing w/ the cursor and
>associated network overhead. (This is straight from the Oracle 8i and 9i
>server documentation.)

This is mostly half-truths and gibberish, and I don't care who prints
it where.

Dumping N rows is better done in 1 call than M calls, since any
transaction is going to have some overhead. Duh.

But middleware that sucks up entire tables is probably moving N times
more data than any/all clients are going to access during one system
session, say, twenty-four hours. That generally swamps any advantage
you thought you were getting as above.

Misreading isolated technical facts like this is so common ... so sad.
What is stated as a nice rule of thumb for two-tier design has almost
zero bearing on three-tier architectural designs.

>2) Reading into a collection of objects on the client is much cleaner and
>better for the server due to locking and "repeatable read" issues.

Total gibberish, mostly already addressed above. It's "cleaner" only
if your architecture does not call on ACID properties, which it
probably would, if you understood them.

>In short, there are several reasons why, when working with conventional or
>object/relational data, that using object collections on the client is
>easier, faster, and a lighter load on the server.

Well, of course it's a lighter load on the server, if you're
duplicating its functionality elsewhere.

J.

H. S. Lahman

unread,
Dec 30, 2002, 11:16:31 AM12/30/02
to
Responding to Martin...

>>1) The connection to the real world, which hampers analysis and design
>
>
> I'd rather they *not* get the connection to the real world, since I
> consider that to be a heinous crime perpetrated upon OO.

The "real world" in the OO context is simply the problem space. For OOA
that is the customer's space. For OOP it is the computing space. For
OOD it is a bit of both. Note that if one is doing something like a DB
engine, the customer space is the computing space. In addition, larger
applications typically have several problem spaces.

However, the developer must be familiar with whatever problem space is
relevant to the software being developed. That is hardly unique to OO
development.

>>3) The ability to see variations on a theme, since they see a narrow
>>implementation of a broad concept, instead of the broad concept
>
>
> I quite disagree. Once you see something in the concrete, the
> abstraction becomes possible.

This forum is testimony to Zeigler's point. How many times has someone
come on this forum with a problem described in OOPL code when the real
problem was that the solution approach was not correct for the problem
in hand? Once one writes code one has already committed to an OOA/D
solution to the problem. Zeigler's point is that at the code level one
becomes myopic about the basic solution.

>
>
>>4) From #4, the ability to easily use it in a creative situation, since
>>learning from code is more cookbook learning, instead of learning the "why"
>>and letting the student work from there to many possible implementations.
>
>
> Again, I disagree. But this disagreement is particularly telling. The
> *why* of OO is all about the code, and not about anything else. In my
> view, OO is not about the real world, not about building huge semantic
> models, not about far flung architectures; rather it is about managing
> the dependencies between modules at the code level.
>
> That is how OO started, and that is all OO has ever really become.
> All the rest is fluff IMHO, and a huge waste of time and effort.

You are correct that the OOPLs came first, then the notations (e.g.,
Booch's Graphical C++), then the formal methodologies. That was
unfortunate because the tail is wagging the dog. It is a problem
because the OOPLs have already made compromises with the computing
environment so they are a poor place to try to learn about good OO
development.

OTOH, as others have pointed out, you are incorrect about the roots of
OO; it was, indeed, driven by a need to express simulation models.
However, the value of things like abstraction and encapsulation for
addressing well known maintainability problems of procedural development
were pretty obvious so reuse and maintainability became the focus quickly.

Perhaps more to the point, much of the dependency management that you do
at the OOP level is because the OOPLs do a good job of logical
decoupling but do not handle physical coupling very well. John Lakos
was one of the first to identify physical coupling as a problem with his
"Large Scale C++ Software Design" ('96), followed closely by Fowler's
"Refactoring" ('99). Others, including yourself, had nascent papers on
individual dependency patterns dating from the early '80s.

But it would be very hard to trace the recognition of physical coupling
as an issue to the roots of OT because the OOPL designers were clearly
not thinking about it. The fact that it took a couple of decades to
recognize the problem and suggest solutions is strongly suggestive that
the OO Founding Fathers were thinking about something else at the time.


*************
There is nothing wrong with me that could
not be cured by a capful of Drano.

H. S. Lahman
h...@pathfindersol.com
Pathfinder Solutions -- We Make UML Work
http://www.pathfindersol.com
(888)-OOA-PATH


Warren Zeigler

unread,
Dec 30, 2002, 11:25:09 AM12/30/02
to
"JXStern" <JXSternC...@gte.net> wrote in message
news:v7r01vg56pc61gt3k...@4ax.com...

> Databases cache recently used pages in memory, too.

But they are across the network in a client/server world, creating
considerable network traffic and slowing down processing.

You are discounting Oracle and independent studies to prove your point. If
we cannot deal with facts, let's just drop it before the discussion
degenerates.
--
Warren Zeigler
wzei...@UnderstandingObjects.com


JXStern

unread,
Dec 30, 2002, 2:25:19 PM12/30/02
to
On Mon, 30 Dec 2002 16:25:09 GMT, "Warren Zeigler"
<warren...@attbi.com> wrote:
>You are discounting Oracle and independent studies to prove your point.

I am more carefully determining what statements mean in relation to
the problem at hand, rather than citing them as evidence when they
actually are not.

J.

Uncle Bob (Robert C. Martin)

unread,
Dec 30, 2002, 8:07:47 PM12/30/02
to
"Warren Zeigler" <warren...@attbi.com> might (or might not) have
written this on (or about) Mon, 30 Dec 2002 00:15:44 GMT, :

>"Uncle Bob (Robert C. Martin)" <u.n.c.l...@objectmentor.com> wrote in
>message > Again, I disagree. But this disagreement is particularly telling.
>The
>> *why* of OO is all about the code, and not about anything else. In my
>> view, OO is not about the real world, not about building huge semantic
>> models, not about far flung architectures; rather it is about managing
>> the dependencies between modules at the code level.
>
>> That is how OO started, and that is all OO has ever really become.
>> All the rest is fluff IMHO, and a huge waste of time and effort.
>
>No, Robert.
>
>C++ and most of OO started with simulating the real world. The very concept
>of polymorphism comes from classifications of the real world, along with
>such terms as object, class and inheritance.

There is a big difference between writing simulators and the kind of
"real world modeling" that we find associated with OO nowadays.
Simulators are complex applications that benefit very strongly from
the kind of dependency management that OO affords. They have nothing
to do with the kind of NOUN/VERB modeling that dominates the "real
world" school of thought.

It is true that the originators of OO did so while working on
simulators. Indeed, they were working on a language that would help
simulation (SIMULA67). However, the breakthrough insight that they
made had nothing to do with simulation. They were investigating ALGOL
blocks, and that noted that a block was really just a data structure
on the stack, that had variables in it that the subfunctions within
the block had access to. They pondered what might happen if those
data structures were allocated on the heap rather than on the stack,
and therefore had a lifetime that was decoupled from the creating
function. You can read about this in "Structured Programming" by
Dijkstra, Dahl, and Hoare, Academic Press, 1972.

This insight led almost immediately to the notion of polymorphic
interfaces. This insight would have happened regardless of whether
the initial researchers were investigating simulation or finance.
Simulation was not the stimulus. Creative thinking about blocks was.

>EVEN IF this were not true, as it has been said for many years, every
>computer program simulates the real world in one way or another.

If you accept this definition of "modeling the real world" then the
association with OO is broken, since all computer programs are not OO.

>ALSO,
>programs are written to solve real-world problems. Whether you like it or
>not, the code has something to do with real-world requirements.

This is certainly true. It is the association with OO that eludes
me.

>Here is one of the keys to why the problems in learning objects have not
>been solved: Most of the people training went through years of experience
>that the student does not have. The teachers just teach the advanced
>concepts, and the student is left to "rediscover the OO laws of nature."

My experience is that existing programmers learn OO best and quickest
by being taught by experienced OO programmers. They learn it best by
seeing it in the code.

Uncle Bob (Robert C. Martin)

unread,
Dec 30, 2002, 8:09:20 PM12/30/02
to
"H. S. Lahman" <vze2...@verizon.net> might (or might not) have
written this on (or about) Mon, 30 Dec 2002 16:16:31 GMT, :

>Zeigler's point is that at the code level one
>becomes myopic about the basic solution.

I disagree. I have seen many people get very myopic about their UML
diagrams. The problem is that people get vested in their own ideas.
A good designer learns to avoid this, regardless of whether he is
writing UML or code.
