
OOP/OOD Philosophy


xmp...@yahoo.com

Jun 29, 2005, 11:24:50 AM
Hi,


I understand the mechanics of objects; I use objects, polymorphism,
even patterns. I understand some of OOD, but I'm trying to grasp the
bigger picture, how to "think in OOD". I hope an example will help:

Let's say I want a program that given A produces B. I have 2 ways of
analyzing this. I can either formally specify how B relates to A, or I
can use the OOD approach in which I identify nouns, relationships,
arity, patterns, etc...

The first approach is much more direct and is very reliable, so why
would I use the OOD approach?

Or am I comparing apples to oranges? Is OOD for designing
applications, or is it for designing an application pattern (framework?
architecture?) that is then customized to provide the desired
application (the justification being that such a design is much more
extensible)? Do I need to stop thinking "program" and instead think
"architecture" as the primary purpose and then customize it to provide
the "program" as the secondary purpose?

This is why I need to understand the philosophy. I want to understand
how to "think OOD". I don't care about specific design techniques
unless they help illustrate this shift in thinking. What can you tell
me about this? What references (online and printed are fine) can you
point me to? I'd love something that contrasts the two methodologies
and provides examples to drive it home. Something that explains and
justifies OOD from a more philosophical perspective.

Thanks in advance.

Antonio Santiago

Jun 29, 2005, 12:30:25 PM
xmp...@yahoo.com wrote:
> Hi,
>
Hi, I am interested in this topic too and, although I may not be the
best person to answer you, here is my modest opinion.

>
> I understand the mechanics of objects; I use objects, polymorphism,
> even patterns. I understand some of OOD, but I'm trying to grasp the
> bigger picture, how to "think in OOD". I hope an example will help:
>
> Let's say I want a program that given A produces B. I have 2 ways of
> analyzing this. I can either formally specify how B relates to A, or I
> can use the OOD approach in which I identify nouns, relationships,
> arity, patterns, etc...
>

Here you are saying that you know how to program in an OO language, but
you are not sure you understand the main purpose of OO design.

> The first approach is much more direct and is very reliable, so why
> would I use the OOD approach?
>

OOD helps you organize and represent information. Everyone makes a
mental design to solve a problem and then programs the solution. In OO
terms, one first designs an OO solution and then programs it in an OO
language (or not :) ).

> Or am I comparing apples to oranges? Is OOD for designing
> applications, or is it for designing an application pattern (framework?
> architecture?)

A very rough description could be:
The architecture is the big picture of the problem: how to organize the
entire problem, application, anything.
Then you need to design how each part of that architecture will work.
Here, patterns (as Wikipedia says) are "standard solutions to common
problems in software design".

> that is then customized to provide the desired
> application (the justification being that such a design is much more
> extensible)? Do I need to stop thinking "program" and instead think
> "architecture" as the primary purpose and then customize it to provide
> the "program" as the secondary purpose?
>

Design is more abstract than programming; you can design things that are
impossible to translate directly to code (then you need to
"normalize" things before programming them).

> This is why I need to understand the philosophy. I want to understand
> how to "think OOD". I don't care about specific design techniques
> unless they help illustrate this shift in thinking. What can you tell
> me about this? What references (online and printed are fine) can you
> point me to? I'd love something that contrasts the two methodologies
> and provides examples to drive it home. Something that explains and
> justifies OOD from a more philosophical perspective.
>

I don't know if this is the best example, but:

At work I am developing an application that follows a "pipe and filter"
architecture. I didn't know it followed that architecture until some
kind person on this newsgroup told me. I was reinventing the wheel :(

Also, I was designing the application (more concretely, the data model)
in an OO way. When I had a reasonably good design
(using some design patterns), I went on to normalize my model and adapt it
to the concrete OO programming language. After a couple of changes to my
design (and new normalizations) it was a great success.

I think it would have been impossible to do if I hadn't stopped to think
about OOD first.

> Thanks in advance.
>

I hope it will be useful for you.

--
-----------------------------------------------------
Antonio Santiago Pérez
( email: santiago<<at>>grahi.upc.edu )
( www: http://www.grahi.upc.edu/santiago )
( www: http://asantiago.blogsite.org )
-----------------------------------------------------
GRAHI - Grup de Recerca Aplicada en Hidrometeorologia
Universitat Politècnica de Catalunya
-----------------------------------------------------

Robert C. Martin

Jun 29, 2005, 5:51:29 PM
On 29 Jun 2005 08:24:50 -0700, xmp...@yahoo.com wrote:

>Hi,
>
>
>I understand the mechanics of objects; I use objects, polymorphism,
>even patterns. I understand some of OOD, but I'm trying to grasp the
>bigger picture, how to "think in OOD". I hope an example will help:

There has been an enormous amount of debate and discussion about this
topic. Some folks believe that OO is an extremely high level
technique used to make models of the world. Others believe that OO is
a mechanism for structuring source code. I fall into the latter camp.

To me, the whole notion that OO is a philosophy is silly. OO is a set
of tools that help programmers structure their code better. In
particular, it helps them invert key dependencies.

In my view the OO-ness of a system can be identified by tracing the
dependencies between the modules. If the high level policy modules
depend on (directly call) lower level modules, which then call even
lower level modules, then the program is procedural and not object
oriented.

However, if the high level policy modules are independent, or depend
solely on polymorphic interfaces that the lower level modules
implement, then the system is OO.

In other words, it all has to do with the *direction* of the
dependencies.
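Robert's dependency-direction test can be sketched in a few lines of Python. This is only an illustration of the idea, not code from the article he links, and the names (MessageSink, ReportGenerator, ConsoleSink) are invented: the high-level policy owns an abstract interface, and the low-level detail implements it, so the source-code dependency points *up* toward the policy.

```python
from abc import ABC, abstractmethod

# The high-level policy depends only on this abstraction,
# never on a concrete low-level module.
class MessageSink(ABC):
    @abstractmethod
    def write(self, text: str) -> None: ...

class ReportGenerator:
    """High-level policy module."""
    def __init__(self, sink: MessageSink) -> None:
        self.sink = sink          # knows only the interface

    def run(self, items: list) -> None:
        for item in items:
            self.sink.write(f"item: {item}")

# The low-level detail implements the interface the policy owns;
# the dependency arrow now points from detail to policy.
class ConsoleSink(MessageSink):
    def __init__(self) -> None:
        self.lines: list[str] = []
    def write(self, text: str) -> None:
        self.lines.append(text)

sink = ConsoleSink()
ReportGenerator(sink).run([1, 2])
```

Had `ReportGenerator` called a concrete `ConsoleSink` directly, the dependency would run downward and the structure would be procedural by this test.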

To learn more about this see the following article:

http://www.objectmentor.com/resources/articles/Principles_and_Patterns.PDF


-----
Robert C. Martin (Uncle Bob) | email: uncl...@objectmentor.com
Object Mentor Inc. | blog: www.butunclebob.com
The Agile Transition Experts | web: www.objectmentor.com
800-338-6716


"The aim of science is not to open the door to infinite wisdom,
but to set a limit to infinite error."
-- Bertolt Brecht, Life of Galileo

Andrew McDonagh

Jun 29, 2005, 1:46:53 PM
to xmp...@yahoo.com

This is a large area of discussion; it might be best to take a look at a
few books or web resources and then ask specific questions.

The book 'Object Thinking' by David West, although lengthy, is very good
at showing the differences in 'thinking' about problems from a
procedural and an OO point of view, and then goes on to cover the
history, philosophy and politics of OO.

http://www.microsoft.com/MSPress/books/6820.asp

and googling 'thinking in objects' brings up lots of refs/blogs/etc.

http://www.google.co.uk/search?biw=1280&hl=en&q=thinking+in+objects&btnG=Google+Search&meta=

HTH

Andrew

plan...@gmail.com

Jun 29, 2005, 2:24:13 PM
Antonio Santiago wrote:
> Hi, I am interested on this topic too and, although I think I am not the
> best person to answer you, here is modest opinion.

That's fine with me :). Thanks for responding.


> Here you are saying you know how to programming in an OO language, but
> you are not sure to understand the main purpose of OO design.

Kind of. I'm trying to get an idea of what OOD is supposed to give me
as opposed to non-OOD. But confusing this is the possibility that I
may be looking at OOD to design the wrong thing.


> OOD helps you to organize and represents the information. All people
> makes a mental design to resolve a problem and then programs the
> solution. In OO terms one first designs an OO solution and then programs
> it in an OO language (or not :) ).

But non-OOD is also design. Non-OOD focuses more on deriving the
algorithms, while OOD focuses more on the data, although the two
intersect. One can derive data from the algorithms/relations and vice
versa. So, given two different ways of doing something, and a former
way that is clearer and more direct, why use the latter? If it's an
issue of flexibility/re-usability, what's the thinking behind that?


> A very, very bad description could be:
> The architecture is the big picture of the problem. How organize the
> entery problem, application, anything.
> Then you need to design how every architecture will be. Here the
> patterns (like wikipedia says) are "standard solutions to common
> problems in software design".

Here's an example. Let's say we have a cash register program with the
options to add, subtract, print a receipt or clear the total.

Non-OO analysis:
=========
Op = current operation invoked
Price = Price Entered
X' = value change for X for next state
Total = running total
==========
Then, the state of the system (assume it waits for input) is:
==========
(Op = Start ^ Op' = Clear) v
(Op = Add ^ Total' = Total + Price) v
(Op = Sub ^ Total' = Total - Price) v
(Op = Clear ^ Total' = 0) v
(Op = Print ^ Printed(Total))
==========

This is an informal predicate calculus model that shows the state of
the system at any time, given various inputs. The ' represents
mutability in a more mathematical way. This analysis gets to the point
and is easy to code and automatically test. In addition, a procedural
or "OOP" program could be written from this. For instance, the state
can be an object that will invoke the proper functionality from a
uniform interface, or a series of procedure calls.
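For instance, one direct translation of the predicate model above into code might look like this. It is only a sketch: the `Register` name and the if/elif dispatch style are my own choices, not part of the analysis.

```python
# Each arm of step() mirrors one disjunct of the predicate model.
class Register:
    def __init__(self):
        self.total = 0          # Total
        self.printed = None     # records Printed(Total)

    def step(self, op, price=0):
        if op == "Add":
            self.total += price        # Total' = Total + Price
        elif op == "Sub":
            self.total -= price        # Total' = Total - Price
        elif op == "Clear":
            self.total = 0             # Total' = 0
        elif op == "Print":
            self.printed = self.total  # Printed(Total)

r = Register()
r.step("Add", 5); r.step("Sub", 2); r.step("Print")
```

Because each branch is a one-to-one image of a disjunct, checking the code against the specification is mechanical, which is the "easy to code and automatically test" property claimed above.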

Given that I'm not knowledgeable about OOD, please forgive a possible
butchering of it, but here's how I can see an OOD approach to the
problem. First, I identify the nouns in the system:

User
CashRegister
Total
Operation
Price

Now the relationships. The User interacts with the CashRegister; in
response, the CashRegister creates the appropriate Operation and allows
access to the Price. Price and Total are just numbers. Furthermore,
since there are several types of Operations, Operation is an abstract
base class with Add, Sub, Print, and Clear as children sharing a
consistent interface to allow for more streamlined code. So, we get
the following (assuming a non-event model and garbage collection for
simplicity):

loop
  CashRegister.Interact
  op = CashRegister.Op
  op.DoOp(Total, CashRegister.Price())

However, this is a bit of a kludge. Clear and Print need only one
parameter, but take 2 in order to conform with the interface. In
addition, Print gets mutable access to the Total even though it doesn't
need it, which is unsafe. Furthermore, the analysis up to this point
was not as clear (or IMO as verifiable) as the previous one.

On the other hand, maybe I was solving the wrong problem. Maybe I need
to look at the pattern of this program and build the architecture then
customize it. In this case, what I'm trying to build is a type of
machine that accepts operations, parameters, and can maintain a state
that is the result of the previous operations. Analyzing it this way,
we get:

User
Machine
Params
OutputState
Operation

Params are what the Machine returns as data -- they are
instruction/data pairs (where data can be an additional collection).
OutputState can store any number of outputs and allows read/write
access. Otherwise, the semantics are the same:

[Assume output is an instance of OutputState]
loop
  Machine.Interact
  plist = Machine.Params
  foreach i in plist.Size
    Operation op = Factory.CreateOp(plist[i].Op)
    op.DoOp(plist[i].Data, output)

This is the generalized pattern, and by deriving different Operations,
Factories, OutputStates, and even Machines we can simulate a wide
variety of machines -- perhaps even primitive operating systems. We
can accomplish things like history lists, screen writes, etc., all with
the same basic framework, because we solved a general problem. Now,
all future machine-like tasks will consist solely of deriving the
appropriate classes.

In fact, this whole thing could be made a method of a machine class:

Machine.Run(factory, output)

Then people simply derive from this class, over-ride Interact (and
anything else they want), and provide the necessary implementations.
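A rough Python sketch of this generalized machine, following the names in the post (Machine, Factory, OutputState, Operation) but with invented toy operations (AddOp, EchoOp) standing in for real ones:

```python
# OutputState can store any number of outputs, as described above.
class OutputState:
    def __init__(self):
        self.values = []
    def write(self, v):
        self.values.append(v)

# Two toy Operations sharing the DoOp interface (invented examples).
class AddOp:
    def do_op(self, data, output):
        output.write(data + 1)

class EchoOp:
    def do_op(self, data, output):
        output.write(data)

class Factory:
    ops = {"add": AddOp, "echo": EchoOp}
    @classmethod
    def create_op(cls, name):
        # extending the machine = adding entries to this map
        return cls.ops[name]()

class Machine:
    def run(self, params, output):
        # params: list of (instruction, data) pairs, per the post
        for name, data in params:
            op = Factory.create_op(name)
            op.do_op(data, output)

out = OutputState()
Machine().run([("add", 2), ("echo", "hi")], out)
```

Deriving new Operations (or swapping the Factory or OutputState) changes what the machine does without touching `Machine.run`, which is the flexibility the post is claiming for this design.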

Comments?

> that is then customized to provide the desired
> > application (the justification being that such a design is much more
> > extensible)? Do I need to stop thinking "program" and instead think
> > "architecture" as the primary purpose and then customize it to provide
> > the "program" as the secondary purpose?
> >
> Design is more abstract than programming, you can design thins that are
> impossible to translate directly to the code (then you need to
> "normalize" things before programming it).

Right, which is what I'm seeing from both cases. An analysis of the
problem simply states relations and some of these relational statements
can't even be checked by code (take relations involving quantifications
over infinite sets or involving convenience functions that don't exist
in the implementation language). What differs here is what we are
solving. Are we solving the problem at hand, or do we choose to solve
a generalization one of whose instances is the problem at hand,
ostensibly for more flexibility?

> > This is why I need to understand the philosophy. I want to understand
> > how to "think OOD". I don't care about specific design techniques
> > unless they help illustrate this shift in thinking. What can you tell
> > me about this? What references (online and printed are fine) can you
> > point me to? I'd love something that contrasts the two methodologies
> > and provides examples to drive it home. Something that explains and
> > justifies OOD from a more philosophical perspective.
> >
> I don't know if this is the best example but:
>
> At work I am developing an application that follows a "pipe and filter"
> architecture. I dont knew it follows it since some good person on this
> news channels says me. I was reinventing the wheel :(
>
> Also, I was designing the application (more concretaly the data model)
> in an OO way. When I had a possible good design
> (using some design patterns) I passed to normalize my model and adapt it
> to the concrete OO programming language. After a couple of changes on my
> desing (and new normalizations) it was a great hit.
>
> I think it would be impossible to do if I dont stop and think on OOD for it.

Could you provide some more details? How did you analyze the problem
using OOD? How would you have analyzed it using non-OOD?


>
> > Thanks in advance.
> >
>
> I hope it will be useful for you.


It was, thank you very much.

Antonio Santiago

Jun 30, 2005, 3:36:48 AM
plan...@gmail.com wrote:
> Antonio Santiago wrote:
>
>>Hi, I am interested on this topic too and, although I think I am not the
>>best person to answer you, here is modest opinion.
>
>
> That's fine with me :). Thanks for responding.
>
>
>
>>Here you are saying you know how to programming in an OO language, but
>>you are not sure to understand the main purpose of OO design.
>
>
> Kind of. I'm trying to get an idea of what OOD is supposed to give me
> as opposed to non-OOD. But confusing this is the possibility that I
> may be looking at OOD to design the wrong thing.
>
>
OOD only gives you a new way to orient your ideas, just as there is a
very different point of view between procedural and OO programming
languages (supposing we don't use OO languages only in a procedural
way). Which is best? It depends on your needs. (The best answer in the
computer-science world is "it depends" :) )

>
>>OOD helps you to organize and represents the information. All people
>>makes a mental design to resolve a problem and then programs the
>>solution. In OO terms one first designs an OO solution and then programs
>>it in an OO language (or not :) ).
>
>
> But non-OOD is also design. Non-OOD focuses more on deriving the
> algorithms, while OOD focuses more on the data, although the two
> intersect. One can derive data from the algorithms/relations and vice
> versa. So, given two different ways of doing something, and a former
> way that is clearer and more direct, why use the latter? If it's an
> issue of flexibility/re-usability, what's the thinking behind that?
>

Yes, you can design in any way. See the differences between a DFD in
"structured design" and a UML class diagram. Which is best? Again, it
depends.

Yes, the problem isn't very difficult, so why make a difficult
solution? Why not create a simple CashRegister class with a "total"
attribute and operations like clear, add, sub, getTotal?
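A minimal sketch of such a class, assuming Python and using only the attribute and operation names from the suggestion above:

```python
# The simple, direct CashRegister suggested above: one attribute,
# four operations, no abstract Operation hierarchy.
class CashRegister:
    def __init__(self):
        self._total = 0

    def clear(self):
        self._total = 0

    def add(self, price):
        self._total += price

    def sub(self, price):
        self._total -= price

    def get_total(self):
        return self._total
```

This is the "procedural with objects" shape discussed in the thread: the class groups data with its guarded operations, but no polymorphism or inversion is involved.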

>
>
>
>>that is then customized to provide the desired
>>
>>>application (the justification being that such a design is much more
>>>extensible)? Do I need to stop thinking "program" and instead think
>>>"architecture" as the primary purpose and then customize it to provide
>>>the "program" as the secondary purpose?
>>>
>>
>>Design is more abstract than programming, you can design thins that are
>>impossible to translate directly to the code (then you need to
>>"normalize" things before programming it).
>
>
> Right, which is what I'm seeing from both cases. An analysis of the
> problem simply states relations and some of these relational statements
> can't even be checked by code (take relations involving quantifications
> over infinite sets or involving convenience functions that don't exist
> in the implementation language). What differs here is what we are
> solving. Are we solving the problem at hand, or do we choose to solve
> a generalization one of whose instances is the problem at hand,
> ostensibly for more flexibility?
>

Regarding the "infinite sets": I want to say that some time ago I went
to a class on functional programming (I'm a newbie at this) and I liked
very much its way of approaching problems. For example, I can create a
function that returns an infinite set of numbers and "connect" it to a
function that multiplies its values by 2. Is this functional program
impossible to do in an OO language? No, it can be done, just in a
different way; that's the difference.

>
>
>
>>>This is why I need to understand the philosophy. I want to understand
>>>how to "think OOD". I don't care about specific design techniques
>>>unless they help illustrate this shift in thinking. What can you tell
>>>me about this? What references (online and printed are fine) can you
>>>point me to? I'd love something that contrasts the two methodologies
>>>and provides examples to drive it home. Something that explains and
>>>justifies OOD from a more philosophical perspective.
>>>
>>
>>I don't know if this is the best example but:
>>
>>At work I am developing an application that follows a "pipe and filter"
>>architecture. I dont knew it follows it since some good person on this
>>news channels says me. I was reinventing the wheel :(
>>
>>Also, I was designing the application (more concretaly the data model)
>>in an OO way. When I had a possible good design
>>(using some design patterns) I passed to normalize my model and adapt it
>>to the concrete OO programming language. After a couple of changes on my
>>desing (and new normalizations) it was a great hit.
>>
>>I think it would be impossible to do if I dont stop and think on OOD for it.
>
>
> Could you provide some more details? How did you analyze the problem
> using OOD? How would you have analyzed it using non-OOD?
>

OK. First I'll explain the problem, and then we can try to sketch a
solution in an OOD and a non-OOD way.
At work we (my colleagues and I :) ) work with meteorological radar
data. I won't go into what information radar data contains, because
that is another war :D (it is pretty "complete", and it can be modeled
more simply with OO classes than with structures and simple types).

Also, a group of my colleagues develops algorithms to filter, correct
and improve the received radar data; here I call them A1, A2, A3, ...
Every algorithm (they are procedures or functions) has some input
parameters, such as the data to handle and some parameters that
configure the algorithm's behaviour.

The idea is to develop an application that gives us the ability to
create any desired configuration of which algorithms must be executed
on the initially received radar data. For example:

A1 - Input: needs a radar file name.
Do: knows how to read a radar data file.
Output: returns a set of output params.

A2 - Input: needs 2D data and some configuration params (not relevant)
to work.
Do: corrects some radar data.
Output: corrected 2D data and some parameters.

A3, A4, ...: all follow the same philosophy. An algorithm needs a set
of inputs, does something, and returns a set of output data that can be
connected to the inputs of other algorithms.

The idea is like programming in a procedural way, except that we can
decide at runtime which output params will be input params to other
algorithms.


Design solutions:
-----------------

Non-OOD: I don't want to imagine how I would analyze my work
application in a non-OOD way :) It would be a real headache.

OOD: I have a set of classes like "Algorithm" and "Data". An
"algorithm" object has 0..* input "data" objects and 0..* output "data"
objects related to it; the same "data" object can be an input for one
algorithm and an output for another.

I'm not saying there is no non-OOD solution, but the OOD approach is
much closer to the way I think than the other.
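That Algorithm/Data design might be sketched in Python like this. The wiring API below is my guess at the idea (connect inputs, run, feed outputs onward at runtime), not the poster's actual code, and the two toy "algorithms" are invented stand-ins for A1 and A2.

```python
# Data objects flow between algorithms; an output of one step can be
# connected as an input of another, as described above.
class Data:
    def __init__(self, value):
        self.value = value

class Algorithm:
    """Has 0..* input Data objects and produces 0..* output Data objects."""
    def __init__(self, fn):
        self.fn = fn          # the underlying procedure/function
        self.inputs = []

    def connect(self, *data):
        self.inputs.extend(data)
        return self

    def run(self):
        raw_values = (d.value for d in self.inputs)
        return [Data(v) for v in self.fn(*raw_values)]

# Wire A1 -> A2 at runtime (toy stand-ins for the real algorithms).
a1 = Algorithm(lambda name: [f"raw({name})"]).connect(Data("radar.bin"))
raw, = a1.run()
a2 = Algorithm(lambda d: [f"corrected({d})"]).connect(raw)
out, = a2.run()
```

The point of the design is that the `connect` calls, i.e. which outputs feed which inputs, can be decided from configuration at runtime rather than being fixed in procedural call order.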

>
>
>>>Thanks in advance.
>>>
>>
>>I hope it will be useful for you.
>
>
>
> It was, thank you very much.
>

Bye Mr plan9ch7.

Tim Haughton

Jun 30, 2005, 4:01:45 AM
Robert C. Martin wrote:
> On 29 Jun 2005 08:24:50 -0700, xmp...@yahoo.com wrote:
[snip]

> In my view the OO-ness of a system can be identified by tracing the
> dependencies between the modules. If the high level policy modules
> depend on (directly call) lower level modules, which then call even
> lower level modules, then the program is procedural and not object
> oriented.

I'm a big fan of dependency inversion and utilising it has helped many
projects that I've worked on. I was going to write that I didn't really
consider it *the* de facto OO-ometer, but I think that I might, just in
a slightly different way.

Like you, I also view OO as simply a way to organise code efficiently.
I've seen many truly horrid architectures which have arisen from
OOA/OOD, from noun spotting etc. And I guess I see OO as a strategy
that minimises code by allowing us to take an interface, and some
behaviour, and vary the behaviour independently of that interface.

Of course, the DIP is just an example of this, but I feel the DIP is at
its most potent across assembly (package) boundaries, where it would
indeed be an effective OO-ometer. Inside an assembly, I think the DIP
can *often* be left unused without plunging into procedural code.

Or perhaps any polymorphic activity suggests an inverted dependency.
Not sure, I'd have to meditate on it.

Regards,

Tim Haughton

Mark Nicholls

Jun 30, 2005, 5:38:48 AM
> >Hi,
> >
> >
> >I understand the mechanics of objects; I use objects, polymorphism,
> >even patterns. I understand some of OOD, but I'm trying to grasp the
> >bigger picture, how to "think in OOD". I hope an example will help:
>
> There has been an enormous amount of debate and discussion about this
> topic. Some folks believe that OO is an extremely high level
> technique used to make models of the world. Others believe that OO is
> a mechanism for structuring source code. I fall into the latter camp.

I fall into both (though philosophy is a little strong), I'd rather
call it a school.

>
> To me, the whole notion that OO is a philosophy is silly.

do you think there are any philosophical notions behind SE...or simply
that OO is not (a distinctive) one?

> OO is a set
> of tools that help programmers structure their code better. In
> particular, it helps them invert key dependencies.

I'll try to ignore my main objection to this point, for both our
sanity, or at least just brush past it.....if I start to go on...then
feel free to tell me to 'shut the .... up'.

OK, it is possible to invert the physical dependencies between sets of
classes (modules) by moving entities between them ...we should at least
agree here......are you claiming that this is the key characteristic
value of OO?

>
> In my view the OO-ness of a system can be identified by tracing the
> dependencies between the modules.

you are?

> If the high level policy modules
> depend on (directly call) lower level modules, which then call even
> lower level modules, then the program is procedural and not object
> oriented.

this is utterly bizarre....you are claiming that the OO ness of a
system is not dependent on the logical model (i.e. 'class',
'interface'), but on the allocation of a class or interface to a
physical deployment entity (module)...and further that even if an
application uses classes, interfaces, encapsulation, abstraction,
polymorphism, it can be considered (strictly) functional in certain
deployments........I disagree.

i.e.

interface IA
{
}

class CA : IA
{
}

interface IB
{
}

class CB1
{
IA a = new CA();
}

class CB2
{
}

so I can vary the OO'ness of the above code, not by changing the code,
but by how I allocate each entity to a module?!?!?

If I put it all in 1 module....it's not OO?

>
> However, if the high level policy modules are independent, or depend
> solely on polymorphic interfaces that the lower level modules
> implement, then the system is OO.

So if a physical module (set of classes) depends on 1 non-polymorphic
interface and n (>0) polymorphic ones, it's not OO??!?!?!?!?!?

Would you characterise a C program where the dependencies between
'modules' were via function pointers as OO? (I actually would
sympathise with an answer like 'a bit' but the lack of something
corresponding to an object would worry me)

>
> In other words, it all has to do with the *direction* of the
> dependencies.

aaahhhhhhh.....I'm my own grandpa.....

>
> To learn more about this see the following article:
>
> http://www.objectmentor.com/resources/articles/Principles_and_Patterns.PDF
>
>

luckily I'm going on holiday for a week from tomorrow. I've been good
up to now, I've let loads of DIPy stuff go, but this one was just too
much for me; in one stroke you seem to be trying to characterise OO as
a mechanism for implementing your 'DIP' thing.....if I believed in DIP
that would be a step too far, but as I completely reject it as a
mirage....it's two steps too far.

xmp...@yahoo.com

Jun 30, 2005, 9:43:23 AM
> There has been an enormous amount of debate and discussion about this
> topic. Some folks believe that OO is an extremely high level
> technique used to make models of the world. Others believe that OO is
> a mechanism for structuring source code. I fall into the latter camp.

So OO keeps data and operations together which avoids name clashes and
serves as a grouping method for the readers of the code? However, this
would preclude any need for OO design as one can use procedural design
and simply group the data and operations together in a class.


> To me, the whole notion that OO is a philosophy is silly. OO is a set
> of tools that help programmers structure their code better. In
> particular, it helps them invert key dependencies.

By inverting key dependencies, what do you mean? Or did I get it
below?


> In my view the OO-ness of a system can be identified by tracing the
> dependencies between the modules. If the high level policy modules
> depend on (directly call) lower level modules, which then call even
> lower level modules, then the program is procedural and not object
> oriented.
>
> However, if the high level policy modules are independent, or depend
> solely on polymorphic interfaces that the lower level modules
> implement, then the system is OO.

I'm a bit confused. By this are you saying that objects capture the
algorithmic pattern, applying parameterized objects in lieu of fixed
calls? For example:

Non-OO by your definition:
==========================
class x {
  method y (z) {
    for each i in z
      display(i)
  }
}

OO by your definition:
======================
class x {
  method y (z, a) {
    for each i in z
      a.display(i)
  }
}

Instead of calling a lower-level display directly, the class calls the
display method of a parameter passed in, which means the algorithm is
parameterized; method y no longer displays the elements of z itself, but
rather applies a user-defined operation (the display interface) over z.
This means that it's a more abstract algorithm and is more flexible, as
more can be done with it by deriving different classes that fulfill the
display interface. Is this correct?

I'll visit the link shortly. Thank you.

xmp...@yahoo.com

Jun 30, 2005, 10:00:00 AM
> OOD only gives you a new way to orient your ideas, like there is a very
> different point of view betwen procedural and OO programming languages
> (supossing we don't use OO prog. languages only in a procedural way).
> Which is the best? Depends on your needs. (The best answer in
> computer-science worl is "depends on" :) )

Right; I'm not assuming that OO is the best tool out there. I work on
a large number of small projects, and for many of them procedural code
is just fine. However, I want to make sure I also understand the OO
mind-set. If it were just about using classes as the basic module and
enforcing data integrity through methods, then it's not a paradigm
shift; in fact, it's the realization of the procedural ideal. So to
me, there has to be a different way of viewing things.

> > But non-OOD is also design. Non-OOD focuses more on deriving the
> > algorithms, while OOD focuses more on the data, although the two
> > intersect. One can derive data from the algorithms/relations and vice
> > versa. So, given two different ways of doing something, and a former
> > way that is clearer and more direct, why use the latter? If it's an
> > issue of flexibility/re-usability, what's the thinking behind that?
> >
> Yes, you can design in any way. See the differences betwen a DFS in the
> "structured design" and an UML class diagram. Which is the best? Other
> time it depends.

Right, the question becomes this; what are the advantages of OO design
over non-OO design? Or at least, what are the trade-offs in each case?
The most obvious trade-off is directness. Non-OO design cuts straight
to the heart of the matter. However, does OO design offer something to
make up for its more vague and circuitous route?

[Cash Register Example]


> Yes, the problem isn't veru difficult, then why make a difficult
> solution? Why not create a simple CashRegister class with a "total"
> attribute and operation like: clear, add, sub, getTotal.

Because such a solution is essentially procedural, even if it uses
objects. In addition, it makes OO analysis pointless, because we would
have spent more time analyzing the problem, only to arrive at the same
solution that a more direct (predicate calculus) analysis would have
yielded; moreover, the predicate calculus analysis could have led to
the same object model. If we use the guideline that data is private
and guarded by access methods (with the requisite checks), then we
have most of that design. Grouping data and their operations together
is another general principle that can be used to get the rest of that
model. So in the end, we're looking at procedural analysis, along with
some code-derivation guidelines, to derive an object-based model. No
OO analysis there.
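For concreteness, the simple CashRegister the quoted post suggests
might look something like this (a sketch; the exact signatures and
guards are my assumption, not from the original example):

```java
// Minimal sketch of the suggested CashRegister: the "total" datum is
// private and only reachable through its operations.
class CashRegister {
    private double total;

    void clear() {
        total = 0.0;
    }

    void add(double amount) {
        // guard: reject negative amounts
        if (amount < 0) throw new IllegalArgumentException("negative amount");
        total += amount;
    }

    void sub(double amount) {
        if (amount < 0) throw new IllegalArgumentException("negative amount");
        total -= amount;
    }

    double getTotal() {
        return total;
    }
}
```

Note that this is exactly the "procedural, even if it uses objects"
shape described above: the whole object model falls out of the
data-hiding guidelines alone.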


> > Right, which is what I'm seeing from both cases. An analysis of the
> > problem simply states relations and some of these relational statements
> > can't even be checked by code (take relations involving quantifications
> > over infinite sets or involving convenience functions that don't exist
> > in the implementation language). What differs here is what we are
> > solving. Are we solving the problem at hand, or do we choose to solve
> > a generalization one of whose instances is the problem at hand,
> > ostensibly for more flexibility?
> >
> Regarding the "infinite sets", I want to say that some time ago I went
> to a class on functional programming (I'm a newbie at this) and I liked
> very much the way it approaches problems. For example I can create a
> function that returns an infinite set of numbers and "connect" it to a
> function that multiplies its values by 2. Is this functional program
> impossible to do in an OO language? No, it can be done in a different
> way; that's the difference.

Functional languages can support infinity through lazy evaluation;
however, this support is limited. They generate values on an
as-needed basis -- but this doesn't fulfill the needs of quantification
over infinite sets, which requires evaluation of the entire infinite
set -- an impossible concept, but one that is useful as an analysis
tool. For example, if I quantify:

(All x such that x in Integer and Odd(x) = True)

Assuming the mathematical definition of Integer (and not the
computer-limited one), this requires evaluation of each element of an
infinite set. This is something that can never be implemented on a computer,
but it is a useful analysis tool for the user.
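As an aside, the lazy, as-needed generation described above is easy to
sketch even in Java 8+ streams (a hypothetical illustration, not from
the thread): the stream below is conceptually infinite, but only the
demanded prefix is ever evaluated, which is exactly why quantifying
over the *whole* set can never terminate.

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

class LazyOdds {
    public static void main(String[] args) {
        // Conceptually infinite stream of odd integers; elements are
        // produced lazily, only as downstream operations demand them.
        List<Integer> firstFive = Stream.iterate(1, n -> n + 2)
                .limit(5)                      // demand a finite prefix
                .collect(Collectors.toList());
        System.out.println(firstFive);         // prints [1, 3, 5, 7, 9]
    }
}
```

Remove the limit(5) and the pipeline never finishes: that is the gap
between lazy evaluation and true quantification over an infinite set.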

[Radar Example]
Ok, I think I see what you're doing. You have a series of algorithms
(or objects) that feed their output into other algorithms/objects,
hence the pipeline. The outputs and inputs are objects, and I assume
you are using polymorphism in order to supply a unified interface?
This also means you could build a collection of the algorithms you want
for a given task and customize processing chains by supplying these
collections when needed, and deriving new ones to introduce new
processing functionality?


Thanks.

xmp...@yahoo.com

Jun 30, 2005, 10:02:00 AM6/30/05
to

Tim Haugton wrote:
[Snip]


> Of course, the DIP is just an example of this, but I feel the DIP is at
> its most potent across assembly (package) boundaries, where it would
> indeed be an effective OO-ometer. Inside an assembly, I think the DIP
> can *often* be left unused without plunging into procedural code.


What's DIP?

xmp...@yahoo.com

Jun 30, 2005, 1:29:25 PM6/30/05
to

Answering my own question here. Just read the link provided by Mr.
Robert Martin and I see what it stands for.

Thanks.

H. S. Lahman

Jun 30, 2005, 3:18:24 PM6/30/05
to
Responding to Xmp333....

> I understand the mechanics of objects; I use objects, polymorphism,
> even patterns. I understand some of OOD, but I'm trying to grasp the
> bigger picture, how to "think in OOD". I hope an example will help:

First, note that OO development traditionally has three stages:

OO Analysis (OOA): a design that resolves the customer's functional
requirements abstractly in the customer's terms and in a manner that is
independent of particular computing environments.

OO Design (OOD): a design that elaborates the OOA solution to resolve
nonfunctional requirements. Since nonfunctional requirements
necessarily depend on the specific computing environment, this solution
explicitly includes computing environment issues. But it does so at a
relatively high level of abstraction. IOW, it is a strategic view of
the full solution.

OO Programming (OOP): a design that elaborates the OOD solution at a
detailed level using an OOPL. IOW, OOP is about implementing a complete
tactical solution at the 3GL level.

All three stages involve the traditional notion of 'software design' but
the perspectives are quite different. The stages were formalized to
deal with a problem that plagued traditional development processes: the
gap between the customer view and the computing view can be enormous.
The systematic progression from

requirements -> OOA model -> OOD model -> OOP model -> executable

was designed to put stepping stones in that gap. It was systematic because
each stage was fairly well focused (at least relative to its precursor,
Structured Programming). Everything to the left of the executable is a
specification of the solution to its right. Similarly, everything to
the right of requirements is a solution to the specification on its
left. So the level of abstraction decreases moving to the right. At
each stage the developer provides well-defined intellectual content.
(For the last stage to produce an executable, this is now automated by
3GL compilers, linkers, and loaders.)

[Note that practicing IID with very small scale increments does not
change the nature of the stages. Whether one is dealing with DoD
projects with a cast of thousands or small feature enhancements in a
four-man team, the basic activities still follow the same pattern.]

This dichotomy between specification and solution was one of the
important things that the OO paradigm brought to the software
development table. Which segues to...

>
> Let's say I want a program that given A produces B. I have 2 ways of
> analyzing this. I can either formally specify how B relates to A, or I
> can use the OOD approach in which I identify nouns, relationships,
> arity, patterns, etc...

The OO approach always begins with abstracting some problem space.
Usually it is the customer's but it can be specific parts of the
computing space (e.g., networking and interoperability issues for
distributed processing, RDBs, etc.). That problem space abstraction is
done primarily during OOA. One selects the problem space entities
(concrete or conceptual) that are relevant to the problem in hand. One
also selects their intrinsic properties that are relevant to the problem
in hand.

Tools like noun-identification in a requirements spec are just that:
tools to assist in the fundamental intellectual activity of problem
space abstraction. So there really is no OR in your approach. The
formal OOA/D methodologies provide a systematic approach to problem
space abstraction that happens to use a variety of mechanical
techniques. So your quest to "think OO" is really a quest to understand
an OOA/D methodology.

In the end, you abstract what the problem space /is/ and express that
abstraction in a very particular way. Identifiable entities map to
objects, intrinsic characteristics are expressed in terms of knowledge
(what the entity knows) and behavior (what the entity does)
responsibilities, logical connections between entities are mapped to
relationships, and interactions between entities to solve the problem
are expressed in terms of collaboration messages. The solution is
defined when the developer connects the dots between intrinsic,
self-contained behavior responsibilities by deciding who sends messages
across which relationship paths to whom, and when they are sent.

That's the high-level, purist view of OOA. As the developer migrates
through the stages the level of abstraction decreases and one becomes
more and more concerned with the details. By the time one reaches OOP
one is almost in a different world where the dominant notions are things
like type systems, interfaces, and procedure calls. Nonetheless, the
fundamental structure remains that defined in the OOA; there is just a
whole lot more obscuring detail.

>
> The first approach is much more direct and is very reliable, so why
> would I use the OOD approach?

The short answer is that one uses the OO paradigm to achieve better
maintainability in the application in the face of volatile requirements
over time. If one just wants rapid, intuitive computational solutions
one should be doing functional programming.

Your question, though, is more about how to proceed within the context
of the OO paradigm. As I indicated above, there really is no OR. The
key is to recognize that one needs to make an investment in OOA, OOD,
AND OOP. How one does that depends upon the process and/or methodology
one uses, which can range from the OOP-based agile processes at one end
of the spectrum to the model-based agile processes at the other end.

[Even the OOP-based agile processes do OOA/D in various forms, so one
doesn't need UML to do it. They use things like CRCs and whatnot to
record OOA/D semi-formally. They also employ a huge suite of
refactoring practices that effectively distill OOA/D into a bunch of
cookbook guidelines to manipulate the 3GL code.]

So the real answer to your question is that you need to do three things.
First, acquire a fundamental knowledge of what OOA/D is about.
Regardless of what methodology and process you eventually adopt, you
need the ABCs first. [The Books section of my blog has some suggestions
on OOA/D books.] Second, select an A&D methodology and follow it.
Third, select a development process as a framework for applying the A&D
methodology and follow it. A lot of smart people with lots of
experience developed the OO methodologies and processes so take
advantage of that and use them religiously. (It takes a /lot/ of
experience to get to the point of being qualified to second guess the
gurus.)

The differences between methodologies and processes are minor compared
to the step function between ad hoc and systematic OO development. So
long as you follow /some/ accepted methodology and process religiously,
things will tend to turn out well.


*************
There is nothing wrong with me that could
not be cured by a capful of Drano.

H. S. Lahman
h...@pathfindermda.com
Pathfinder Solutions -- Put MDA to Work
http://www.pathfindermda.com
blog: http://pathfinderpeople.blogs.com/hslahman
(888)OOA-PATH

Robert C. Martin

Jun 30, 2005, 4:51:53 PM6/30/05
to
On 30 Jun 2005 02:38:48 -0700, "Mark Nicholls"
<Nichol...@mtvne.com> wrote:

>> >Hi,
>> >
>> >
>> >I understand the mechanics of objects; I use objects, polymorphism,
>> >even patterns. I understand some of OOD, but I'm trying to grasp the
>> >bigger picture, how to "think in OOD". I hope an example will help:
>>
>> There has been an enormous amount of debate and discussion about this
>> topic. Some folks believe that OO is an extremely high level
>> technique used to make models of the world. Others believe that OO is
>> a mechanism for structuring source code. I fall into the latter camp.
>
>I fall into both (though philosophy is a little strong), I'd rather
>call it a school.
>
>>
>> To me, the whole notion that OO is a philosophy is silly.
>
>do you think there are any philosophical notions behind SE...or simply
>that OO is not (a distinctive) one?

Whether there are philosophies of software development is irrelevant;
though I think there may be. My point is that OO has often been
touted as a "grand overarching philosophy" having more to do with
life, the universe, and everything, than with software.

OO is certainly a different way of thinking about software. From that
point of view it is a kind of philosophy. But it is a way of thinking
about software at the structural level; not the grand "analysis" level
(whatever that word happens to mean.)

>OK, it is possible to invert the physical dependencies between sets of
>classes (modules) by moving entities between them ...we should at least
>agree here......are you claiming that this is the key characteristic
>value of OO?

Yes.


>
>> If the high level policy modules
>> depend on (directly call) lower level modules, which then call even
>> lower level modules, then the program is procedural and not object
>> oriented.
>
>this is utterly bizarre....you are claiming that the OO ness of a
>system is not dependent on the logical model (i.e. 'class',
>'interface'), but on the allocation of a class or interface to a
>physical deployment entity (module)...and further that even if an
>application uses classes, interfaces, encapsulation, abstraction,
>polymorphism, it can be considered (strictly) functional in certain
>deployments........I disagree.

Not quite. There is a logical component. Higher level policies are
decoupled from lower level policies by having both depend on
interfaces or abstract classes.

>
>i.e.
>
>interface IA
>{
>}
>
>class CA : IA
>{
>}
>
>interface IB
>{
>}
>
>class CB1
>{
> IA a = new CA();
>}
>
>class CB2
>{
>}
>
>so I can vary the OO'ness of the above code, not by changing the code,
>but by how I allocate each entity to a module?!?!?

Certainly. If the allocation of classes to modules does not
effectively decouple those modules, then you don't have an OO
solution. The fact that classes and interfaces are used is
irrelevant, if those classes and interfaces are not used to create an
OO structure.

>If I put it all in 1 module....it's not OO?

It's not so much a matter of whether it's in one module or not. It's a
matter of whether or not there is an obvious and convenient fracture
zone that could be used to separate the modules.

>> However, if the high level policy modules are independent, or depend
>> solely on polymorphic interfaces that the lower level modules
>> implement, then the system is OO.
>
>So if a physical module (set of classes) depends on 1 non-polymorphic
>interface and n (>0) polymorphic ones, it's not OO??!?!?!?!?!?

OO-ness is not binary, it is a continuum. To the extent that a module
depends on concretions, it is not OO. To the extent that it depends
on abstractions it is.

>Would you characterise a C program where the dependencies between
>'modules' were via function pointers as OO?

Yes, so long as the decoupling created by the function pointers
allowed high level policy to be separated from lower level details.

>(I actually would
>sympathise with an answer like 'a bit' but the lack of something
>corresponding to an object would worry me)

The function pointers are the methods of an interface. Typically
those pointers will be held in some kind of data structure. That data
structure is an object.

Consider, for example, the FILE data type in C. Deep within it there
are function pointers to the read, write, open, close, seek methods.
FILEs are objects.

Alvin Ryder

Jul 1, 2005, 1:11:04 AM7/1/05
to
xmp...@yahoo.com wrote:
> Hi,
>
>
> I understand the mechanics of objects; I use objects, polymorphism,
> even patterns. I understand some of OOD, but I'm trying to grasp the
> bigger picture, how to "think in OOD". I hope an example will help:
>

There is no definitive "OOD think" or "OOA"; we can only more or less
agree on the definitions of "object based" and "object oriented".

But there are some guys who share their personal brand of thinking
while travelling towards an OO program.

Peter Coad has some pretty good books. In "Object Oriented Programming"
he walks through several examples.

Kent Beck's "Test driven development" is not about "OOD think" but he
does share his brand of thinking with very simple examples.

> Let's say I want a program that given A produces B. I have 2 ways of
> analyzing this. I can either formally specify how B relates to A, or I
> can use the OOD approach in which I identify nouns, relationships,
> arity, patterns, etc...
>

Or you can follow one of the many other paths.

In my early OOP days I tried that approach, sometimes it was ok but I
mostly went astray badly.

> The first approach is much more direct and is very reliable, so why
> would I use the OOD approach?
>

If your method works then stick with it. It helps to have several
methods up your sleeve, but you don't necessarily need the "OOD" one.

Some people find CRC cards handy.

> Or am I comparing apples to oranges? Is OOD for designing
> applications, or is it for designing an application pattern (framework?
> architecture?) that is then customized to provide the desired
> application (the justification being that such a design is much more
> extensible)? Do I need to stop thinking "program" and instead think
> "architecture" as the primary purpose and then customize it to provide
> the "program" as the secondary purpose?
>

You've described "application frameworks", and I made a serious effort
to use "OOD think" to create such frameworks, but it made each project
10 times harder.

Trying to build applications by first building a general framework (in
the hope of reuse) is like trying to climb Mount Everest on your first
hike.

Remember, for code to be reusable it must first be usable ;-)

I found it better to just focus only on "what I needed now", ironically
that path led to lots of usable (and reusable) material.

After building so many systems I could now develop such frameworks but
I don't bother because everything changes so fast, it would all go out
of date too fast.

> This is why I need to understand the philosophy. I want to understand
> how to "think OOD". I don't care about specific design techniques
> unless they help illustrate this shift in thinking. What can you tell
> me about this? What references (online and printed are fine) can you
> point me to? I'd love something that contrasts the two methodologies
> and provides examples to drive it home. Something that explains and
> justifies OOD from a more philosophical perspective.
>

I've mentioned Peter Coad and Kent Beck there is also "Thinking in
Java" by Bruce Eckel, it's pretty popular (and free for download at his
site).

I've only ever used his book to grab code snippets, so I personally
cannot vouch for the "thinking" part.

> Thanks in advance.

Cheers.

Robert C. Martin

Jul 1, 2005, 12:01:24 PM7/1/05
to
On 30 Jun 2005 06:43:23 -0700, xmp...@yahoo.com wrote:

>> There has been an enormous amount of debate and discussion about this
>> topic. Some folks believe that OO is an extremely high level
>> technique used to make models of the world. Others believe that OO is
>> a mechanism for structuring source code. I fall into the latter camp.
>
>So OO keeps data and operations together which avoids name clashes and
>serves as a grouping method for the readers of the code?

Yes that's one way. A more important structuring mechanism is the
decoupling that polymorphism allows. Given two modules:

|A|------>|B|

It is possible, by using polymorphism, to invert the source code
dependency without changing the control flow:

|A|<------|B|

Consider:

package A;
import B.Y;

public class X {
    private Y y;

    public X(Y y) {
        this.y = y;
    }

    public void f() {
        y.g();
    }
}
-------
package B;

public class Y {
    public void g() {
    }
}
-------
public class Main {
    public static void main(String[] args) {
        B.Y y = new B.Y();
        A.X x = new A.X(y);
        x.f();
    }
}

This shows the module (in this case I am equating a module with a
package) dependencies going from A to B.

I can completely invert this dependency without changing the control
flow as follows:


package A;

public interface XServer {
    public void g();
}

public class X {
    private XServer server;

    public X(XServer server) {
        this.server = server;
    }

    public void f() {
        server.g();
    }
}
-------
package B;
import A.XServer;

public class Y implements XServer {
    public void g() {
    }
}
-------
public class Main {
    public static void main(String[] args) {
        B.Y y = new B.Y();
        A.X x = new A.X(y);
        x.f();
    }
}

This ability to invert key module dependencies allows me to inspect
every module dependency in the system and adjust its direction,
without changing the way the system works. This is very powerful.

>However, this
>would preclude any need for OO design as one can use procedural design
>and simply group the data and operations together in a class.

You are speaking about OO Design as though it were some specific
activity or event. I prefer to think about OO design as being a part
of overall software design. We design our software using OO
principles as *part* of our toolkit of design techniques.

>> In my view the OO-ness of a system can be identified by tracing the
>> dependencies between the modules. If the high level policy modules
>> depend on (directly call) lower level modules, which then call even
>> lower level modules, then the program is procedural and not object
>> oriented.
>>
>> However, if the high level policy modules are independent, or depend
>> solely on polymorphic interfaces that the lower level modules
>> implement, then the system is OO.
>
>I'm a bit confused. By this are you saying that objects capture the
>algorithmic pattern, applying parameterized objects in lieu of fixed
>calls?

No, the DIP paper explains it in detail.

http://www.objectmentor.com/resources/articles/dip.pdf

Robert C. Martin

Jul 1, 2005, 12:52:34 PM7/1/05
to
On Thu, 30 Jun 2005 19:18:24 GMT, "H. S. Lahman"
<h.la...@verizon.net> wrote:

>Responding to Xmp333....
>
>> I understand the mechanics of objects; I use objects, polymorphism,
>> even patterns. I understand some of OOD, but I'm trying to grasp the
>> bigger picture, how to "think in OOD". I hope an example will help:
>
>First, note that OO development traditionally has three stages:

[OOA, OOD, OOP]

Actually, this is not quite as traditional as you might think at
first.

OOP came first with languages like Simula, Smalltalk, Obj-C, and C++.

OOD came later with books by Booch, Rumbaugh, and Wirfs-Brock. There
were, and still are, different schools of thought about what OOD is.
Among these are the so-called European and American schools. The
European school uses OO as a way to create an expressive Domain
Specific Language with which to express the application. The American
school thinks of OO as a way to organize and manage source code. Both
schools have merit. There are other schools as well, including the
SMOO school, which is quite different.

OOA came later still, and actually hasn't really come yet. Nobody
knows exactly (or even inexactly) what OOA is. There are a number of
books and papers written about it, but they don't agree. There is not
even a set of cogent schools of thought. OOA is a term that we bandy
about with authority, but have no real definition for.

One has to wonder where the {Analysis, Design, Programming} triplet
came from. The triplet permeates our industry to the extent that no
new technique can be created without immediately being trifurcated.

Consider Dijkstra's Structured Programming. Within a few years there
was Structured Design and Structured Analysis. Interestingly enough
SA and SD had *nothing* to do with SP. They were completely different
things. The use of the word "Structured" in front of A and D was a
brilliant marketing ploy because the word "Structured" had already
been made synonymous with "good".

When did we first believe that A, D, and P were separate activities
with separate artifacts and deliverables? When did we first come to
the conclusion that:

>requirements -> OOA model -> OOD model -> OOP model -> executable

Remarkably, this thought was formalized in a very famous paper written
in 1970 by Dr. Winston Royce, entitled "Managing the Development of
Large Software Systems". This paper is often called "The Father of
the Waterfall". The remarkable part is that the paper firmly
denounces the practice in favor of an approach in which A, D, and P
are done iteratively.

This iterative transformation means that we go from A to D to P in a
matter of minutes, repeating the process many times per day,
delivering working software every week.

>This dichotomy between specification and solution was one of the
>important things that the OO paradigm brought to the software
>development table.

The dichotomy was really brought to the table by Structured Analysis
and Structured Design. Though there were earlier rumblings, these
disciplines were the first to really formalize the different artifacts
for analysis and design.

Even so, I believe that the intention of the original SA/SD folks
(especially DeMarco) was for lightweight iterative practices.
However, in light of the heavy acceptance of Waterfall (promoted by
Winston Royce's paper that said not to do it) people interpreted SA/SD
in a staged artifact driven way.

krasicki

Jul 2, 2005, 12:01:21 AM7/2/05
to
A number of observations come to mind in this discussion.

First, the word "object" is a conceptual framework of meanings in
Computer Science.

Software design objects are a family of concepts often complemented by
an iconic notation. The notation is used to experiment with and refine
software designs. The design of an information model uses different
families of objects than functional designs use and so on.

Different from this is OOP and OOP notations. These notations,
although overlapping with design, are largely inventories of
programming components and their particulars.

Not all Computer Scientists are successful with design objects. To be
successful, the individual using them must be able to think in very
abstract ways. Not everyone can and even when they can, corporate
policy and practice often make the effort impossible.

This frustrating truth is largely responsible for the Extreme
*whathaveyou-usually-Programming* phenomenons. With a straight face,
these proponents assert that if design is not egalitarian and if
companies don't respect it then -snip, snip- out with it except for
perfunctory lip-service.

One cannot glibly 'think' in OOD, there isn't any such thing. OOD is
very hard work, time-consuming, expensive and easy to derail (just have
bottom-up activity happening in the background that pre-empts the
designers).

Another common mistake is the literalization of the word 'procedural'
as an either/or alternative to OOP. OOP is equally procedural as a
temporal phenomenon. The consecutive operations are bundled
differently and obviously have their own mechanics.

Architecture is a whole other related subject. Again, design
discussions having to do with architecture too easily get entangled in
OOD and programming quagmires. Architects often have to spoon feed and
baby talk their way through corporate conversations.

The theory for all of this is the theory of language: the use of iconic
notations to conceptually talk about and build very complex software
frameworks (generally speaking). Philosophy is coincidental.

Michael Feathers

Jul 2, 2005, 8:25:48 AM7/2/05
to
krasicki wrote:

> Not all Computer Scientists are successful with design objects. To be
> successful, the individual using them must be able to think in very
> abstract ways. Not everyone can and even when they can, corporate
> policy and practice often make the effort impossible.

I agree that it takes work, but I don't think that deep abstraction is
involved. To me, object design is more like concretization. The steps
are: 1)think of thing that can solve a problem for you, 2) think of a
way to ask it to solve the problem 3) go inside the thing and solve
the problem. It's just a little different from the procedural mindset
which is: 1) think of a way to solve the problem 2) solve the problem.
The abstraction in OO is really all about thinking about a way to ask
for a solution rather than leaping into a solution.
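The three steps might be pictured like this (a toy illustration with
invented names): steps 1 and 2 fix the thing and the way to ask it;
step 3 is the body.

```java
// 1) Think of a thing that can solve the problem for you, and
// 2) think of a way to ask it -- together these give the interface.
interface Greeter {
    String greet(String name);
}

// 3) Go inside the thing and solve the problem.
class PoliteGreeter implements Greeter {
    public String greet(String name) {
        return "Hello, " + name + "!";
    }
}
```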

> This frustrating truth is largely responsible for the Extreme
> *whathaveyou-usually-Programming* phenomenons. With a straight face,
> these proponents assert that if design is not egalitarian and if
> companies don't respect it then -snip, snip- out with it except for
> perfunctory lip-service.

How many XP teams have you worked with?

> One cannot glibly 'think' in OOD, there isn't any such thing. OOD is
> very hard work, time-consuming, expensive and easy to derail (just have
> bottom-up activity happening in the background that pre-empts the
> designers).

It is like anything else. Hard when you start, but easier when you
acclimate to it. I "think in OO" but I've been doing it for a long time.


Michael Feathers
author, Working Effectively with Legacy Code (Prentice Hall 2005)
www.objectmentor.com

Nick Malik [Microsoft]

Jul 2, 2005, 9:56:54 AM7/2/05
to
<xmp...@yahoo.com> wrote in message
news:1120140120.4...@o13g2000cwo.googlegroups.com...

Dependency Injection Pattern

http://www.martinfowler.com/articles/injection.html

--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
--


Nick Malik [Microsoft]

Jul 2, 2005, 10:16:46 AM7/2/05
to
"Robert C. Martin" <uncl...@objectmentor.com> wrote in message
news:0jk7c1tj3hhc2oua6...@4ax.com...

>
> Whether there are philosophies of software development is irrelevant;
> though I think there may be. My point is that OO has often been
> touted as a "grand overarching philosophy" having more to do with
> life, the universe, and everything, than with software.

There are some folks who still claim that the world is flat. We don't talk
about them as though they were a significant part of scientific thought. In
my opinion, it is fair to include "grand OO philosophers" in this category.
Anyone who says that OO is a grand philosophy is ignorant of both software
engineering and philosophy. Kant, Camus, Sartre... now that's philosophy.
We are on the same page on this one.

>
> OO is certainly a different way of thinking about software. From that
> point of view it is a kind of philosophy. But it is a way of thinking
> about software at the structural level; not the grand "analysis" level
> (whatever that word happens to mean.)
>

Once again, we agree. OO gives you a few more interesting techniques that
are considerably more difficult to do in procedural languages.

>>OK, it is possible to invert the physical dependencies between sets of
>>classes (modules) by moving entities between them ...we should at least
>>agree here......are you claiming that this is the key characteristic
>>value of OO?
>
> Yes.

Are you sure you haven't mixed OOD with AOP? I would agree with you if your
statement had been "a key characteristic of AOP is the inversion of control
and the injection of dependencies." This is an innovation on top of OO. It
is true that OO enabled it. However, I agree with the prior poster's
sentiment that an OO program is not characterized by any single pattern,
even a good one.

>>so I can vary the OO'ness of the above code, not by changing the code,
>>but by how I allocate each entity to a module?!?!?
>
> Certainly. If the allocation of classes to modules does not
> effectively decouple those modules, then you don't have an OO
> solution.

I would call this "definition parsing." You do have an OO solution in that
it uses OO mechanisms to accomplish its goal. You may or may not have a
"good" OO solution, with the subjectivity being the key thing I'm pointing
out. The orientation towards objects is all that is required to be OO, not
the injection of dependencies. That came much later.

In fact, if you look up the Wikipedia definition of Aspect Oriented
Programming, you will see that the definition's author considers AOP to be
"not OO" but in fact a successor to OO development.

I agree that you don't achieve the goals of good development by splashing a
pile of objects against your problem. Object Orientation is no silver
bullet. You have to carefully analyze the situation and separate your
interface from your implementation, with the goals of reducing coupling and
increasing cohesion. I would go a step further and state that a better OO
program can be built by using Commonality Variability Analysis (CVA) (Jim
Coplien's idea). However, CVA will not, naturally, lead to AOP. That
cognitive leap wasn't obvious.

On the other hand, NMock and Reflection *do* naturally lead folks to AOP.
I find these innovations in OOP to be much more of an indicator of forward
movement towards AOP than the fundamental OO concepts of inheritance and
polymorphism.

Michael Feathers

Jul 2, 2005, 10:17:08 AM
Nick Malik [Microsoft] wrote:
> <xmp...@yahoo.com> wrote in message
> news:1120140120.4...@o13g2000cwo.googlegroups.com...
>
>>
>>Tim Haugton wrote:
>>[Snip]
>>
>>>Of course, the DIP is just an example of this, but I feel the DIP is at
>>>its most potent across assembly (package) boundaries, where it would
>>>indeed be an effective OO-ometer. Inside an assembly, I think the DIP
>>>can *often* be left unused without plunging into procedural code.
>>
>>What's DIP?
>>
> Dependency Injection Pattern
>
> http://www.martinfowler.com/articles/injection.html
>

Not in this case. Bob was referring to the Dependency Inversion
Principle: http://www.objectmentor.com/resources/articles/dip.pdf

It's funny, I hadn't noticed the acronyms were the same before :)

Nick Malik [Microsoft]

Jul 2, 2005, 10:35:39 AM
>>>
>>>>Of course, the DIP is just an example of this, but I feel the DIP is at
>>>>its most potent across assembly (package) boundaries, where it would
>>>>indeed be an effective OO-ometer. Inside an assembly, I think the DIP
>>>>can *often* be left unused without plunging into procedural code.
>>>
>>>What's DIP?
>>>
>> Dependency Injection Pattern
>>
>
> Not in this case. Bob was referring to the Dependency Inversion
> Principle: http://www.objectmentor.com/resources/articles/dip.pdf
>
> It's funny, I hadn't noticed the acronyms were the same before :)
>
>
> Michael Feathers


My apology. I typed my reply before checking out his link. I hadn't
noticed the acronym similarity either.

Come to think of it, the two are related. One could say that the Dependency
Injection Pattern is a pattern-level implementation of the Dependency
Inversion Principle.
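A rough sketch of that relationship (the names here are mine, not from the thread): the Dependency Inversion Principle asks the high-level policy to depend on an abstraction, and constructor injection is one pattern for handing it a concrete implementation of that abstraction.

```python
from abc import ABC, abstractmethod

# Abstraction the high-level policy depends on (Dependency Inversion Principle).
class MessageSink(ABC):
    @abstractmethod
    def write(self, text: str) -> None: ...

# Low-level detail: it depends on the abstraction, not the other way around.
class ListSink(MessageSink):
    def __init__(self):
        self.lines = []

    def write(self, text: str) -> None:
        self.lines.append(text)

# High-level policy: the dependency is *injected* through the constructor
# (the Dependency Injection Pattern realizing the principle).
class Greeter:
    def __init__(self, sink: MessageSink):
        self.sink = sink

    def greet(self, name: str) -> None:
        self.sink.write("Hello, " + name)

sink = ListSink()
Greeter(sink).greet("world")
print(sink.lines[0])  # -> Hello, world
```

Swapping ListSink for any other MessageSink requires no change to Greeter, which is the decoupling both acronyms are after.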

Nick Malik [Microsoft]

Jul 2, 2005, 11:48:39 AM
"Michael Feathers" <mfea...@objectmentor.com> wrote in message
news:42C6A1E4...@objectmentor.com...

Nick Malik [Microsoft]

Jul 2, 2005, 11:48:23 AM
Hello Plan 9,

(Is that a reference to "Plan 9 from Outer Space," perhaps? Ahhh... a fan
of bad SF Cinema :-)


>> OOD helps you to organize and represent the information. Everyone
>> makes a mental design to solve a problem and then programs the
>> solution. In OO terms one first designs an OO solution and then programs
>> it in an OO language (or not :) ).
>
> But non-OOD is also design. Non-OOD focuses more on deriving the
> algorithms, while OOD focuses more on the data, although the two
> intersect. One can derive data from the algorithms/relations and vice
> versa. So, given two different ways of doing something, and a former
> way that is clearer and more direct, why use the latter? If it's an
> issue of flexibility/re-usability, what's the thinking behind that?

Really, the goal is not so much to reuse things as to separate the things
that change at different times, to make them easier to change. We start with
the limitations of people and create languages that those people can use.
If you watch the evolution away from OO and towards things like AOP and
lightweight frameworks, it is part of an ongoing process towards the
separation of "things that change rarely" from "things that change
frequently."

The obvious first efforts were the function libraries that would come with a
language. We all knew that there had to be a way to produce the square root
of a number. That mechanism is older than computer science (although many
implementations exist). The fundamental definition doesn't change very
often, so it is easy to place something like SQRT() in a math library and be
assured of its longevity. That's procedural.

What OO gave us was a way to abstract that thinking a bit more... to look at
the activities of our applications and find those activities that are,
themselves, fundamental and rarely changing. If we allow those activities
to operate on interfaces, and not on actual items, we can separate these
fundamental activities (rarely changing) from the implemented objects
(frequently changing).

In this way, we earn reuse, but not by seeking it. We are seeking ease of
maintenance, and ease of understanding.
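To make that separation concrete, here is a small sketch (my own illustration, not code from the thread): the totalling activity changes rarely, so it operates only on an interface, while the priced things underneath it can change freely.

```python
from abc import ABC, abstractmethod

# The interface the stable activity operates on.
class Priced(ABC):
    @abstractmethod
    def price(self) -> float: ...

# Rarely-changing activity: it never mentions a concrete class.
def total(items) -> float:
    return sum(item.price() for item in items)

# Frequently-changing implementations live below the interface.
class Book(Priced):
    def price(self) -> float:
        return 10.0

class HalfPriceBook(Priced):
    def price(self) -> float:
        return 10.0 * 0.5

print(total([Book(), HalfPriceBook()]))  # -> 15.0
```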

>
> Here's an example. Let's say we have a cash register program with the
> options to add, subtract, print a receipt or clear the total.
>
> Non-OO analysis:
> =========
> Op = current operation invoked
> Price = Price Entered
> X' = value change for X for next state
> Total = running total
> ==========
> Then, the state of the system (assume it waits for input) is:
> ==========
> (Op = Start ^ Op' = Clear) v
> (Op = Add ^ Total' = Total + Price) v
> (Op = Sub ^ Total' = Total - Price) v
> (Op = Clear ^ Total' = 0) v
> (Op = Print ^ Printed(Total))
> ==========

Ah, predicate calculus. I haven't done this in years. For a while, I was
pretty good in ML and later in Prolog. However, you'll have to forgive my
rustiness in the notations you used. They are not directly familiar, even
though I believe that I understand what you are trying to say.

This is a very logical approach to the state of a single object. Your
example, however, is not typical. Most applications are not like a cash
register.

A cash register is a machine that holds state for a single long-running
transaction. The state is a series of transactions against inventory. Even
in this very simple description, your model is too light, in that you have
to represent, somehow, the inventory aspect of modern cash registers. If
you do not, your example devolves into an adding machine and a cash drawer.

So let's evolve an adding machine into a cash register...

/// note: the following code is considerably simplified.

class PurchaseTicket
{
    public double RunningTotal = 0;

    public void AddToReceipt(Item MyItem, int Quantity, Outputter PrintOutputter)
    {
        RunningTotal += MyItem.Price * Quantity;
        PrintOutputter.PrintLine("{0}\t{1} @ {2}\n", MyItem.Description,
            Quantity, MyItem.Price);
    }

    public void DeductFromReceipt(Item MyItem, int Quantity, Outputter PrintOutputter)
    {
        RunningTotal -= MyItem.Price * Quantity;
        PrintOutputter.PrintLine("{0}\t{1} @ {2} Credit\n", MyItem.Description,
            Quantity, MyItem.Price);
    }
}

There is no 'clear' in that the PurchaseTicket only exists for a single
customer. When the next customer comes, a new ticket is created. If an
entire transaction is started over, the exact same logic applies.

Some would argue that an item should print itself. I disagree. The ticket
would know the format of the output. There is a grey area here. The point
is that placement of the "knowledge" (what does output look like) needs to
make sense to a developer. That way, when they crack open the code a year
later, they can find it fairly quickly.

We have a dependency on the notion of an Outputter, and it has the
interesting method of PrintLine(). Other than that, the code above has no
way of knowing (or caring) if the Outputter is actually an interface and
that the object passed in is simply one that implements that interface.
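The same decoupling can be sketched in a few lines of Python (a loose transliteration of the C# above, not the poster's code); the FakeOutputter shows why the ticket neither knows nor cares what actually does the printing:

```python
class FakeOutputter:
    """Anything with a print_line method satisfies the ticket."""
    def __init__(self):
        self.lines = []

    def print_line(self, text):
        self.lines.append(text)

class Item:
    def __init__(self, description, price):
        self.description = description
        self.price = price

class PurchaseTicket:
    def __init__(self):
        self.running_total = 0.0

    def add_to_receipt(self, item, quantity, outputter):
        self.running_total += item.price * quantity
        outputter.print_line("%s\t%d @ %s" % (item.description, quantity, item.price))

    def deduct_from_receipt(self, item, quantity, outputter):
        self.running_total -= item.price * quantity
        outputter.print_line("%s\t%d @ %s Credit" % (item.description, quantity, item.price))

out = FakeOutputter()
ticket = PurchaseTicket()
milk = Item("Milk", 2.5)
ticket.add_to_receipt(milk, 2, out)
ticket.deduct_from_receipt(milk, 1, out)
print(ticket.running_total)  # -> 2.5
```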

Importantly, we can add inventory functions fairly readily because we use
the Item object to contain information about the thing we are adding to our
ticket. The interface, above, doesn't change very much. One thing that
does change: we can raise an error if we attempt to remove things from the
ticket that were never in there in the first place:

class PurchaseTicket
{
    public double RunningTotal = 0;

    private List<Item> TicketList = new List<Item>();

    public void AddToReceipt(Item MyItem, int Quantity, Outputter PrintOutputter)
    {
        RunningTotal += MyItem.Price * Quantity;
        PrintOutputter.PrintLine("{0}\t{1} @ {2}\n", MyItem.Description,
            Quantity, MyItem.Price);
        for (int i = 0; i < Quantity; i++)
            TicketList.Add(MyItem);
    }

    public void DeductFromReceipt(Item MyItem, int Quantity, Outputter PrintOutputter)
    {
        // CountOf is assumed here as a convenience method on the list.
        if (TicketList.CountOf(MyItem) < Quantity)
            throw new ApplicationException(
                "Item does not exist in this quantity in the ticket");

        for (int i = 0; i < Quantity; i++)
            TicketList.Remove(MyItem);

        RunningTotal -= MyItem.Price * Quantity;
        PrintOutputter.PrintLine("{0}\t{1} @ {2} Credit\n", MyItem.Description,
            Quantity, MyItem.Price);
    }

    public void CommitOnPayment(InventoryManager Iman)
    {
        foreach (Item TicketItem in TicketList)
            Iman.RemoveFromInventory(TicketItem);
    }
}

In this example, I added a new dependency. We are now coupled to the
definitions of a List. That List has a couple of interesting methods, like
CountOf, Add, and Remove. I don't know what other methods it has, nor should
I care. There are a lot of things I could say about the List, but they
would be off topic.

The point is that, by using OO programming, I've encapsulated the idea of a
list of items. That list maintains a running total of items that need to be
removed from inventory when the CommitOnPayment method is called.

I can very easily implement this program using Item as a concrete class. The
neat thing about OO is that, later, I can decide to use DIP (either
definition :-) and change Item to an interface. I can implement the Item
interface in another part of the application in any way that I'd like, and
the code in this part would not change at all.

This is an example of the Liskov Substitution Principle. [paraphrased -
badly] Any subtype of a type can be substituted for any other subtype as
long as the code refers to the type.
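A toy illustration of that substitutability (my example, not from the post): code written against the Item type keeps working, unchanged, for any subtype.

```python
class Item:
    def __init__(self, price):
        self.price = price

# A subtype that refines construction without breaking the Item contract.
class TaxedItem(Item):
    def __init__(self, price, tax_rate):
        super().__init__(price * (1 + tax_rate))

def line_total(item: Item, quantity: int) -> float:
    # This code refers only to the type Item...
    return item.price * quantity

# ...so any subtype can be substituted for it.
print(line_total(Item(4.0), 2))             # -> 8.0
print(line_total(TaxedItem(4.0, 0.25), 2))  # -> 10.0
```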

>
> Given I'm not knowledgable about OOD, please forgive a possible
> butchering of OOD, but here's how I can see an OOD approach to the
> problem. First, I identify the nouns in the system:
>
> User
> CashRegister
> Total
> Operation
> Price

You are on thin ice already.

>
> Now the relationships. The User interacts with the CashRegister; in
> response, the CashRegister creates the appropriate Operation and allows
> access to the Price. Price and Total are just numbers. Furthermore,
> since there are several types of Operations, Operation is an abstract
> base class with Add, Sub, Print, and Clear as children sharing a
> consistent interface to allow for more streamlined code.

Interesting analysis. Not a good one.

Ask yourself the question: in my system, do I have competing needs? If I do
not, then use the simplest possible implementation that I can. If I do,
then look for what is in common between them, and what is variable. Pull
the variations down in the inheritance tree, and push the commonality up.
Prefer composition over inheritance. Make your interfaces open for
extension but closed for modification (google the "Open Closed Principle").
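As a sketch of that advice (illustrative names only, assumed rather than taken from the thread): the commonality is pushed up into a base class, the variation is pulled down into subclasses, and the hierarchy stays open for extension but closed for modification.

```python
from abc import ABC, abstractmethod

class Ticket(ABC):
    """Commonality, pushed up: every ticket accumulates a total the same way."""
    def __init__(self):
        self.total = 0.0

    def add(self, amount: float) -> None:
        self.total += self.fee(amount)

    @abstractmethod
    def fee(self, amount: float) -> float:
        """Variation, pulled down: each ticket kind prices an amount differently."""

class ScannedTicket(Ticket):
    def fee(self, amount: float) -> float:
        return amount

class ElectronicTicket(Ticket):
    def fee(self, amount: float) -> float:
        return amount + 0.25  # an assumed per-line transmission surcharge

# A new ticket kind is a new subclass; existing callers never change.
t = ElectronicTicket()
t.add(1.75)
print(t.total)  # -> 2.0
```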

I would NOT suggest that the operations Add, Sub, Print, and Clear are
variations. In fact, on your list of items, I'd say that they are quite
common to the concept of a PurchaseTicket (as described above). Certainly,
you could add other types of purchase tickets (say... something that is
electronically transmitted rather than being individually scanned). In that
case, I'd create an interface from PurchaseTicket, and move my code above to
a concrete child of that interface. The calling code would be
none-the-wiser, but I'd be able to create as many different types of
purchase ticket as my customer needs, while limiting the changes to the
fundamental notions of a purchase ticket.


> So, we get
> the following (assuming a non-event model and garbage collection for
> simplicity):
>
> loop
> CashRegister.Interact
> op = CashRegister.Op
> op.DoOp(Total, CashRegister.Price())
>
> However, this is a bit of a kludge. Clear and Print need only one
> parameter, but take 2 in order to conform with the interface. In
> addition, Print gets mutable access to the Total even though it doesn't
> need it, which is unsafe. Furthermore, the analysis up to this point
> was not as clear (or IMO as verifiable) as the previous one.

I'd say: look for what you WANT to encapsulate. Why in the world would you
want to encapsulate this operation at this time? You have stated no
business need for this encapsulation. Certainly, you can encapsulate
operations. In fact, the decorator, command, strategy and
chain-of-responsibility patterns all focus on different approaches to the
problem of encapsulating an operation. However, your comments imply that
you would START there, and I, for the life of me, can't see any reason to do
so.

>
> On the other hand, maybe I was solving the wrong problem. Maybe I need
> to look at the pattern of this program and build the architecture then
> customize it.

Yech.

Use someone else's architecture. Most OO systems have frameworks that they
operate in. Use that. Build only what you need. Abstract only what you
need to abstract.

As far as building your own architecture: YAGNI ("You Ain't Gonna Need It").


> In this case, what I'm trying to build is a type of
> machine that accepts operations, parameters, and can maintain a state
> that is the result of the previous operations. Analyzing it this way,
> we get:

No. You are trying to build a cash register. Your "procedural" example
made no notion of a machine with abstract operations. Why add requirements
the moment you enter the OO world?

>
> User
> Machine
> Params
> OutputState
> Operation
>
> Params are what the Machine returns as data -- they are
> instruction/data pairs (where data can be an additional collection).
> OutputState can store any number of outputs and allows read/write
> access. Otherwise, the semantics are the same:
>
> [Assume output is an instance of OutputState]
> loop
> Machine.Interact
> plist = Machine.Params
> foreach i in plist.Size
> Operation op = Factory.CreateOp(plist[i].Op)
> op.DoOp(plist[i].Data, output)
>

That is the most unreadable bit of code I've seen in a long time. I believe
one of the other posters put up a good quote: "to be reusable, you have to
first be usable." That bit is neither.

You have certainly hit on one of the problems with OO analysis when it
abstracts the wrong things: you can obfuscate the code so wildly as to make
it completely unmaintainable. At that point, you've completely defeated the
purpose of Object Oriented development.

When you look at something like the snip above, your "gut" should say: "this
code smells bad" and you should look for opportunities to refactor it.

> This is the generalized pattern, and by deriving different Operations,
> Factories, OutputsStates, and even Machines we can simulate a wide
> variety of machines -- perhaps even primitive operating systems.

Why would we want to? Was this a requirement of the cash register? Once
again, encapsulate what you need to encapsulate, when you need it, and not
before. OO is a balance. You can go too far (hint: you have).

> We can accomplish things like history lists, screen writes, etc, all with
> the same basic framework, because we solved a general problem.

My code (above) accomplishes the exact same things, and you get the added
benefit of being able to read it.

> Now,
> all future machine-like tasks will consist solely of deriving the
> appropriate classes.
>
> In fact, this whole thing could be made a method of a machine class:
>
> Machine.Run(factory, output)
>
> Then people simply derive from this class, over-ride Interact (and
> anything else they want), and provide the necessary implementations.
>
> Comments?
>

Don't ever write code that I have to maintain. :-)

>
>
>> Design is more abstract than programming, you can design thins that are
>> impossible to translate directly to the code (then you need to
>> "normalize" things before programming it).
>
> Right, which is what I'm seeing from both cases. An analysis of the
> problem simply states relations and some of these relational statements
> can't even be checked by code (take relations involving quantifications
> over infinite sets or involving convenience functions that don't exist
> in the implementation language). What differs here is what we are
> solving. Are we solving the problem at hand, or do we choose to solve
> a generalization one of whose instances is the problem at hand,
> ostensibly for more flexibility?

We solve the problem at hand, using mechanisms that can be generalized WHEN
we need them (and not before).

We don't do anything "ostensibly" for flexibility. We write flexible code
because it is second nature to do so. This is "object thinking".

>
>> > This is why I need to understand the philosophy. I want to understand
>> > how to "think OOD". I don't care about specific design techniques
>> > unless they help illustrate this shift in thinking. What can you tell
>> > me about this? What references (online and printed are fine) can you
>> > point me to? I'd love something that contrasts the two methodologies
>> > and provides examples to drive it home. Something that explains and
>> > justifies OOD from a more philosophical perspective.

I'm going to recommend a very readable book called "Design Patterns
Explained" by Shalloway and Trott. Make sure to get the second edition.
There is an extended section on Commonality Variability Analysis. CVA was
introduced by Jim Coplien but his original work went out of print, so you'll
need to get it second hand (somewhat). The nice thing about this book is
that it is written from the standpoint of an evolution in thinking. The
author describes "aha" moments and how they led to a different approach in
the ways to solve problems.

I hope this helps.

Nick Malik [Microsoft]

Jul 2, 2005, 11:54:29 AM
Hello Robert Martin,

My other response to your post was errant in some ways. I had assumed that,
when using the acronym "DIP" that you meant the "Dependency Injection
Pattern" as described by Martin Fowler. Michael Feathers pointed out that
you were referring to a much older, but related, concept, the "Dependency
Inversion Principle." (same acronym)

So some of my comments like "that came much later" don't apply. I believe
that my fundamental stand is solid, in that OO does not define the right way
to practice it, but I cannot say that my words are all that coherent in
retrospect. Sorry for the confusion.

--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
--

"Robert C. Martin" <uncl...@objectmentor.com> wrote in message
news:0jk7c1tj3hhc2oua6...@4ax.com...

Nick Malik [Microsoft]

Jul 2, 2005, 11:55:19 AM
please ignore... I hit enter too soon.

H. S. Lahman

Jul 2, 2005, 1:06:28 PM
Responding to Martin...

>>First, note that OO development traditionally has three stages:
>
>
> [OOA, OOD, OOP]
>
> Actually, this is not quite as traditional as you might think at
> first.
>
> OOP came first with languages like Simula, Smalltalk, Obj-C, and C++.

Simula had OO-like features, but it was no more an OOPL than JSD was an
OOA/D methodology despite having OOA/D-like features nearly a decade
before Smalltalk. It is true that Smalltalk preceded OOA/D but by the
time Objective-C and C++ got on the scene there were already full OOA/D
methodologies around. [Jacobson claims OOSE's origins in '68, but I only
recall his stuff from the late '70s.]

Those early methodologies borrowed liberally from the distinctions
between analysis, design, and implementation in structured programming.
What OOA/D/P brought to the table was a more useful and less fuzzy set
of definitions (functional vs. nonfunctional requirements, customer vs.
computing views, strategic vs. tactical). IOW, the OO paradigm provided
a better package for bridging the gap between customer spaces and
computing spaces.

> Nobody knows exactly (or even inexactly) what OOA is. There are a number of
> books and papers written about it, but they don't agree. There is not
> even a set of cogent schools of thought. OOA is a term that we bandy
> about with authority, but have no real definition for.

You keep repeating this mantra but it is still untrue. The OOA books
only differ in details -- just like books about OOP. Is the definition
of Crystal exactly the same as XP? Is your view of dependency
management exactly the same as Fowler's? The OOA/D books I have on my
bookshelf differ substantially in detail but they are agreed about the
distinctions I originally provided. You are the only person I know who
has written an OOD book and claims not to know what OOA is at that level
of distinction.

>>requirements -> OOA model -> OOD model -> OOP model -> executable
>
>
> Remarkably, this thought was formalized in a very famous paper written
> in 1970 by Dr. Winston Royce, entitled "Managing the Development of
> Large Software Systems". This paper is often called "The Father of
> the Waterfall". The remarkable part is that the paper firmly
> denounces the practice in favor of an approach in which A, D, and P
> are done iteratively.
>
> This iterative transformation means that we go from A to D to P in a
> matter of minutes, repeating the process many times per day,
> delivering working software every week.

Where is there anything in my descriptions here and elsewhere that
precludes iterative development? Have I not asserted on several other
occasions that IID is routinely practiced with this development model at
any scale?

This is just a forensic ploy to associate OOA/D with BDUF (as defined by
XP) by innuendo. This is like trying to deal with Topmind.

topmind

Jul 2, 2005, 1:55:05 PM

In my domain one often cannot know ahead of time what will change. In
physics, chemistry, etc., one may be able to determine such because God
does not change the laws of physics very often, but not in most of the
business and intellectual property domains where the rules are set by
(seemingly) capricious managers, marketers, owners, and lawmakers.
Interfaces need tweaking as often as implementation.

I too seek techniques that are change-friendly; but OO does not appear
to fit that bill. Maybe if one sticks in enough indirection it might,
but then you are battling layers and layers of interfaces that are
uglier than being closer to the implementation would be.

-T-

Laurent Bossavit

Jul 2, 2005, 4:39:15 PM
Top,

> In physics, chemistry, etc., one may be able to determine such because God
> does not change the laws of physics very often,

The latest issue of Scientific American has an article by John Barrow
and John Webb. It suggests that the fine structure "constant" - actually
a ratio involving other "constants" such as the speed of light - changed
over (cosmological) time.

> Interfaces need tweaking as often as implementation.

Not *exactly* as often. There are different rates of change. A whole
spectrum of them, from "changes all the goddam time" to "changes rather
infrequently". (With the fine structure constant at the latter end,
perhaps...)

Capturing different rates of change in the structure of programs is a
win. Config files, data tables, abstract data types, configuration
management, metadata, physical distribution of processors - those are
all tactics for expressing differently things that have different rates
of change.
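One of those tactics in miniature (my own example, not Laurent's): moving a frequently-changing value into configuration data, so the rarely-changing computation never needs editing when the value changes.

```python
# Frequently changing: configuration data (in practice a config file
# or data table; a dict stands in for it here).
CONFIG = {"tax_rate": 0.08}

# Rarely changing: the computation itself.
def price_with_tax(price: float, config=CONFIG) -> float:
    return round(price * (1 + config["tax_rate"]), 2)

print(price_with_tax(100.0))  # -> 108.0
```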

Laurent

Alan Gauld

Jul 3, 2005, 3:28:14 AM

<xmp...@yahoo.com> wrote in message
news:1120058689.9...@g43g2000cwa.googlegroups.com...
> Hi,

> bigger picture, how to "think in OOD". I hope an example will
> help:
>
> Let's say I want a program that given A produces B. I have 2
> ways of analyzing this.

You have in effect already analyzed it because you have described
it in functional terms: A translation of 'A' into 'B'.

Try describing a word processor program in those same terms.
I have a word processor that when given <what?> produces <what?>
I guess it could be 'keystrokes' and 'files'
or maybe 'words','sentences',paragraphs','pages' and
'documents', 'letters', 'books', etc.

How you describe the problem will have a big influence on how
you analyze it and whether OO seems natural or not.

There are some problems which naturally lead to non-OO programs
because they are very simple functions. The more complex the
problem, the more likely that an OO solution will feel more
natural.

> I can either formally specify how B relates to A, or I

And of course B and A could be objects...

> can use the OOD approach in which I identify nouns,
> relationships, arity, patterns, etc...

It's one approach to identifying objects and their
responsibilities, but it's not the only one, and many think it's
not a great one...

> The first approach is much more direct and is very reliable, so
> why would I use the OOD approach?

Define reliable. Back to the word processor, how do you formally
define the behaviour of a word processor directly and reliably?
It is possible to build procedural word processors of course - and
some of the best were built that way - but I'd debate whether it
is any more straightforward than an OO approach.

> Or am I comparing apples to oranges? Is OOD for designing
> applications, or is it for designing an application pattern
> (framework?

It can do both, but so can functional decomposition.

> "architecture" as the primary purpose and then customize it to
> provide the "program" as the secondary purpose?

You need architecture at some level regardless of the design
approach.
They are simply alternatives.

> and provides examples to drive it home. Something that explains
> and justifies OOD from a more philosophical perspective.

Have you tried reading Grady Booch's classic on OOAD? The first
section of that puts OO in a programming context.

To pick up another point you made in another post in this thread:
OOD is not based around data. It should be based around behaviour;
the data is only there to support the behaviour. There are
data-oriented design techniques too, but they are no more like OOD
than functional decomposition is. When doing OOD you are looking
for concepts which exhibit key behaviour in your system. Those
conceptual things may well require some data to perform their
function(sic) within the system, but the data is supportive of the
behaviour (which is why it is hidden inside the interface).

There are lots of conceptual ways of looking at OO that have been
suggested over the years. Which ones will work for you is hard to
say, but they include things like "Actors on a stage", "Ants in a
colony", "Independent parallel processes", and so on. They all try
to capture the idea of independent objects, each with their own
roles and responsibilities within the system, communicating by
sending messages to each other. If you can get that concept in
mind rather than a bunch of function calls it might help.
Maybe... :-)


HTH,

Alan G.


krasicki

Jul 4, 2005, 2:11:35 AM
Michael Feathers wrote:
> krasicki wrote:
>
> > Not all Computer Scientists are successful with design objects. To be
> > successful, the individual using them must be able to think in very
> > abstract ways. Not everyone can and even when they can, corporate
> > policy and practice often make the effort impossible.
>
> I agree that it takes work, but I don't think that deep abstraction is
> involved.

I don't think deep abstraction is necessary either but the ability to
juggle, weigh, and formalize multiple, sometimes conflicting complex
ideas is. That's not something everyone can do and, quite frankly,
it's obvious.

> To me, object design is more like concretization. The steps
> are: 1)think of thing that can solve a problem for you, 2) think of a
> way to ask it to solve the problem 3) go inside the thing and solve
> the problem.

That's certainly one way to look at it but in the world of commercial
development it is closer to:

The company has bought a product or is orthodox about a methodology, or
is managed by raving idiots who have very short tempers and zero
patience. If you need the money you play the game.

The game is to design something that is reasonably functional under the
circumstances knowing full well that you stand a snowball's chance in
hell of running the gauntlet.

Design amounts to ensuring that the thing accurately processes the data
the client is responsible for. This is the minimum design criterion any
software engineer needs to satisfy.

The *problem* then becomes trying to jump through the hoops that are
usually political and sometimes insurmountable. Here, no design
methodology or tool exists - except maybe dance lessons.

I must add that on rare occasions you have the opportunity to start and
proceed with a blank slate but it is rare indeed.

> It's just a little different from the procedural mindset
> which is: 1) think of a way to solve the problem 3) solve the problem.

Well, the difference is really the difference in asking the easiest way
to get from here to there vs. asking how best to create discrete and
logically cohesive application framework components. And in the case
of the latter there is no one set *right* answer. Two children's
graphic applications may function identically yet be designed wholly
differently. This is not always obvious to the novice or even the
initiated.

And.

In some shops you aren't allowed to even think that there is another
way.

> The abstraction in OO is really all about thinking about a way to ask
> for a solution rather than leaping into a solution.

Sorry. That does not ring true.

>
> > This frustrating truth is largely responsible for the Extreme
> > *whathaveyou-usually-Programming* phenomenons. With a straight face,
> > these proponents assert that if design is not egalitarian and if
> > companies don't respect it then -snip, snip- out with it except for
> > perfunctory lip-service.
>
> How many XP teams have you worked with?

Why do you ask? The methodology is well-documented. It is considered
a lightweight methodology, is it not? It cannot be lightweight unless
it is lighter somewhere. What is lighter? The sum of programming is the
same, correct?

>
> > One cannot glibly 'think' in OOD, there isn't any such thing. OOD is
> > very hard work, time-consuming, expensive and easy to derail (just have
> > bottom-up activity happening in the background that pre-empts the
> > designers).
>
> It is like anything else. Hard when you start, but easier when you
> acclimate to it.

No. It is not a way of thinking because when you think about it it
makes no sense. A virtual rock in cyberspace can have methods attached
to it. A real-life rock can't and doesn't. In cyberspace there is
only the formal, there is no imagination that can override the
programmatic reality whimsically.

> I "think in OO" but I've been doing it for a long time.

Let's say you think you think in OO. When you're writing software you
formalize your ideas into OO patterns.

cheers.

Michael Feathers

Jul 4, 2005, 10:32:17 AM
krasicki wrote:
> Michael Feathers wrote:
>
>>krasicki wrote:
>>
>>
>>>Not all Computer Scientists are successful with design objects. To be
>>>successful, the individual using them must be able to think in very
>>>abstract ways. Not everyone can and even when they can, corporate
>>>policy and practice often make the effort impossible.
>>
>>I agree that it takes work, but I don't think that deep abstraction is
>>involved.
>
>
> I don't think deep abstraction is necessary either but the ability to
> juggle, weigh, and formalize multiple, sometimes conflicting complex
> ideas is. That's not something everyone can do and, quite frankly,
> it's obvious.

Agreed. Not everyone is cut out to be a software developer.


> Well, the difference is really the difference in asking the easiest way
> to get from here to there vs. asking how best to create discrete and
> logically cohesive application framework components. And in the case
> of the latter there is no one set *right* answer. Two children's
> graphic applications may function identically yet be designed wholly
> differently. This is not always obvious to the novice or even the
> initiated.

I agree. There is no one right structuring for a particular problem
and, you know, we should be very thankful for that, because it means our
jobs are easier (believe it or not). If there were only one "right"
structuring for any given problem, software development just wouldn't
happen; it would be cost prohibitive.

So, once we get past the idea that there should be one "right"
structuring, we are left with the issue of whether some structurings are
better than others. And, some definitely are. It can be hard to come
up with a great design out of the box, so the next best thing is to
learn criteria for class design and grade designs against them as you
develop. If you find that a design falls short on some criterion,
try something different and see whether you can meet all the
criteria at once.

A good set of criteria are the five class design principles we describe
on our website: Single Responsibility, Open/Closed, Liskov Substitution,
Interface Segregation, and Dependency Inversion. As you grade designs
on adherence to those principles, and look for alternatives when the
designs veer, the designs get progressively better; they end up being
more robust in the face of change. It's not magic. There is judgement
involved. But it does not require deep abstraction ability to formulate
designs this way. Weighing alternatives? Yes, people do need that
ability to be able to design.
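To make the grading concrete, here is a minimal Python sketch (my illustration, not from the post; all names are invented) of one of those principles, Open/Closed: the first function must be edited for every new shape, while the polymorphic version is closed for modification and open for extension.

```python
from abc import ABC, abstractmethod

# Violates Open/Closed: adding a new shape forces an edit to this function.
def total_area_v1(shapes):
    total = 0.0
    for kind, dims in shapes:
        if kind == "square":
            total += dims[0] ** 2
        elif kind == "circle":
            total += 3.14159 * dims[0] ** 2
    return total

# Closer to Open/Closed: new shapes extend the hierarchy instead.
class Shape(ABC):
    @abstractmethod
    def area(self) -> float: ...

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2

def total_area(shapes):
    # Closed for modification, open for extension via new Shape subclasses.
    return sum(s.area() for s in shapes)
```

Grading against the principle is then a matter of asking: when a new shape arrives, does existing code change (v1) or only grow (v2)?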


Michael Feathers
author, Working Effectively with Legacy Code (Prentice Hall)
www.objectmentor.com

Nick Malik [Microsoft]

Jul 4, 2005, 11:33:22 AM
"krasicki" <Kras...@gmail.com> wrote in message
news:1120457495.1...@g43g2000cwa.googlegroups.com...

>> > This frustrating truth is largely responsible for the Extreme
>> > *whathaveyou-usually-Programming* phenomenons. With a straight face,
>> > these proponents assert that if design is not egalitarian and if
>> > companies don't respect it then -snip, snip- out with it except for
>> > perfunctory lip-service.
>>
>> How many XP teams have you worked with?
>
> Why do you ask? The methodology is well-documented. It is considered
> a lightweight methodology is it not. It cannot be lightweight unless
> it is lighter somewhere. What is lighter? The sum programming is the
> same, correct?
>

It is an agile methodology because it allows for change to occur. The WAY
in which you do design can enable the team to respond to change, or can
create artificial barriers to change. If change is normal, then creating
artificial barriers is "swimming upstream." This creates cost and produces
difficult choices. XP, Scrum and other agile methods attempt to address the
underlying problem by changing the way in which software is produced,
thereby removing the artificial barriers to change.

Is this lighter? I don't think so. XP has much more rigor than waterfall.
The practices require training and reinforcement. Most agile processes are
lighter on "ceremony" but not on rigorousness.

Even Scrum, which is my area of practice, requires training for all members.
I've worked on a project that used FDD for requirements, Scrum for
management, and TSP/PSP for software construction. (In that case, the goal
of TSP/PSP was to create better estimates for the feature stories that were
managed by the scrum in a sprint.) Our design document grew to ~80 pages
long, and was kept up to date by the dev team (something often forgotten in
traditional waterfall projects). Our use case document was of similar size.
There were no short cuts.

We delivered the functionality that the customer desired, when they desired
it. We worked normal working hours most of the time. Our code was
thoroughly tested by a professional and independently managed test team and
our triage process, while rapid, was just as rigorous as most waterfall
projects. We cut the cost of development by an estimated 40% and delivered
four full iterations to production in a 9 month window.

Where did the savings come from? That's easy. The IEEE reported in 2001
that surveys of users have shown that nearly 50% of all features in a BDUF
(Big Design Up Front) commercial software project are not used by the users.
HALF. Working in IT, I can say that the number is probably closer to 35%
for custom software, but that number is still huge. 35% of the features is
35% of the time spent writing code, 35% of the time spent testing code, 35%
of the time spent planning and delivering and documenting. It's a LOT of
work for features that no one uses.

By using Feature Driven Development during planning, so that the customer
knows the actual cost of each feature, and by involving the customer at
every step and demonstrating the features on each sprint, the customer
chooses the features to develop, and helps to guide each feature until it is
actually valuable. That cuts about 35% of the extra effort out of every
iteration. That's our savings.

These methodologies are well documented, but so is Object Oriented design...
yet here we are, offering ongoing mentoring on object oriented development.
No one really learns an idea by hearing a self-professed expert, especially
an author in a book, stating a series of facts and assumptions. I believe
that people learn through their fingertips and their mistakes and their
personal moments of revelation and reflection. If you get a chance to work
on an agile team, whether XP or Scrum or Crystal (etc), you may find that
you'll pick up a bit more understanding of what Agile means in practice.

--
--- Nick Malik [Microsoft]

krasicki

Jul 4, 2005, 2:03:01 PM
Hi Nick. In my embedded comments (and this will apply to everything we
correspond about) I will sometimes criticize Microsoft in general, and I
want it made clear that all such comments have zero to do with you,
because I truly enjoy your responses. So take a deep breath if and
when I give Microsoft heartburn ;-)

Nick Malik [Microsoft] wrote:
> "krasicki" <Kras...@gmail.com> wrote in message
> news:1120457495.1...@g43g2000cwa.googlegroups.com...
> >> > This frustrating truth is largely responsible for the Extreme
> >> > *whathaveyou-usually-Programming* phenomenons. With a straight face,
> >> > these proponents assert that if design is not egalitarian and if
> >> > companies don't respect it then -snip, snip- out with it except for
> >> > perfunctory lip-service.
> >>
> >> How many XP teams have you worked with?
> >
> > Why do you ask? The methodology is well-documented. It is considered
> > a lightweight methodology is it not. It cannot be lightweight unless
> > it is lighter somewhere. What is lighter? The sum programming is the
> > same, correct?
> >
>
> It is an agile methodology because it allows for change to occur.

It is agile because the term 'extreme' has been so over-exposed that the
audience for this stuff evaporated. So, to be honest, it is called
agile to remove the tarnish of the extreme labeling AND to emphasize
peppiness rather than dwell on the ever-present shortcomings of the
pseudo-methodology.

Numerous OOD methodologies handle change much better than the agile
proponents would have you believe. The agile practices are no more
adept at change than anything else. What agile sells as response to
change is really immediate gratification. IOW, because everything is
held close to the actual coding, every time a marketing rep sneezes the
code can reflect pneumonia.

This is not responsible change management. This results in chaos and I
have seen it in profoundly big companies whose platitudes, awards, and
self-serving hype disguise the fact that beneath the covers employees
were agile enough to artificially meet deadlines to cash in on bonuses
while the quality of the software remains dubious to this day. Two
weeks ago I came across multiple instances of Y2K errors introduced to
code after the year 2000.

Agile to me means slippery and dodgy - I don't like it and it is
unacceptable in mission-critical scenarios. Today many corporations
have had such fun laying off and off-shoring applications that no one
seems to remember that if those applications collapse or function
incorrectly there will be unpleasant consequences.

> The WAY
> in which you do design can enable the team to respond to change, or can
> create artificial barriers to change. If change is normal, then creating
> artificial barriers is "swimming upstream." This creates cost and produces
> difficult choices.

Nick, I find this argument convenient but uncompelling. Design is
about many things. It is about playing thought experiments on systems
and sub-systems to optimize and analyze the choices in the field of
possibility. The design techniques give us a shorthand to try ideas
before a line of code is written, and in some cases they can automatically
generate that code.

It is also a blueprint of the system so that architects who have to
integrate one system into another when a corporate buyout occurs can
make sense of two systems.

It is also an inventory of parts, authorizations, licences, and so on.

It is also an audit trail of how the system is supposed to work so that
malicious behavior can be identified and isolated.

It is expensive to do but it is expensive not to do. Change management
is easier and more certain in systems employing front-end design
effectively.


> XP, Scrum and other agile methods attempt to address the
> underlying problem by changing the way in which software is produced,
> thereby removing the artificial barriers to change.

Well, if the only issue was producing code you'd have a winner. Code
has never been more than, say, 10% of system-building time and even
less of the cost, although hardware is obscenely cheap these days so that
cost may differ. Software is becoming increasingly expensive as the
short-sighted cost cutting of design and documentation takes its toll,
and as the cacophony of bad software development ideas, frameworks,
and techno-babble takes its own.

>
> Is this lighter? I don't think so. XP has much more rigor than waterfall.

XP has more waterfalls than waterfall ever had. More rapids, typhoons,
and sharks as well. I would be astonished if you could compare the
biggest XP project's rigor with the smallest government development
project for any military sensitive project. The feasibility study
alone would trump your entire XP enterprise.

XP thrives in the loosey-goosey, let's-pretend playscape of corporate
America, where petty empires are built around inefficient applications
that ask for a piece of data that's retrieved from a database.

It's very hard to screw that up but that screwing happens so often that
even XP seems a relief to these environments. It is incompetence and
not the methodology that is to blame here.

> The practices require training and reinforcement. Most agile processes are
> lighter on "ceremony" but not on rigorousness.
>
> Even Scrum, which is my area of practice, requires training for all members.
> I've worked on a project that used FDD for requirements, Scrum for
> management, and TSP/PSP for software construction. (In that case, the goal
> of TSP/PSP was to create better estimates for the feature stories that were
> managed by the scrum in a sprint.) Our design document grew to ~80 pages
> long, and was kept up to date by the dev team (something often forgotten in
> traditional waterfall projects). Our use case document was of similar size.
> There were no short cuts.

This should be a clue that change management is not a function of
rigorous design practice.

>
> We delivered the functionality that the customer desired, when they desired
> it. We worked normal working hours most of the time. Our code was
> thoroughly tested by a professional and independently managed test team and
> our triage process, while rapid, was just as rigorous as most waterfall
> projects. We cut the cost of development by an estimated 40% and delivered
> four full iterations to production in a 9 month window.

Nine months is a long time these days, depending on the application of
course - a luxury that most commercial companies will not tolerate.
Three-month business cycles dictate development cycles and castrate any
creative processes that design entails.

>
> Where did the savings come from? That's easy. The IEEE reported in 2001
> that surveys of users have shown that nearly 50% of all features in a BDUF
> (Big Design Up Front) commercial software project are not used by the users.

IEEE should know that design is not an end-user deliverable. And IEEE
should ask where the requests for unused features come from. I once
had a teacher who posed the eternal student question, "How long should
my report be!?" He replied, "Long enough to cover the subject but
short enough to be interesting." It seems there's a corollary there
for software development.

Software engineers *have to* cover the subject yet make the application
easy enough to use.

On a personal note, I run Windows. There are libraries of documented
features I don't use. Send Bill an e-mail complaining that his product
could be cheaper without a browser and media player and whatnot.

Bill put WordPerfect under with esoteric features three people needed
and millions didn't.

> HALF. Working in IT, I can say that the number is probably closer to 35%
> for custom software, but that number is still huge. 35% of the features is
> 35% of the time spent writing code, 35% of the time spent testing code, 35%
> of the time spent planning and delivering and documenting. It's a LOT of
> work for features that no one uses.

SOMEone uses them or they wouldn't be requested (unless you're talking
about perfunctory stuff that needs to be added so that Microsoft will
brand the application compatible).

>
> By using Feature Driven Development during planning, so that the customer
> knows that actual cost of each feature, and by involving the customer at
> every step and demonstrating the features on each sprint, the customer
> chooses the features to develop, and helps to guide each feature until it is
> actually valuable. That cuts about 35% of the extra effort out of every
> iteration. That's our savings.

Maybe. But up-front design eliminates asking this question iteratively
for each sprint. That's my additional savings (we're price choppers,
we are).


>
> These methodologies are well documented, but so is Object Oriented design...
> yet here we are, offering ongoing mentoring on object oriented development.
> No one really learns an idea by hearing a self-professed expert, especially
> an author in a book, stating a series of facts and assumptions. I believe
> that people learn through their fingertips and their mistakes and their
> personal moments of revelation and reflection. If you get a chance to work
> on an agile team, whether XP or Scrum or Crystal (etc), you may find that
> you'll pick up a bit more understanding of what Agile means in practice.
>

No doubt. Thanks for the dialogue - people do learn here as well.

cheers,

Frank

Nick Malik [Microsoft]

Jul 4, 2005, 4:10:22 PM
Wow. um... tell me what you _really_ think. :-)

Clearly, I'm not going to convince you, on a newsgroup, that Agile methods
are a good practice.

Note: I did not say that "Agile" is incompatible with "design." I believe
it is incompatible with "Big Design." I hope to have made it clear that I
believe that you can (and should) perform MDA on an agile project. I've
done it. I've seen it done. It appears that you've been told that agile
methods leave no room for design. My guess is you heard that from a
self-proclaimed XP evangelist. I rank them a notch below most TV Shopping
Channel pitchmen in their respect for science, impartiality, or fundamental
integrity.

I do hope, however, that you and I can encourage our colleagues and friends
to keep an open mind and learn from each other. There are opportunities for
our industry to explore ways to reduce the costs and headaches of software
development. I believe that each of the "competing" mechanisms has a
glimmer of a better way. I'm sure that somewhere between our desire to make
projects run smoothly, and our desire to give the users the features that
they want to pay for, we will synthesize a process that may last more than
the duration of the average teenage clothing fad.

With great regard,

Michael Feathers

Jul 5, 2005, 8:09:02 AM
krasicki wrote:
> It is agile because the term 'extreme' has been so over-exposed that the
> audience for this stuff evaporated. So, to be honest, it is called
> agile to remove the tarnish of the extreme labeling AND to emphasize
> peppiness rather than dwell on the ever-present shortcomings of the
> pseudo-methodology.

Actually, it is agile because the word 'extreme' was a roadblock in some
organizations. What we've been finding recently is that there are
many companies going agile who pick some agile method like Scrum and
then start using XP practices to fill it in. There has actually been a
net push towards XP practices over the past five years. Their worth
has clearly been recognized independently of their original packaging.

> This is not responsible change management. This results in chaos and I
> have seen it in profoundly big companies whose platitudes, awards, and
> self-serving hype disguise the fact that beneath the covers employees
> were agile enough to artificially meet deadlines to cash in on bonuses
> while the quality of the software remains dubious to this day. Two
> weeks ago I came across multiple instances of Y2K errors introduced to
> code after the year 2000.
>
> Agile to me means slippery and dodgy - I don't like it and it is
> unacceptable in mission-critical scenarios. Today many corporations
> have had such fun laying off and off-shoring applications that no one
> seems to remember that if those applications collapse or function
> incorrectly there will be unpleasant consequences.

Baloney. Agile and Iterative methods are simply what many very good
software developers have been doing behind the scenes for years. Gerald
Weinberg is on record saying that Project Mercury at NASA was developed
using a process that was pretty much indistinguishable from XP.

krasicki

Jul 5, 2005, 10:01:00 AM
Michael Feathers wrote:
> krasicki wrote:
> > It is agile because the term 'extreme' has been so over-exposed that the
> > audience for this stuff evaporated. So, to be honest, it is called
> > agile to remove the tarnish of the extreme labeling AND to emphasize
> > peppiness rather than dwell on the ever-present shortcomings of the
> > pseudo-methodology.
>
> Actually, it is agile because the word 'extreme' was a roadblock in some
> organizations. What we've been finding recently is that there are
> many companies going agile who pick some agile method like Scrum and
> then start using XP practices to fill it in. There has actually been a
> net push towards XP practices over the past five years. Their worth
> has clearly been recognized independently of their original packaging.

Given that *XP practices* are largely rebranded existing good practice
I will assert that no such push exists. I submit to you that XP
advocates are simply looking outside their shell and discovering that
good programming practices existed despite their claims. But failing
to acknowledge that, XP advocates, as usual, run ahead of the parade
claiming credit for the celebration.

>
> > This is not responsible change management. This results in chaos and I
> > have seen it in profoundly big companies whose platitudes, awards, and
> > self-serving hype disguise the fact that beneath the covers employees
> > were agile enough to artificially meet deadlines to cash in on bonuses
> > while the quality of the software remains dubious to this day. Two
> > weeks ago I came across multiple instances of Y2K errors introduced to
> > code after the year 2000.
> >
> > Agile to me means slippery and dodgy - I don't like it and it is
> > unacceptable in mission-critical scenarios. Today many corporations
> > have had such fun laying off and off-shoring applications that no one
> > seems to remember that if those applications collapse or function
> > incorrectly there will be unpleasant consequences.
>
> Baloney. Agile and Iterative methods are simply what many very good
> software developers have been doing behind the scenes for years. Gerald
> Weinberg is on record saying that Project Mercury at NASA was developed
> using a process that was pretty much indistinguishable from XP.
>

Skunkworks projects are a time-honored tradition in enterprises chock
full of talented people. XP is not skunkworks.

You and other proponents are selling XP -er- *agile* as a generic
methodology that gets better results than anything else (the boogeyman
here being waterfall).

Project Mercury would have never gotten off the ground if it had been
developed in a financial services, insurance, banking, or commercial
enterprise using *agile* methodologies.

I particularly find these bait and switch analogies to be maliciously
misleading.

I have worked in places like DEC, Raytheon, Ham Standard and other
companies that oozed talented engineers. And in those places, the
quality of people could make anything work using any methodology,
assuming the dullards get out of the way.

I've also worked in less discerning places where village idiots can
accuse talented people of being incompetent and get away with it. And
in those places, 'up' may be their 'down' and everything is capricious
to whoever happens to be the audience that day.

Let's not sell kool-aid here. There are plenty of places wallowing in
their own fecal ideas that will get on these newsgroups and testify how
good it feels - come join us. Be careful not to sound like them.

I work at very high levels and I wallow in the trenches to stay
connected to what I love doing. Today the quality of software is worse
than I have ever seen it in my twenty-five years in this business. You
have a hard sell telling me bottom up methodologies make sense in that
light.

Robert C. Martin

Jul 5, 2005, 10:51:14 AM
On Sat, 2 Jul 2005 08:48:23 -0700, "Nick Malik [Microsoft]"
<nick...@hotmail.nospam.com> wrote:

>Really, the goal is not so much to reuse things as to separate the things
>that change at different times, to make them easier to change. We start with
>the limitations of people and create languages that those people can use.
>If you watch the evolution away from OO and towards things like AOP and
>lightweight frameworks, it is part of an ongoing process towards the
>separation of "things that change rarely" from "things that change
>frequently."

This is very well stated. I often say that OO is a technique for
managing dependencies; but often neglect to say what the management
criteria are. Separating things that change at different times is one
of the most important of those criteria. Indeed, I have written two
different principles about this topic: The SRP (Single Responsibility
Principle) which says that every class should have one, and only one,
reason to change, is a principle that expresses this idea in the
small. The CCP (Common Closure Principle) which says that classes
that change together should be grouped into packages together,
expresses this idea in the large.

http://www.objectmentor.com/resources/articles/srp
http://www.objectmentor.com/resources/articles/granularity.pdf
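A minimal Python sketch of the SRP idea (my illustration, with invented names, not from the articles above): a class that both computes pay and persists itself has two reasons to change, so the responsibilities get split.

```python
# Before: this class mixes pay policy with persistence, so it has two
# reasons to change - the pay rules and the storage format.
class EmployeeV1:
    def __init__(self, name, hourly_rate):
        self.name, self.hourly_rate = name, hourly_rate

    def pay_for(self, hours):
        return self.hourly_rate * hours

    def save(self, db):  # second responsibility
        db[self.name] = self.hourly_rate

# After: each class now has one, and only one, reason to change.
class Employee:
    def __init__(self, name, hourly_rate):
        self.name, self.hourly_rate = name, hourly_rate

    def pay_for(self, hours):
        return self.hourly_rate * hours

class EmployeeRepository:
    """Owns persistence; pay-policy changes never touch this class."""
    def __init__(self, db):
        self.db = db

    def save(self, employee):
        self.db[employee.name] = employee.hourly_rate
```

The CCP then says the same thing at package scale: Employee and its pay-policy collaborators would be grouped together, away from the persistence classes.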

Robert C. Martin

Jul 5, 2005, 10:56:12 AM
On 2 Jul 2005 10:55:05 -0700, "topmind" <top...@technologist.com>
wrote:

>In my domain one often cannot know ahead of time what will change.

It's not so much a matter of knowing what will change, or even how it
will change. It's a matter of recognizing that certain things will
change at a different rate than others. For example, report formats
will change at a different rate than business rules. GUI layout
will change at a different rate than database schemas.

You don't even have to know whether one will change more frequently
than the other. You just have to be able to make a reasoned guess
that they will change at different rates and for different reasons.

We try not to couple report formats to business rules because it would
be a shame to inadvertently break the business rules by moving a
column on a report. We try to decouple the GUI layout from the
database schema because it would be a shame to crash the GUI when
adding a new column to the database.
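A tiny Python sketch of that separation (my illustration, invented names): the business rule knows nothing about layout, so moving a column on a report changes only a formatter, never the rule.

```python
# Business rule: compute an invoice total. It knows nothing about layout.
def invoice_total(line_items):
    return sum(qty * price for qty, price in line_items)

# Formatting lives behind its own seam. Reformatting the report means
# editing (or adding) a formatter; invoice_total is never touched.
def plain_report(line_items):
    rows = [f"{qty} x {price}" for qty, price in line_items]
    return "\n".join(rows + [f"TOTAL {invoice_total(line_items)}"])

def csv_report(line_items):
    rows = [f"{qty},{price}" for qty, price in line_items]
    return "\n".join(rows + [f"TOTAL,{invoice_total(line_items)}"])
```

The dependency runs one way only: formatters call the rule, the rule never calls a formatter.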

Robert C. Martin

Jul 5, 2005, 11:22:37 AM
On Sat, 2 Jul 2005 07:16:46 -0700, "Nick Malik [Microsoft]"
<nick...@hotmail.nospam.com> wrote:

>"Robert C. Martin" <uncl...@objectmentor.com> wrote in message
>news:0jk7c1tj3hhc2oua6...@4ax.com...

>>Nick asked:


>
>>>OK, it is possible to invert the physical dependencies between sets of
>>>classes (modules) by moving entities between them ...we should at least
>>>agree here......are you claiming that this is the key characteristic
>>>value of OO?
>>
>> Yes.
>
>Are you sure you haven't mixed OOD with AOP?

Yes.

>I would agree with you if your
>statement had been "a key characteristic of AOP is the inversion of control
>and the injection of dependencies."

There is a certain similarity between IOC, DIJ, and AOP.
Interestingly enough IOC and DIJ are typically implemented with OO.
It is the dependency inversion capability of OO that enables IOC and
DIJ. AOP enables IOC and DIJ through the weaving mechanism which is
also a kind of dependency inversion approach. The difference is that
AOP weaves the callback code into the main line rather than jumping
through vectors.

>This is an innovation on top of OO.

That it is innovation I will grant you. However, AOP and OO are
pretty different. You can write AO programs that are not at all OO.
Indeed, you could write AO compilers for languages that were not OO.
The whole idea of constructing a program through the weaving of many
different aspects is quite different from the ideas behind OO.

>It
>is true that OO enabled it.

AOP? I don't think so. It might have inspired it in some way, but
I'm not clear on the history.

>>>so I can vary the OO'ness of the above code, not by changing the code,
>>>but by how I allocate each entity to a module?!?!?
>>
>> Certainly. If the allocation of classes to modules does not
>> effectively decouple those modules, then you don't have an OO
>> solution.
>
>I would call this "definition parsing." You do have an OO solution in that
>it uses OO mechanisms to accomplish its goal. You may or may not have a
>"good" OO solution, with the subjective being the key thing I'm pointing
>out. The orientation towards objects is all that is required to be OO. Not
>the injection of dependencies. That came much later.

Defining "Object Oriented" as "Oriented around Objects" is a bit
circular. It also begs the questions: "What are objects, and what
does it mean to be oriented around them?"

As for what came first and what came later, I'm not sure it matters as
far as a definition of OO is concerned, I will grant you that people
were creating data structures and functions that manipulated those
data structures, long before the term OO was coined. However, when
Alan Kay coined the term OO, it was in the context of a language that
put polymorphism at its core, to the extent that even 'true' and
'false' were two different polymorphic objects with different
implementations.

Above you talked about "good OO solutions" and "bad OO solutions". By
this I take it that you define an OO program to be any program that
uses the trappings of OO. e.g. an OO programming language. I take a
very different view. I define OO as a set of principles that direct
design decisions. A program is OO to the extent that it embodies
these principles, irrespective of the language in which it is written.
I can, for example, write an OO program in C. It's hard, tedious, and
error prone, but I can do it.
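A minimal sketch of one such principle in that spirit (my illustration, with invented names): Dependency Inversion, where the high-level policy owns an abstraction and the low-level detail implements it, so the source-code dependency points toward the policy.

```python
from abc import ABC, abstractmethod

class MessageSink(ABC):
    """Abstraction owned by the high-level policy."""
    @abstractmethod
    def send(self, text): ...

class Notifier:
    """High-level policy: depends only on MessageSink, never on a
    concrete transport, so details depend on policy and not vice versa."""
    def __init__(self, sink: MessageSink):
        self.sink = sink

    def alert(self, event):
        self.sink.send(f"ALERT: {event}")

class ConsoleSink(MessageSink):
    """Low-level detail: implements the policy's interface."""
    def __init__(self):
        self.log = []

    def send(self, text):
        self.log.append(text)
```

By this criterion the program is OO regardless of language trappings: the same inversion can be done in C with structs of function pointers, just more tediously.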

>In fact, if you look up the Wikipedia definition of Aspect Oriented
>Programming, you will see that the definition's author considers AOP to be
>"not OO" but in fact a successor to OO development.

I'd call it a distant cousin.


>
>On the other hand, NMock and Reflection *does* naturally lead folks to AOP.

I don't follow that at all?

>I find these innovations in OOP to be much more of an indicator of forward
>movement towards AOP than the fundamental OO concepts of inheritance and
>polymorphism.

I also don't necessarily consider AOP to be "forward". Maybe it is,
or maybe it's sideways. It could even be backwards.

Robert C. Martin

Jul 5, 2005, 11:33:22 AM
On Sat, 02 Jul 2005 17:06:28 GMT, "H. S. Lahman"
<h.la...@verizon.net> wrote:

>> Nobody knows exactly (or even inexactly) what OOA is. There are a number of
>> books and papers written about it, but they don't agree. There is not
>> even a set of cogent schools of thought. OOA is a term that we bandy
>> about with authority, but have no real definition for.
>
>You keep repeating this mantra but it is still untrue.

We disagree. Indeed, I don't think there is an agreed definition of
analysis, let alone object oriented analysis. I have sat in too many
meetings (at a company that you and I know well) that devolved into
30-minute arguments about whether a certain technical topic was
"analysis" or "design".

"We shouldn't be talking about that now, it's a design concept."
"No it's not, it's part of the problem space. It's analysis."
"It is not, it's too low level."
"No, it's critical. We have to decide it now."
"No, it can wait, it's just too low level to worry about now."
...

Nick Malik [Microsoft]

Jul 5, 2005, 11:47:51 AM
Hello Robert,

just two snips (one compliment, one clarification):

That was an eloquent and clear statement. I'm going to take some time to
really let that one settle in my brain. There is something fundamentally
appealing about what you said. I appreciate that you shared that with me.


>>
>>On the other hand, NMock and Reflection *does* naturally lead folks to
>>AOP.
>
> I don't follow that at all?
>

AOP is an interesting innovation. As noted in another response, I did not
interpret your acronyms correctly before posting my response, so it wasn't
terribly coherent in retrospect.

I do think that the fundamental ideas of Reflection added to the direction
that became AOP, especially since it is largely enabled, in Java and now in
C#, by the use of reflective mechanisms in the language.

Cross-cutting concerns, and the weaving of modules with injection at
the fundamental level, are pretty clever ideas. I don't think that they
would be obvious, or easily learned, unless folks were already practicing
reflection and using injection.
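Python has no weaver, but a decorator gives a rough analogy for the idea (an illustration only, not real AOP; all names invented): a cross-cutting concern is wrapped around business code without editing its body.

```python
import functools

CALL_LOG = []  # the cross-cutting concern: a simple audit trail

def audited(fn):
    """'Weaves' logging around any function without touching its body."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        CALL_LOG.append(fn.__name__)
        return fn(*args, **kwargs)
    return wrapper

@audited
def transfer(balance, amount):
    # Pure business logic; knows nothing about auditing.
    return balance - amount
```

A real AOP weaver generalizes this: the pointcut selects many join points at once, and the advice is woven in by the tool rather than by hand-applied decorators.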

I've found that many developers learn injection, as a rigor, when they are
told, in no uncertain terms, that they WILL use test-driven development or
will write unit tests for the code. This leads to the realization that you
cannot unit test modules effectively until you can isolate them from their
dependencies, which leads to a lot of refactoring.

JMock/NMock ease the pain somewhat and offer a pathway to learning the
fundamental concept of injection or dependency inversion. As I said, I
believe that this helps form the conceptual framework that has allowed AOP
to thrive (somewhat).
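As an analogous sketch using Python's standard `unittest.mock` in place of JMock/NMock (invented class names, not from the post): because the object under test accepts its dependency, the test can substitute a mock for the real gateway.

```python
from unittest.mock import Mock

class PriceService:
    """Depends on an injected gateway rather than constructing one,
    which is exactly what makes it testable in isolation."""
    def __init__(self, gateway):
        self.gateway = gateway

    def price_with_tax(self, sku):
        return round(self.gateway.base_price(sku) * 1.2, 2)

# In a test, the real gateway (database, network...) is replaced by a mock.
gateway = Mock()
gateway.base_price.return_value = 10.0
service = PriceService(gateway)
assert service.price_with_tax("ABC") == 12.0
gateway.base_price.assert_called_once_with("ABC")
```

The refactoring pressure described above is visible here: had PriceService built its own gateway internally, no mock could be slipped in.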

<aside> I think I spend more time analyzing people than code. </aside>

hans...@hotmail.com

Jul 5, 2005, 12:05:37 PM
>
> In my view the OO-ness of a system can be identified by tracing the
> dependencies between the modules. If the high level policy modules
> depend on (directly call) lower level modules, which then call even
> lower level modules, then the program is procedural and not object
> oriented.
>
> However, if the high level policy modules are independent, or depend
> solely on polymorphic interfaces that the lower level modules
> implement, then the system is OO.
>
> In other words, it all has to do with the *direction* of the
> dependencies.

Hi,
I agree with you that DIP is a valuable technique when developing
software. But so are many other techniques. Encapsulation, polymorphism,
good naming conventions, etc. are all useful techniques when developing
software. Some of these techniques are also classified as being
'OO' by developers.

Techniques for describing someone's perception of the world are also
valuable in software development. Categorizing phenomena into events,
entities, roles, values etc. are indispensable techniques when building
software. Some of these techniques are also classified as 'OO' by
many developers.

It seems as if your definition of 'OO' is only related to the
direction of dependencies along the axis of level of abstraction. What
is the value of this definition? Stating that some software is OO says
close to nothing about the software.

Instead of focusing on meaningless definitions of OO, it would be far
more valuable to focus on important aspects of software development.
For one thing, far more effort should be put into techniques for how to
describe problems, not how to code them.

Robert C. Martin

unread,
Jul 5, 2005, 12:42:52 PM7/5/05
to
On 4 Jul 2005 11:03:01 -0700, "krasicki" <Kras...@gmail.com> wrote:

>It is agile because the term 'extreme' as been so over-exposed that the
>audience for this stuff evaporated.

There is a grain of truth to that statement, but only a grain.  I
called the meeting, I was there, I know. The name "agile" was
selected to represent a group of similar methodologies that included
Scrum, FDD, DSDM, Xtal, and XP. During the discussions we did mention
that the name "Extreme" was creating both positive and negative
reactions, and we wanted something a little closer to the core
motivation.

As for the audience for XP evaporating, I think you need to actually
check your facts instead of stating your opinion AS fact. There is
still a large and growing audience for XP.

>So, to be honest it is called
>agile to remove the tarnish of the extreme labeling AND to emphasize
>peppiness rather than dwell on the ever-present short-comings of the
>pseudo-methodology.

To be honest, you weren't there. All you have are opinions. I have
no problem with you expressing your opinions, but I suggest you
represent them as opinions as opposed to fact.

>Numerous OOD methodologies handle change much better than the agile
>proponents will have you believe.

Facts would be useful here. My experience has shown that agile
techniques strongly prepare software for change. I have seen
significant changes easily propagate through systems that were built
using agile techniques. I have also seen non-agile projects falter
and stall when changes needed to be applied.

>The agile practices are no more
>adept at change than anything else.

Again, facts would be useful. I can provide a simple counter fact.
Having a large batch of unit tests and acceptance tests that can be
run against the system in a matter of minutes, makes it much easier to
make changes to that system simply because it's easier to verify that
the change hasn't broken anything.

And here's an opinion, backed by a lot of observation and experience:
Writing tests first forces a design viewpoint that strongly encourages
decoupling, and that therefore fosters change.

Finally, here are some other observations from various teams that I
have coached. Customers are very happy that their input is heard
early and often. Executives love the fact that real progress is
measured on a regular (weekly) basis, and that stakeholders are
providing feedback in real time. All these things promote change
IMHO.

>What agile sells as response to
>change is really immedite gratification.

There is nothing wrong with immediate gratification so long as nothing
else is lost. Indeed, so long as nothing else is lost, immediate
gratification is better than deferred gratification. The evidence
suggests that nothing else is lost. Indeed, the evidence suggests
that the systems turn out *better*.

This shouldn't be a big surprise. Any control system works better
when you shorten the feedback loops.

>This is not responsible change management. This results in chaos and I
>have seen it in profoundly big companies whose platitudes, awards, and
>self-serving hype disguise the fact that beneath the covers employees
>were agile enough to artificially meet deadlines to cash in on bonuses
>while the quality of the software remains dubious to this day.

Agile Methods are NOT a mad rush to functionality. They are not
dotcom stupidity. Indeed, the agile methods value high quality code
and high quality designs more than any other methods I know of.
Consider rules such as "no duplicate code", "write tests before code",
"Don't let the sun set on bad code", etc, etc. There are very strong
values that are backed up by disciplines.

>Agile to me means slippery and dodgy - I don't like it and it is
>unacceptable in mission critical scenarios.

You have put the name "Agile" on something that is not Agile. Agile
does not mean "hacking". Agile does not mean running off half-cocked.
Agile does not mean slippery and dodgy. Agile means moving in very
small, determined, disciplined steps with a mass of verification at
each step, and lots of feedback from previous steps.

Robert C. Martin

unread,
Jul 5, 2005, 12:51:05 PM7/5/05
to
On Mon, 4 Jul 2005 13:10:22 -0700, "Nick Malik [Microsoft]"
<nick...@hotmail.nospam.com> wrote:

>Note: I did not say that "Agile" is incompatible with "design." I believe
>it is incompatible with "Big Design."

Not quite. Agile methods involve much more design than "Big Design"
methods. However, the design is done on a different schedule. Design
is taking place all the way through the project, at every iteration.
This design is no less rigorous than a big design up front. Indeed,
it is *more* rigorous, because each design decision is documented by
a series of unit tests and acceptance tests that must be written
*before* the code that makes them pass.
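The test-first discipline described above can be shown with a deliberately tiny, hypothetical example (the function and its behavior are invented here purely for illustration): the test is written first and records the design decision; the production code exists only to make that test pass.

```python
def test_reverse_words():
    # The design decision, stated as an executable test before any code:
    assert reverse_words("hello world") == "world hello"
    assert reverse_words("") == ""


# The simplest production code that satisfies the test above:
def reverse_words(sentence):
    return " ".join(reversed(sentence.split()))


test_reverse_words()  # passes silently once the code is in place
```

The point is not the trivial function but the ordering: the test precedes the code, so the decision it documents is verified on every run.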

>I hope to have made it clear that I
>believe that you can (and should) perform MDA on an agile project. I've
>done it. I've seen it done.

MDA is perhaps different from what you think it is. MDA is the notion
of drawing diagrams and then automatically converting them to code
using some kind of translator. In a true MDA environment you would do
all of your programming by drawing diagrams.

>It appears that you've been told that agile
>methods leave no room for design. My guess is you heard that from a
>self-proclaimed XP evangelist. I rank them a notch below most TV Shopping
>Channel pitchmen in their respect for science, impartiality, or fundamental
>integrity.

Nick, I'd like you to name some names. Who are these self-proclaimed
XP evangelists who are a notch below...?

Frankly I don't see any.  There are some folks out there whose
enthusiasm sometimes gets the better of them, but that's a whole
different matter. On the other hand, there *are* people out there
making utterly ridiculous negative assertions about XP and Agile.
Some have even written books about how bad XP is. It's clear that
these people have never done XP, don't know much about XP, and don't
care to know anything other than that they don't like it. They simply
bash for the joy of bashing. THESE are the folks who remind ME of TV
pitchmen.

krasicki

unread,
Jul 5, 2005, 8:53:24 PM7/5/05
to
Robert C. Martin wrote:
> On Mon, 4 Jul 2005 13:10:22 -0700, "Nick Malik [Microsoft]"
> <nick...@hotmail.nospam.com> wrote:
>
> >Note: I did not say that "Agile" is incompatible with "design." I believe
> >it is incompatible with "Big Design."
>
> Not quite. Agile methods involve much more design than "Big Design"
> methods. However, the design is done on a different schedule. Design
> is taking place all the way through the project, at every iteration.
> This design is no less rigorous than a big design up front. Indeed,
> it is *more* rigorous, because each design decisions is documented by
> a series of unit tests and acceptance tests that must be written
> *before* the code that makes them pass.

We know, Bob, we know. We've heard this tune many times. The OP asked
about OOD. This is a perfect example of what it isn't.

What you are describing is bottom up, seat-of-the-pants programming
with perfunctory salutes to an unwitting user all of whom pretend they
have immunity from reality. And as long as the inmates are in charge
they're very happy subscribing to this stuff. *You mean, NO BOSSES?*

Look, OOD is about designing theoretical systems that may get
implemented based on analysis, testing, cost, and security factors.
That's rigor!

The fact that programmers sweat to bolt one whimsical idea to the next
is not rigor.

Rube Goldberg software development was not invented by XP but it is
given certification status thanks to XP. Even Rube Goldberg
contraptions are -cough- *designed*, documented, measured, and
quantified - yes, they are. And that makes them real - just as real as
real systems.

-snip-

krasicki

unread,
Jul 5, 2005, 9:26:36 PM7/5/05
to
Robert C. Martin wrote:
> On 4 Jul 2005 11:03:01 -0700, "krasicki" <Kras...@gmail.com> wrote:
>
> >It is agile because the term 'extreme' as been so over-exposed that the
> >audience for this stuff evaporated.
>
> Their is a grain of truth to that statement, but only a grain. I
> called the meeting, I was there, I know. The name "agile" was
> selected to represent a group of similar methodologies that included
> Scrum, FDD, DSDM, Xtal, and XP. During the discussions we did mention
> that the name "Extreme" was creating both positive and negative
> reactions, and we wanted something a little closer to the core
> motivation.
>
> As for the audience for XP evaporating, I think you need to actually
> check your facts instead of stating your opinion AS fact. There is
> still a large and growing audience for XP.

There is still a large and growing audience for the movie, Plan 9 from
Outer Space. I'm not holding my breath that it will be reconsidered
for an Oscar.

>
> >So, to be honest it is called
> >agile to remove the tarnish of the extreme labeling AND to emphasize
> >peppiness rather than dwell on the ever-present short-comings of the
> >pseudo-methodology.
>
> To be honest, you weren't there. All you have are opinions. I have
> no problem with you expressing your opinions, but I suggest you
> represent them as opinions as opposed to fact.

The posters here can google the 'fact' that a number of XP proponents
bemoaned ever calling the methodology 'extreme' because it had become
such a loaded phrase culturally and politically.  Tell me you go into
conservative insurance, banking, and financial services meetings
emphasizing the extreme nature of your methodology or the
revolutionary (enterprise) culture shock they entail.

Who's being deceitful here?

>
> >Numerous OOD methodologies handle change much better than the agile
> >proponents will have you believe.
>
> Facts would be useful here. My experience has shown that agile
> techniques strongly prepare software for change.

My problem is that they should strongly prepare software for long-term
production activity because the software conforms to spec and QA.
There is no reason to predict change if the job is done right. Let's
call this pernicious cost, *change-creep*. And, let's be honest, this
is an admission that gathering reliable requirements was an exercise
akin to nailing jello to a wall - expect change because we've
conditioned the audience to have the attention span of gerbils.

> I have seen
> significant changes easily propagate through systems that were built
> using agile techniques. I have also seen non-agile projects falter
> and stall when changes needed to be applied.

Is this mildly misleading or are we pretending the audience for this
discussion are idiots?

>
> >The agile practices are no more
> >adept at change than anything else.
>
> Again, facts would be useful.

Years ago I chased all of you around the proverbial 'fact-checking'
bush and got nowhere. Agile is putting it mildly. I feel like I'm in
a Marx brothers movie discussing XP and agile - no sooner than one door
closes, another opens with another comedian wanting to know Viaduct?

I agree facts would be useful. After so many years, where are your
facts?


> I can provide a simple counter fact.
> Having a large batch of unit tests and acceptance tests that can be
> run against the system in a matter of minutes, makes it much easier to
> make changes to that system simply because it's easier to verify that
> the change hasn't broken anything.

And what design do you contrast the running system to, to know that it
is doing what it is intended to do? Or is your assertion that the
running design is infallible?

>
> And here's an opinion, backed by a lot of observation and experience:
> Writing tests first forces a design viewpoint that strongly encourages
> decoupling, and that therefore fosters change.

My goal is not to foster change. And bad tests, testing faulty
assumptions yield successful test results. Without a well documented
design that exposes such flaws you have no metric to evaluate the
quality of what you are doing. But let's not dwell on quality.

>
> Finally, here are some other observations from various teams that I
> have coached. Customers are very happy that their input is heard
> early and often. Executives love the fact that real progress is
> measured on a regular (weekly) basis, and that stakeholders are
> providing feedback in real time. All these things promote change
> IMHO.

What promotes confidence that the thing works correctly? Bells,
whistles, and favorite colors?

>
> >What agile sells as response to
> >change is really immedite gratification.
>
> There is nothing wrong with immediate gratification so long as nothing
> else is lost. Indeed, so long as nothing else is lost, immediate
> gratification is better than deferred gratification. The evidence
> suggests that nothing else is lost. Indeed, the evidence suggests
> that the systems turn out *better*.

What evidence and how was this evidence accumulated? All billable
hours accounted for?

>
> This shouldn't be a big surprise. Any control system works better
> when you shorten the feedback loops.

Only if the feedback makes sense.

>
> >This is not responsible change management. This results in chaos and I
> >have seen it in profoundly big companies whose platitudes, awards, and
> >self-serving hype disguise the fact that beneath the covers employees
> >were agile enough to artificially meet deadlines to cash in on bonuses
> >while the quality of the software remains dubious to this day.
>
> Agile Methods are NOT a mad rush to functionality. They are not
> dotcom stupidity. Indeed, the agile methods value high quality code
> and high quality designs more than any other methods I know of.
> Consider rules such as "no duplicate code", "write tests before code",
> "Don't let the sun set on bad code", etc, etc. There are very strong
> values that are backed up by disciplines.

Parse the sentence. Everything you value is code-centric. Open your
mind to OOD.


> >Agile to me means slippery and dodgy - I don't like it and it is
> >unacceptable in mission critical scenarios.
>
> You have put the name "Agile" on something that is not Agile. Agile
> does not mean "hacking". Agile does not mean running off half-cocked.
> Agile does not mean slippery and dodgy. Agile means moving in very
> small, determined, disciplined steps with a mass of verification at
> each step, and lots of feedback from previous steps.
>

Try the telephone communication game with some friends some day.
Whisper a nontrivial message to someone near you and have them do the
same with each person in the room taking a turn and see what comes out
compared to the original.

Your argument is not compelling today any more than it was many years
ago.

Daniel Parker

unread,
Jul 6, 2005, 6:15:58 AM7/6/05
to
"krasicki" <Kras...@gmail.com> wrote in message
news:1120611204.4...@f14g2000cwb.googlegroups.com...

> Look, OOD is about designing theoretical systems ...

Just out of curiosity, what theory?

-- Daniel


Daniel Parker

unread,
Jul 6, 2005, 8:00:08 AM7/6/05
to
"Robert C. Martin" <uncl...@objectmentor.com> wrote in message
news:j5elc1t6jakj9cg1h...@4ax.com...

>
> Who are these self-proclaimed
> XP evangelists who are a notch below...?
>
> Frankly I don't see any. There are some folks out there who's
> enthusiasm sometimes gets the better of them, but that's a whole
> different matter. On the other hand, there *are* people out there
> making utterly ridiculous negative assertions about XP and Agile.
> Some have even written books about how bad XP is. It's clear that
> these people have never done XP, don't know much about XP, and don't
> care to know anything other than that they don't like it. They simply
> bash for the joy of bashing. THESE are the folks who remind ME of TV
> pitchmen.
>
I think the criticism is of writing which is largely polemical, and I think
it would be fair to put much of the writing by XP proponents in this
category. A good test would be to check whether the author treats XP as if
it were the one and only methodology that has no problems, and a good number
of articles on the subject appear to fall into this category. It's hard to
find blogs that dissect both successful and unsuccessful XP projects and
systematically discuss the consequences of the various practices, which is
what you'd expect if the author wanted to be taken seriously. Instead, you
get pictures of happy programmers on the cover of Software Development
magazine, links to projects that seem to go dead after a while, and links to
puff pieces. None of this is anti-XP; it's just that evangelical writing
tends to come across as a little bit silly when presented to a professional
audience.

Regards,
Daniel Parker


Thomas Gagne

unread,
Jul 6, 2005, 10:28:27 AM7/6/05
to
Nick Malik [Microsoft] wrote:
> "Robert C. Martin" <uncl...@objectmentor.com> wrote in message
> news:0jk7c1tj3hhc2oua6...@4ax.com...

>
>>Whether there are philosophies of software development is irrelevant;
>>though I think there may be. My point is that OO has often been
>>touted as a "grand overarching philosophy" having more to do with
>>life, the universe, and everything, than with software.
>
>
> There are some folks who still claim that the world is flat. We don't talk
> about them as though they were a significant part of scientific thought. In
> my opinion, it is fair to include "grand OO philosophers" in this category.
> Anyone who says that OO is a grand philosophy is ignorant of both software
> engineering and philosophy. Kant, Camus, Sartre... now that's philosophy.
> We are on the same page on this one.

I'm not.

I'll see your "Kant, Camus, Sartre" with Socrates, Plato, and Aristotle.
Don't worry though, none of us paid attention in philosophy class.

<http://gagne.homedns.org/~tgagne/articles/TheObjectOrientedParadigm.pdf>

I once listened to a great lecture on Plato and was fascinated when
midway through it became a computer science lecture -- unknown to all,
even the professor, but the programmers in the audience.

As to your later comments, Plato's theory was working towards a 'grand
analysis.' I see no reason to shy away from "OO is a philosophy about
everything," because we're in good company.

Robert C. Martin

unread,
Jul 6, 2005, 10:32:16 AM7/6/05
to
On Tue, 5 Jul 2005 08:47:51 -0700, "Nick Malik [Microsoft]"
<nick...@hotmail.nospam.com> wrote:

>AOP is an interesting innovation. As noted in another response, I did not
>interpret your acronyms correctly before posting my response, so it wasn't
>terribly coherent in retrospect.
>
>I do think that the fundamental ideas of Reflection added to the direction
>that became AOP, especially since it is largely enabled, in Java and now in
>C#, by the use of reflective mechanisms in the language.
>
>Cross cutting concerns , and the weaving of the modules with injection at
>the fundamental level, are pretty clever ideas. I don't think that they
>would be obvious, or easily learned, unless folks were already practicing
>reflection and using injection.

The history of AOP would be interesting to mine. Your view of it, and
my view of it, are very different. I haven't done the research, so I
don't know which is more valid. I will say that from my point of view
reflection and AOP aren't strongly related.  The original weavers did their work by
weaving source code together. Pointcuts were treated rather like
elaborate macros with very detailed insertion specifications. Of late
the weavers have taken to weaving byte-codes together. This is
certainly an improvement because it prevents a massive recompile of
the entire system when aspects are changed.

BTW, I fear that AOP in its current form is faltering.  The pointcuts
(am I using the right vernacular?) use something like regular
expression matching against the names of the methods and classes that
they insert code into.  This means that the aspects must have a very
low-level dependency on the rest of the code, and that the system is
tied together by fragile naming conventions.  This might be solvable
through metadata like C# attributes and Java's new annotation syntax.
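The contrast between the two pointcut styles can be sketched with a toy "weaver" (this is not AspectJ; the class, method, and decorator names are all invented for the illustration). One pointcut matches method *names* with a regular expression; the other matches explicit metadata attached to the method, the moral equivalent of a C# attribute or Java annotation.

```python
import re

def audited(fn):
    fn._audited = True          # metadata marker, akin to an annotation
    return fn

class Account:
    def deposit_cash(self, n): return n
    @audited
    def withdraw(self, n): return -n

def weave(cls, advice, name_pattern=None):
    """Wrap matching methods of cls with the given advice function."""
    for name in list(vars(cls)):
        fn = getattr(cls, name)
        if not callable(fn) or name.startswith("_"):
            continue
        by_name = name_pattern and re.match(name_pattern, name)
        by_meta = getattr(fn, "_audited", False)
        if by_name or by_meta:
            setattr(cls, name, advice(fn))

log = []
def logging_advice(fn):
    def wrapper(*args, **kw):
        log.append(fn.__name__)   # the cross-cutting concern
        return fn(*args, **kw)
    return wrapper

# Name-based pointcut: fragile -- renaming deposit_cash silently un-weaves it.
# Metadata-based pointcut: withdraw stays woven no matter what it is called.
weave(Account, logging_advice, name_pattern=r"deposit_.*")
a = Account()
a.deposit_cash(10)
a.withdraw(5)
print(log)  # ['deposit_cash', 'withdraw']
```

The name-based match silently stops working after a rename, which is the fragility complained about above; the metadata-based match survives, which is why attribute/annotation-driven pointcuts look like the way out.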

krasicki

unread,
Jul 6, 2005, 11:01:14 AM7/6/05
to

Well Daniel,

Not all systems consist of a web front end that accesses data from a
well-established database though many do.

OOD, depending on the tools and techniques you use, allows system
designers to model potential solutions in numerous ways. If this
tradeoff is made here, this benefit is forthcoming there and so on.
Design is about applying recombinant ideas to solving problems.

Systems are developed to solve problems in effective ways. Systems are
not developed for the sake of software or to gratify the provincial and
esoteric ego needs of the employees charged with getting the system
implemented.

I'll give you a good example. A Hilton was just built in Hartford and
the building went up fast to meet the deadline of the new convention
center being built next door. All federal guidelines were applied.

So come inspection day, Connecticut's inspectors applied Connecticut
building statutes to the inspection and the building failed! Now one
could say that the building was built fast, saved money, and looks real
good and boy were the builders proud of it. Let's call this process
agile.

So sixteen of the rooms were incorrectly built for handicapped access,
an error of three inches per room. Where to get three inches? Push
the rooms into the hall and the hall fails. You can see how this goes.

Another example. A Hospital builds a new wing onto an existing
structure and these days new hospital wings look like fancy hotels.
Everything is immaculate, grand facades, fancy everything.  The
hospital wing opens without a hitch.

The first bed is rolled down the hall, the elevator button is pushed,
the door opens, the bed is pushed into the elevator as far as it can go,
but the bed still doesn't fit. Wrong sized elevator.

These are true stories.  Shouldn't all of the architects of these
buildings have expected change to happen as well?  Same with the
builders.  Maybe build with everything loose so that it can be
reassembled when the next minor detail arises?  Aren't we being told
this is the way things work?

Even carpenters measure before they cut. Yet, in computer science we
are being told that we should operate as though we are all alcoholics
and take things one day at a time.

Now this philosophy can work in cases where we're building cookie
cutter, intuitive stuff but you will need a lot of luck and
extraordinary follow-through to build a complex system this way.

Phlip

unread,
Jul 6, 2005, 11:19:01 AM7/6/05
to
Thomas Gagne wrote:

> As to your later comments, Plato's theory was working towards a 'grand
> analysis.' I see no reason to shy away from "OO is a philosophy about
> everything," because we're in good company.

OO is the shadows of things on the cave wall.

--
Phlip
http://www.c2.com/cgi/wiki?ZeekLand


Daniel Parker

unread,
Jul 6, 2005, 11:41:10 AM7/6/05
to
krasicki wrote:
> Daniel Parker wrote:
> > "krasicki" <Kras...@gmail.com> wrote in message
> > news:1120611204.4...@f14g2000cwb.googlegroups.com...
> >
> > > Look, OOD is about designing theoretical systems ...
> >
> > Just out of curiosity, what theory?
> >
> > -- Daniel
>
> Well Daniel,
>
> OOD, depending on the tools and techniques you use, allows system
> designers to model potential solutions in numerous ways. If this
> tradeoff is made here, this benefit is forthcoming there and so on.
> Design is about applying recombinant ideas to solving problems.
>
My question is narrower.  I don't think OOD provides any _theoretical_
guidance to building software.  For example, if I'm writing a system
that must be fault tolerant, I refer to theoretical work on
transactioning; I don't think OOD offers anything analogous to that.
Do you disagree?

Regards,
Daniel Parker

Michael Feathers

unread,
Jul 6, 2005, 11:42:43 AM7/6/05
to


It's pretty amazing to me that you find anything in common with Agile in
these scenarios.  They all sound like cases where there was no feedback
or testing. Sounds more like plan-driven development to me.

> These are true stories. Shouldn't all of the architects of these
> buildings have expected change to happen as well. Same with the
> builders. maybe build with everything loose so that it can be
> reassembled when the next minor detail arises? Aren't we being told
> this is the way things work?

Well, the fact is software is malleable. In fact it is too malleable.
It isn't hard to change software at all. All you have to do is type a
couple of characters in any program and you can break it. Because that
is the way that software is, we need tests to give it backbone.

> Even carpenters measure before they cut. Yet, in computer science we
> are being told that we should operate as though we are all alchoholics
> and take things one day at a time.

The problem is: misunderstanding the material you are working with.
Code is not wood or concrete.

Michael Feathers
www.objectmentor.com


Daniel Parker

unread,
Jul 6, 2005, 11:53:22 AM7/6/05
to

Michael Feathers wrote:
>
> The problem is: misunderstanding the material you are working with.
> Code is not wood or concrete.
>

Some of it's more like putty, some of it's more like clay. Nobody, not
you, not Kent Beck, not Robert Martin, not Ron Jeffries, has fully
solved that problem.

-- Daniel

Nick Malik [Microsoft]

unread,
Jul 6, 2005, 11:54:31 AM7/6/05
to
Hi Thomas,


"Thomas Gagne" <tga...@wide-open-west.com> wrote in message
news:Y4OdnUFqe_Q...@wideopenwest.com...


> Nick Malik [Microsoft] wrote:
>> "Robert C. Martin" <uncl...@objectmentor.com> wrote in message
>> news:0jk7c1tj3hhc2oua6...@4ax.com...
>>

>>>My point is that OO has often been
>>>touted as a "grand overarching philosophy" having more to do with
>>>life, the universe, and everything, than with software.
>>
>>

>> Anyone who says that OO is a grand philosophy is ignorant of both
>> software engineering and philosophy. Kant, Camus, Sartre... now that's
>> philosophy. We are on the same page on this one.
>
> I'm not.
>
> I'll see your "Kant, Camus, Sartre" with Socrates, Plato, and Aristotle.
> Don't worry though, none of us paid attention in philosophy class.

My father is a professor of philosophy. He holds a Ph.D. from London
University and an Ed.D from Columbia. I paid attention.

>
> <http://gagne.homedns.org/~tgagne/articles/TheObjectOrientedParadigm.pdf>
>

The paper you cite is interesting. However, the author makes only one
conclusion: that we should understand that a connection exists between
formalism in general and OO design. It does not make a case for the use of
OO as a philosophy, only that it grew out of the efforts of an early
scientist and philosopher.

First off: in the early days of modern thought, science, philosophy, and
mathematics tended to blend together. That is to say that the analytical
underpinnings of all three are related through the necessity of creating
analytical methods of observation and categorization. These methods were
described well by many early thinkers, including the greats that you
subscribe to.

If you are to look at the modern outcroppings of these strains of thought,
you will find that the early notions of forms have had their greatest
influence on the natural sciences, where observation, categorization, and
analysis are fundamental to our understanding of biology and chemistry.
They did less to influence the sciences of mathematics, and their influence
on philosophy is indirect at best. As the age of reason begins, science is
already using these notions to positive effect, so it is natural to adopt
formalisms into philosophy. Unfortunately, those that do adopt such
formalisms beyond simple description and categorization find that they are
dealing more with medicine and psychology and less with philosophy. Their
contributions are not part of the same canon of thought as a result.

It is true that the theory of forms is related to the categorization of
similarities and differences, as described by Plato. However, this theory
forms the basis for all decompositional and observational analysis. While
that certainly includes the analysis that leads to well structured OO
programs, I would posit that it leads to well structured forms in any logic
system. There are good examples of structured programming in Pascal that
can trace their analytical roots to the same theories. There are good
examples of mathematical algorithms created for assembler that can do the
same.

More interestingly, the cited paper does nothing to forward the notion of OO
as a philosophy in itself. The author shows that OO inherits the
fundamental mechanisms of analysis originally described by Plato. But he
does not show that object orientation has any positive effect on, as he puts
it, our understanding of "the world and the nature of knowledge." It is an
expression of formal analysis, but he makes no assertion that the expression
is somehow better or more appropriate than other forms of expression. He
claims that it is part of a lineage, but offers no reason to the reader to
believe that it is anything more than an analytical dead end. As a result,
the paper is an analysis with no recognizable conclusion other than to say,
to others, "isn't this neat?"

I find it interesting that the author cites only himself, even as he
describes ideas that he attributes to others. If I were my father, I doubt
I would have accepted his paper in even a freshman level philosophy course.

> As to your later comments, Plato's theory was working towards a 'grand
> analysis.' I see no reason to shy away from "OO is a philosophy about
> everything," because we're in good company.

What Plato founded, with his grand analysis, was science. Philosophy uses
these (and other) methods to analyze human existence, but the evidence it
leaves behind is not its methods, but its conclusions. In that, Philosophy
is a path to analysis of the human "form," with the results being great
opinions on the fundamental nature of human existence, thought,
collaboration, society, and relationship with a higher power. These
conclusions, though reachable through various methods of analysis, are
independently interesting. These conclusions form the fabric of Philosophy.
Our infant science bears little in common with these philosophical
conclusions, except the fact that the people who create software are humans
themselves. While our science is influenced by these conclusions, we
contribute nearly nothing to them.

We are in good company to call our notions of logical expression a form of
science or a form of mathematics.
We are a misfit in the world of Philosophy.

Daniel Parker

unread,
Jul 6, 2005, 12:01:40 PM7/6/05
to
Thomas Gagne wrote:
>
> Plato's theory was working towards a 'grand
> analysis.' I see no reason to shy away from "OO is a philosophy about
> everything," because we're in good company.

It's not meaningful to talk about OO as a philosophy. Rather, the
philosophical problem would be to critique OO, to investigate the
meaning of its statements.

-- Daniel

krasicki

unread,
Jul 6, 2005, 12:03:12 PM7/6/05
to

Most of the sophisticated modeling tools are just that: modeling tools;
you supply the theory. However, in cases where transactional processes
are well-defined between say a database and known SQL transaction
modules of one kind or another, there is nothing stopping someone
modeling from creating transaction design objects whose given
attributes include transaction type and other metrics (fault tolerance
under whatever constraints, and so on).

And if enough of these are identified, algorithms to identify the best
transaction for a given situation are certainly feasible.

I have been out of the CASE tool industry for a number of years but I
am aware of very sophisticated efforts to create design composites
that, if they succeed, will be very, very rich in sophistication. I
have to leave it at that.

However, there is nothing stopping an individual from inventing their
own design object - say, the fault tolerant transaction object.

Make it a notational figure (circle, square). Map it to more formal
system mechanics (code, module, plug-in, whatever)... and so on.

Nick Malik [Microsoft]

Jul 6, 2005, 11:59:49 AM7/6/05
to
<hans...@hotmail.com> wrote in message
news:1120579537.9...@g44g2000cwa.googlegroups.com...

> >
> It seems as if your definition of 'OO' is only related to the
> direction of dependencies along the axis of level of abstraction. What
> is the value of this definition? Stating that some software is OO says
> close to nothing about the software.

If you read the papers he provides links to, you will understand the
discussion.

H. S. Lahman

Jul 6, 2005, 12:20:44 PM7/6/05
to
Responding to Martin...

>>>Nobody knows exactly (or even inexactly) what OOA is. There are a number of
>>>books and papers written about it, but they don't agree. There is not
>>>even a set of cogent schools of thought. OOA is a term that we bandy
>>>about with authority, but have no real definition for.
>>
>>You keep repeating this mantra but it is still untrue.
>
>
> We disagree. Indeed, I don't think there is an agreed definition of
> analysis, let alone object oriented analysis. I have sat in too many
> meetings (at a company that you and I know well) that devolved into 30
> minute arguments about whether a certain technical topic was
> "analysis" or "design".
>
> "We shouldn't be talking about that now, it's a design concept."
> "No it's not, it's part of the problem space. It's analysis."
> "It is not, it's too low level."
> "No, it's critical. We have to decide it now."
> "No, it can wait, it's just too low level to worry about now."

You should try attending a translation model review involving
experienced developers. Whether there is implementation pollution
present is usually quite clear. Authors may have blind spots as
individuals, but they are quick to recognize the problem when it is
pointed out. The tricky part lies in eliminating implementation
pollution, not recognizing it.

<Apocryphal example>
Back in the mid-'80s I had a similar skepticism about being able to
eliminate implementation decisions from OOA models. I thought I had
found an irrefutable example and our group was too inexperienced not to
agree with me.

Basically it involved a series of operations that were essentially large
scale processing loops that were nested. The loops "walked" the
hardware to initialize its state. Each loop involved interactions among
several object state machines. As it happens the hardware was under
parallel development and the hardware guys could not yet tell us which
loop should be the outer one for the fastest execution. So it seemed
obvious we had to commit to one "implementation" solution by fixing the
order of loops and, if that turned out wrong, we would have to go back
and change it.

[Note that none of us was confused about the fact that hard-wiring the
order of the loops was an implementation decision. That's because the
loop order was a performance issue, not a functional issue. So the
problem was finding a way to avoid explicitly resolving it in the OOA.
(The processing /within/ each loop was, of course, a matter of hardware
functional requirements.)]

I cornered Mellor at a conference and presented the problem. It took
him maybe thirty seconds to recognize that each loop's processing was a
daisy-chain sequence of events that remained the same regardless of the
order of the loops. That meant that the order of loops really came down
to where the first event in the loop was issued. It was trivial to
locate two spots and insert AAL code to parametrically generate the
right starter event based on configuration data. We could then supply
the value of the configuration data later when the hardware guys got
their act together without touching the OOA model.

It was a substantial blow to the ego after spending many hours over a
couple of months with some pretty sharp developers to become convinced
there was no way out and then get blown away in a few seconds. Today,
with the benefit of experience, I would probably have no trouble
recognizing that solution (though it would probably take me a couple of
minutes rather than 30 seconds). In all the intervening years I still
have not seen a situation where one had to incorporate an implementation
decision in an OOA model. I have also seen very few situations where
there was any confusion over whether a decision was implementation or not.
</Apocryphal example>


*************
There is nothing wrong with me that could
not be cured by a capful of Drano.

H. S. Lahman
h...@pathfindermda.com
Pathfinder Solutions -- Put MDA to Work
http://www.pathfindermda.com
blog: http://pathfinderpeople.blogs.com/hslahman
(888)OOA-PATH

H. S. Lahman

Jul 6, 2005, 12:31:51 PM7/6/05
to
Responding to Martin...

> MDA is perhaps different from what you think it is. MDA is the notion
> of drawing diagrams and then automatically converting them to code
> using some kind of translator. In a true MDA environment you would do
> all of your programming by drawing diagrams.

That is the translation view.

However, MDA also supports the elaboration viewpoint where there is no
or limited code generation. In that context MDA is focused on providing
interoperability for the tools the developer employs. So the round-trip
tools like Together also fall within the MDA umbrella. In fact, most of
the vendors represented in OMG who are active in MDA are round-trip vendors.

[OTOH, the round-trip tools are doing more and more code generation so
they are evolving towards translation. Eventually they will be
indistinguishable.]

Fundamentally MDA is just a framework that standardizes migrating
information between different representations. Those representations
can be at the same level (e.g., XML vs. RDB schemas) or at different
levels (UML OOA and 3GL code). The transformation can be automatic,
manual, or some combination.
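The "migrating information between representations" idea can be illustrated with a toy transform. The model format and both generators below are invented for this sketch and do not correspond to any real MDA tool:

```python
# One abstract model, projected into two concrete representations
# (a Python class and an SQL schema), in the spirit of MDA mappings.
model = {
    "class": "Account",
    "attributes": [("number", "string"), ("balance", "decimal")],
}

PY_TYPES = {"string": "str", "decimal": "float"}
SQL_TYPES = {"string": "VARCHAR(64)", "decimal": "NUMERIC(12,2)"}

def to_python(m):
    # Same-information, different-representation: model -> class stub.
    fields = "\n".join(
        f"    {name}: {PY_TYPES[t]}" for name, t in m["attributes"])
    return f"class {m['class']}:\n{fields}"

def to_sql(m):
    # And the same model -> RDB schema, at a different level entirely.
    cols = ", ".join(
        f"{name} {SQL_TYPES[t]}" for name, t in m["attributes"])
    return f"CREATE TABLE {m['class'].lower()} ({cols});"

print(to_python(model))
print(to_sql(model))
```

Here the transformation is fully automatic; in the elaboration view the same mapping might be partly manual.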

Laurent Bossavit

Jul 6, 2005, 12:40:13 PM7/6/05
to
> So sixteen of the rooms were incorrectly built for handicapped access,
> an error of three inches per room. Where to get three inches? Push
> the rooms into the hall and the hall fails. You can see how this goes.

We can see how this goes for buildings. But we're not buildings experts,
and unless I missed something neither are you. You're asking us to draw
insight from a foreign domain about which we know *less* than the domain
we're supposed to apply it to - software. That's the wrong direction to
operate an intuition pump.

Your rant, while entertaining, is at the level of caricature rather than
real insight; for insight, read something like Stewart Brand's /How
Buildings Learn/, which has real stories about architecture.

Most software has the property that if you twiddle one small bit
incorrectly, an entire system might crash with disastrous consequences.
(Think off-by-one errors.) As far as I can tell, no building has this
property - if you remove one brick, even a brick at the very bottom, the
building stays up.

You're saying that software design should be based in theory. You can't
have your cake and eat it too - the *process* whereby something is
designed should certainly take into account the characteristic
properties of the thing being designed. (Call that meta-design.)

Buildings are brick. Software is text. Brick and text probably call for
different design processes.

Laurent

Nick Malik [Microsoft]

Jul 6, 2005, 12:30:15 PM7/6/05
to
"Robert C. Martin" <uncl...@objectmentor.com> wrote in message
news:j5elc1t6jakj9cg1h...@4ax.com...

> On Mon, 4 Jul 2005 13:10:22 -0700, "Nick Malik [Microsoft]"
> <nick...@hotmail.nospam.com> wrote:
>
>>Note: I did not say that "Agile" is incompatible with "design." I believe
>>it is incompatible with "Big Design."
>
> Not quite. Agile methods involve much more design than "Big Design"
> methods. However, the design is done on a different schedule. Design
> is taking place all the way through the project, at every iteration.
> This design is no less rigorous than a big design up front. Indeed,
> it is *more* rigorous, because each design decision is documented by
> a series of unit tests and acceptance tests that must be written
> *before* the code that makes them pass.

Alas, I'm guilty of attempting to employ a term without doing a good job of
providing a meaningful definition. The acronym BDUF is a bit off-putting in
conversations where the other participant is intentionally ignorant of agile
concepts. Please forgive my lack of clarity.

>
>>I hope to have made it clear that I
>>believe that you can (and should) perform MDA on an agile project. I've
>>done it. I've seen it done.
>
> MDA is perhaps different from what you think it is. MDA is the notion
> of drawing diagrams and then automatically converting them to code
> using some kind of translator. In a true MDA environment you would do
> all of your programming by drawing diagrams.

Yes. Many tools that are being used for this have the ability to be used
many times through the process. You can create the model, hand-modify the
code, and recreate the model from the code. This round-tripping is very
useful in iterative design processes because each innovation can be
inspected for compliance with the fundamental notion of "does it solve This
business problem?" This keeps with the notion of solving the problem when
it is presented, and not before. As I understand this, this is fairly key
to keeping costs low in an agile environment.

I have done this on some small projects and found it to be a useful
practice. I'm hoping to do more of this in the future, as I believe it
makes the "design conversation" more succinct during the sprint planning
stages. It also helps to find the "outliers" (objects that were created by
programmers who didn't understand the design and just shoved something into
a static utility class to get it out of the way). These become stories to
refactor them in the coming iteration.

>
>>It appears that you've been told that agile
>>methods leave no room for design. My guess is you heard that from a
>>self-proclaimed XP evangelist. I rank them a notch below most TV Shopping
>>Channel pitchmen in their respect for science, impartiality, or
>>fundamental
>>integrity.
>
> Nick, I'd like you to name some names. Who are these self-proclaimed
> XP evangelists who are a notch below...?

Certainly not you, Robert. The members of the original Agile Alliance have
(mostly) done a good job of using words that are fair and partial, if
sometimes a bit strident. However, there are many "hucksters" out there
that have read half of a book on XP and then go out preaching about XP with
no more understanding than a teenage high school dropout attempting to
lecture me on the mechanics of the supply chain. I have come across some of
them. One offered me a job (which I turned down). Others have underbid me
(when I was in the consulting business). Others have vociferously
challenged me when I dared mention UML when discussing the notions of OO
design. They were not credible, and they gave "agile" a bad name.

There are many people making money off of agile methods that are not as
upstanding as you are, Robert. Surely, you have run across some of them
yourself. Naming names will only get me into a libel suit. Their goal is
to make money, not improve software.

>
> Frankly I don't see any. There are some folks out there whose
> enthusiasm sometimes gets the better of them, but that's a whole
> different matter.

see above

> On the other hand, there *are* people out there
> making utterly ridiculous negative assertions about XP and Agile.
> Some have even written books about how bad XP is. It's clear that
> these people have never done XP, don't know much about XP, and don't
> care to know anything other than that they don't like it. They simply
> bash for the joy of bashing. THESE are the folks who remind ME of TV
> pitchmen.

When I first heard about agile methods, I was very dubious. I decided that
I had two choices: to get defensive or to read more about it. I took the
latter. I've learned a lot and I find myself explaining agile methods to
people who have mistaken notions about it. I've used agile methods more
frequently in the past few years and I expect that will continue. While I
am a scrummaster, I don't pretend to be an expert on agile methods. I defer
to those who have done more than I have, and I glide under the noise, being
a change agent along the way. On the other hand, I have found that simply
using individual methods as "best practices" (which they demonstrably are)
gets you in the door with people who otherwise scream and run for cover when
you use the term "agile" in their presence. There are ways to change
organizations that are subtle, but powerful.

I agree that the noise out there is not rational. I agree that there are
folks who oppose agile methods with an almost hysterical response mechanism.
I posit only that there are a few folks on the other side as well. We need
to "lower the heat and turn up the light" in all our conversations,
recognizing both the good and bad, the known and the unknown, in order to
achieve real change in this crazy profession we've chosen to practice.

I'm on the side of reason.

Phlip

Jul 6, 2005, 1:02:07 PM7/6/05
to
Laurent Bossavit wrote:

> Your rant, while entertaining, is at the level of caricature rather than
> real insight; for insight, read something like Stewart Brand's /How
> Buildings Learn/, which has real stories about architecture.

Or /Notes on the Synthesis of Form/, by Chris Alexander, for the distinction
between "self-conscious" architecture, like Frank Lloyd Wright masterpieces
that suck, and "unself-consciously" humble dwellings that tune and adjust in
iterations, and don't suck.

--
Phlip
http://www.c2.com/cgi/wiki?ZeekLand


Robert C. Martin

Jul 6, 2005, 1:16:01 PM7/6/05
to
On 5 Jul 2005 09:05:37 -0700, hans...@hotmail.com wrote:

>I agree with you that DIP is a valuable technique when developing
>software. But so are many other techniques. Encapsulation, polymorphism
>good naming conventions etc. are all useful techniques when developing
>software. Some of these techniques are also classified as being
>'OO' by developers.

True. However, of all the techniques you mentioned above,
polymorphism is the one that is most strongly identified with OO.
Encapsulation, naming conventions, and other qualities of good
software structure have long been part of software development, even
before OO became popular.

DIP is the design principle that guides the use of polymorphism; and
polymorphism is the language feature that enables DIP. These two,
working together, are what puts the OO in OOD.
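A minimal sketch of that pairing, with all class names invented for illustration: the high-level policy depends only on an abstraction, the low-level detail depends upward on that same abstraction (DIP), and polymorphism does the dispatch at run time:

```python
from abc import ABC, abstractmethod

# The high-level policy owns and depends on this abstraction...
class MessageSink(ABC):
    @abstractmethod
    def send(self, text: str) -> None: ...

class Alerter:
    """High-level policy: knows nothing about any concrete sink."""
    def __init__(self, sink: MessageSink):
        self.sink = sink

    def alert(self, text: str) -> None:
        self.sink.send("ALERT: " + text)

# ...while the low-level detail implements it, inverting the usual
# policy-to-detail source-code dependency.
class ListSink(MessageSink):
    def __init__(self):
        self.messages = []

    def send(self, text: str) -> None:
        self.messages.append(text)

sink = ListSink()
Alerter(sink).alert("disk full")
print(sink.messages)  # prints ['ALERT: disk full']
```

Because Alerter's source never names ListSink, a different sink can be deployed without recompiling or editing the policy.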

>Techniques for describing someone's perception of the world are also
>valuable in software development.

Absolutely! Though this has little to do with OO. I am much more
interested in the work of Ward Cunningham and Rick Mugridge in
expressing requirements as Tests. (See www.fitnesse.org)

>Categorizing phenomena into events,
>entities, roles, values etc. are indispensable techniques when building
>software. Some of these techniques are also classified as 'OO' by
>many developers.

I agree that some folks call these activities OO; but I find that
strange, since these activities have been a common part of software
engineering for a very long time, and predate OO.

>It seems as if your definition of 'OO' is only related to the
>direction of dependencies along the axis of level of abstraction. What
>is the value of this definition? Stating that some software is OO says
>close to nothing about the software.

I disagree. I think there is a lot of value in precise definitions
that can act as a metric against which to measure software designs.
Given my definition, you can quickly ascertain whether a particular
design is OO or not. Moreover, there are many benefits that are
associated with this structure. Dependency Inversion is the primary
mechanism behind independently deployable binary components.

On the other hand, there is little value in using the term OO to
describe all good things that have come out of software development
over the last 40 years. There is this nasty tendency for developers
to say to themselves: "Everyone says OO is good. I am a good
developer. Therefore what I do (and what I have always done) is OO."
Through this strange logic, every good practice eventually gets
labeled OO.

>Instead of focusing on meaningless definitions of OO, it would be far
>more valuable to focus on important aspects of software development.
>For one thing, far more effort should be put into techniques for how to
>describe problems, not how to code them.

I agree that describing problems is very important, and it is an
active area of my own research (www.fitnesse.org). On the other hand,
I think that we don't put enough effort into techniques for how to
code problems. Far too many problems are related to poor coding
structure. Whole systems, and whole development teams, are brought to
their knees because their code has become so tangled and impenetrable
that it cannot be cost-effectively maintained.

Robert C. Martin

Jul 6, 2005, 1:28:49 PM7/6/05
to
On 5 Jul 2005 17:53:24 -0700, "krasicki" <Kras...@gmail.com> wrote:

>Robert C. Martin wrote:
>> On Mon, 4 Jul 2005 13:10:22 -0700, "Nick Malik [Microsoft]"
>> <nick...@hotmail.nospam.com> wrote:
>>
>> >Note: I did not say that "Agile" is incompatible with "design." I believe
>> >it is incompatible with "Big Design."
>>
>> Not quite. Agile methods involve much more design than "Big Design"
>> methods. However, the design is done on a different schedule. Design
>> is taking place all the way through the project, at every iteration.
>> This design is no less rigorous than a big design up front. Indeed,
>> it is *more* rigorous, because each design decisions is documented by
>> a series of unit tests and acceptance tests that must be written
>> *before* the code that makes them pass.
>
>We know, Bob, we know.

Courteous debate reflects better on the participants than
condescension. Are you sure that you are in a position to take a
superior tone?

>We've heard this tune many times. The OP asked
>about OOD.

Yes, and YOU changed the topic to XP. You can expect me to respond
whenever you do that.

>This is a perfect example of what it isn't.

We disagree, and that's fine. Though I think it would be better if
you expressed your opinions as opinions, instead of as hard facts.

>What you are describing is bottom up, seat-of-the-pants programming
>with perfunctory salutes to an unwitting user all of whom pretend they
>have immunity from reality. And as long as the inmates are in charge
>they're very happy subscribing to this stuff. *You mean, NO BOSSES?*

No, no, and no. This is not seat-of-the-pants, it is not an
egalitarian "no-bosses" scheme, and it is strongly tied to reality.
Nor is it bottom up. Agile methods start with requirements and work
down.

You are making claims of fact without sure knowledge of what you are
talking about. You are strongly misrepresenting what XP and Agile
are.

>Look, OOD is about designing theoretical systems that may get
>implemented based on analysis, testing, cost, and security factors.
>That's rigor!

That's a strange definition of rigor. Rigor means stiff, disciplined,
inflexible. Now consider just the TDD rules of XP. No production
code can be written until there is a failing acceptance test, and
failing unit tests. That's stiff, disciplined, and inflexible.
*That's rigor!* And that's just one aspect of Agile/XP.
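That test-first rule can be shown in miniature. The leap-year example is invented; the comments mark the red/green order in which the pieces are written:

```python
import unittest

# Step 1 (red): the tests are written first, against a function that
# does not yet exist, so the suite's very first run fails.
class LeapYearTest(unittest.TestCase):
    def test_typical_leap_year(self):
        self.assertTrue(is_leap(1996))

    def test_century_is_not_leap(self):
        self.assertFalse(is_leap(1900))

    def test_every_400_years_is_leap(self):
        self.assertTrue(is_leap(2000))

# Step 2 (green): only after the failing tests exist is production
# code written, and only enough of it to make them pass.
def is_leap(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Run the suite explicitly so the discipline is visible in one file.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(LeapYearTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The inflexibility is the point: no line of `is_leap` is permitted before a test demands it.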

>Rube Goldberg software development was not invented by XP but it is
>given certification status thanks to XP.

Why do you insist on mischaracterization without evidence? XP does
not create Rube Goldberg structures. Take a look at the source code
of FitNesse (www.fitnesse.org) if you'd like to see the kind of
architecture and code that was created using XP.

Really, you need to do some actual research before you spout your
incorrect opinions as facts.

Robert C. Martin

Jul 6, 2005, 1:42:08 PM7/6/05
to
On Wed, 6 Jul 2005 08:00:08 -0400, "Daniel Parker"
<danielaparker@spam?nothanks.windupbird.com> wrote:

>I think the criticism is of writing which is largely polemical, and I think
>it would be fair to put much of the writing by XP proponents in this
>category. A good test would be to check whether the author treats XP as if
>it were the one and only methodology that has no problems, and a good number
>of articles on the subject appear to fall into this category.

Does it? Oh I agree that there are some over-enthusiastic blurbs
written here and there. But where are the serious articles and books
that treat XP as a discipline that has no problems? Certainly the
primary XP books don't fall into that category. Nor do the articles
by the original proponents. While I would agree that there have been
certain excesses of enthusiasm amongst the original proponents (and I
have been as guilty as any), none of us has claimed silver-bullet
status for XP/Agile.

>It's hard to
>find blogs that dissect both successful and unsuccessful XP projects and
>systematically discuss the consequences of the various practices, which is
>what you'd expect if the author wanted to be taken seriously.

I'm astounded. There has been a rather large amount of critical and
thoughtful writing about XP/Agile in the magazines and newsgroups.
Folks have tried it this way and that, and have commented on how the
practices apply to them and their situations. Whole books have been
published with the research data on certain practices.

> Instead, you
>get pictures of happy programmers on the cover of Software Development
>magazine, links to projects that seem to go dead after a while, and links to
>puff pieces.

You also get links to projects that are succeeding and continue to
succeed, as well as articles in Dr. Dobbs about the problems of
XP/Agile, and links to critical and thoughtful discussions on many
blogs and newsgroups.

>None of this is anti-XP; it's just that evangelical writing
>tends to come across as a little bit silly when presented to a professional
>audience.

So does anti-writing; especially when it isn't based on any kind of
facts.

Phlip

Jul 6, 2005, 2:16:28 PM7/6/05
to
Robert C. Martin wrote:

> Does it? Oh I agree that there are some over-enthusiastic blurbs
> written here and there. But where are the serious articles and books
> that treat XP as a discipline that has no problems? Certainly the
> primary XP books don't fall into that category.

Strong the Dark Side Is

TDD's force can cause problems. Used alone, without checks and balances from
code reviews and feature reviews, frequent testing can help add many useless
features, providing a very high apparent velocity. Used incompletely, with
huge gaps between tests, TDD can reinforce bad code.

Some folks write a few tests, then write code and "refactor" for a long
time, without frequent testing. These behaviors add back the problems that
TDD helps a healthy process avoid.

TDD applies our Agile "headlights metaphor" in miniature. Imagine headlights
that can only strobe, not shine continuously. Each time you hit the test
button, you get a glimpse of your implementation's road ahead. So to change
direction, you must test more often, not less.

Teach your colleagues to write tests on code you understand, and learn to
write tests they understand. This learning begins as a team collaborates, at
project launch time, to install and use a testing framework.

--
Phlip
http://www.c2.com/cgi/wiki?ZeekLand

Robert C. Martin

Jul 6, 2005, 4:27:21 PM7/6/05
to
On 5 Jul 2005 07:01:00 -0700, "krasicki" <Kras...@gmail.com> wrote:

>Given that *XP practices* are largely rebranded existing good practice

Wait, I thought you said that XP was seat-of-the-pants, Rube Goldberg,
undisciplined, etc. Now it's existing good practice?

>I will assert that no such push exists. I submit to you that XP
>advocates are simply looking outside their shell and discovering that
>good programming practices existed despite their claims.

OK, so is your complaint is more about XP *advocates* and less about
XP itself?

>But failing
>to acknowledge that, XP advocates, as usual, run ahead of the parade
>claiming credit for the celebration.

I think that's interesting since, from the very start, "XP Advocates"
have said that the practices of XP are based on previous best
practices.
>
>Project Mercury would have never gotten off the ground if it had been
>developed in a financial services, insurance, banking, or commericial
>enterprise using *agile* methodologies.

The software for the Mercury Space Capsule was written iteratively.
The iterations were a day long. Unit tests were written in the
morning, and made to pass in the afternoon.

>Let's not sell kool-aid here. There are plenty of places wallowing in
>their own fecal ideas that will get on these newsgroups and testify how
>good it feels - come join us. Be careful not to sound like them.

By the same token, I advise you to try to sound a bit less like Howard
Dean. If you want to engage in a civilized debate over XP, then get
your facts together, and have at it. But spewing emotional baggage
around benefits nobody.

Robert C. Martin

Jul 6, 2005, 4:26:50 PM7/6/05
to
On 5 Jul 2005 18:26:36 -0700, "krasicki" <Kras...@gmail.com> wrote:

>Robert C. Martin wrote:

>> As for the audience for XP evaporating, I think you need to actually
>> check your facts instead of stating your opinion AS fact. There is
>> still a large and growing audience for XP.
>
>There is still a large and growing audience for the movie, Plan 9 from
>Outer Space. I'm not holding my breath that it will be reconsidered
>for an Oscar.

You are watching too much "House".

>
>>
>> >So, to be honest it is called
>> >agile to remove the tarnish of the extreme labeling AND to emphasize
>> >peppiness rather than dwell on the ever-present short-comings of the
>> >pseudo-methodology.
>>
>> To be honest, you weren't there. All you have are opinions. I have
>> no problem with you expressing your opinions, but I suggest you
>> represent them as opinions as opposed to fact.
>
>The posters here can google the 'fact' that a number of XP proponents
>bemoaned ever calling the methodology 'extreme' because it had become
>such a loaded phrase culturally and politically.

Wrong fact. The fact I was disputing was that the audience for XP had
evaporated. It has not.

> Tell me you go into
>conservative insurance, banking, and financial services meetings
>emphasizing the extreme nature of your methodology or the
>revolutionary (enterprise) culture shock they entail.

Sometimes, it depends on whether the folks there have developed an
unreasoned fear of the word "extreme".

>
>Who's being deceitful here?

Nobody except those who offer opinions as fact.

>> Facts would be useful here. My experience has shown that agile
>> techniques strongly prepare software for change.
>
>My problem is that they should strongly prepare software for long-term
>production activity because the software conforms to spec and QA.

XP demands that, every week, the software pass tests written by QA and
Business. These tests specify the system both functionally and
non-functionally. They establish exactly the criterion you mention
above; and they do so unambiguously and repeatably.

>There is no reason to predict change if the job is done right.

That is the silliest thing I've seen you write. Even perfect systems
must change as the world changes around them. Indeed, perfection
itself is a moving target. A system that meets all stated
requirements today, will be sub-optimal tomorrow because the world
will change around it. And you know it. Nobody could be in this
business without the truth of that being ground into his or her bones.

>> I have seen
>> significant changes easily propagate through systems that were built
>> using agile techniques. I have also seen non-agile projects falter
>> and stall when changes needed to be applied.
>
>Is this mildly misleading or are we pretending the audience for this
>discussion are idiots?

Neither. It's simply the truth. I suppose that the statement could
be considered misleading because I did not say that I have seen the
opposite. I did not feel the need to be balanced because I was
simply refuting your Dean-isms that XP leads to Rube Goldberg,
seat-of-the-pants, non-rigorous systems.

>> >The agile practices are no more
>> >adept at change than anything else.
>>
>> Again, facts would be useful.
>
>Years ago I chased all of you around the proverbial 'fact-checking'
>bush and got nowhere.

"All of us"? Anyway, I'm not sure I understand your point. Let's for
the moment say that your statement is accurate, and "all of us" did,
in fact, avoid the facts. Is it your argument that you are therefore
relieved of the standard you tried to hold us to? In any case, I
don't know what your "years ago" reference is about. As for facts,
there are plenty out there (both positive and negative), if you are
willing to do some due diligence prior to debate.


>
>I agree facts would be useful. After so many years, where are your
>facts?

What would you like to know? I can point you to both successful and
failed XP projects. I can point you to articles written by companies
claiming huge productivity and quality benefits. I can point you to
research studies, both positive and negative.

And, in fact, with a little tiny bit of elbow grease you could find
them yourself, because they are all freely available on the net, and
respond nicely to Google searches.

>> I can provide a simple counter fact.
>> Having a large batch of unit tests and acceptance tests that can be
>> run against the system in a matter of minutes, makes it much easier to
>> make changes to that system simply because it's easier to verify that
>> the change hasn't broken anything.
>
>And what design do you contrast the running system to, to know that it
>is doing what it is intended to do? Or is your assertion that the
>running design is infallible?

I presume you are referring to the functional design; i.e. the design
of the requirements. We contrast the running system against the
design specified by the acceptance tests and unit tests.

>>
>> And here's an opinion, backed by a lot of observation and experience:
>> Writing tests first forces a design viewpoint that strongly encourages
>> decoupling, and that therefore fosters change.
>
>My goal is not to foster change. And bad tests, testing faulty
>assumptions yield successful test results. Without a well documented
>design that exposes such flaws you have no metric to evaluate the
>quality of what you are doing. But let's not dwell on quality.

Tests *are* documents. Bad tests are bad documents. Bad documents
improperly specify the system. Executable tests, written in two forms
(unit and acceptance) are a very effective way to eliminate most bad
specifications.

>> Finally, here are some other observations from various teams that I
>> have coached. Customers are very happy that their input is heard
>> early and often. Executives love the fact that real progress is
>> measured on a regular (weekly) basis, and that stakeholders are
>> providing feedback in real time. All these things promote change
>> IMHO.
>
>What promotes confidence that the thing works correctly? Bells,
>whistles, and favorite colors?

Users observing and using the system from iteration to iteration, and
release to release, while providing continuous feedback.

>> >What agile sells as response to
>> >change is really immediate gratification.
>>
>> There is nothing wrong with immediate gratification so long as nothing
>> else is lost. Indeed, so long as nothing else is lost, immediate
>> gratification is better than deferred gratification. The evidence
>> suggests that nothing else is lost. Indeed, the evidence suggests
>> that the systems turn out *better*.
>
>What evidence and how was this evidence accumulated? All billable
>hours accounted for?

I was thinking specifically of the work we've been doing on FitNesse,
and the work I've seen on JUnit, and Eclipse, as well as the systems
that I have seen in my role as a consultant and coach.

>> This shouldn't be a big surprise. Any control system works better
>> when you shorten the feedback loops.
>
>Only if the feedback makes sense.

That statement gives you an opportunity to say something concrete as
opposed to amorphous disparagements. What, in particular, about the
feedback loops in XP, doesn't make sense? I suggest you do a bit of
research on just what those feedback loops are, and what control
mechanisms XP employs around those feedback loops.

>> Agile Methods are NOT a mad rush to functionality. They are not
>> dotcom stupidity. Indeed, the agile methods value high quality code
>> and high quality designs more than any other methods I know of.
>> Consider rules such as "no duplicate code", "write tests before code",
>> "Don't let the sun set on bad code", etc, etc. There are very strong
>> values that are backed up by disciplines.
>
>Parse the sentence. Everything you value is code-centric. Open your
>mind to OOD.

I agree that I put a lot of value on code. Code is the medium in
which I work, and in which all software systems are eventually built
(by definition). As such, I think that it is very appropriate to
value code. Not that I don't value requirements, I do. Indeed, I put
a lot of research effort into finding better ways to gather, express,
and refine requirements (e.g. www.fitnesse.org).

Open my mind to OOD? I've been writing books and articles about OOD
for over ten years. I was an early adopter, and have worked hard to
advance the state of the art. I think my mind is open to OOD; though
I am always ready to learn something new.

But I'll turn this around on you. Open your mind to XP. For a very
long time you have posted negative statements on this newsgroup that
show that you know very little about it.

>Your argument is not compelling today any more than it was many years
>ago.

Unfortunately yours are no more informed than they were years ago.

topmind

unread,
Jul 6, 2005, 7:11:38 PM7/6/05
to

Robert C. Martin wrote:
> On 2 Jul 2005 10:55:05 -0700, "topmind" <top...@technologist.com>
> wrote:
>
> >In my domain one often cannot know ahead of time what will change.
>
> It's not so much a matter of knowing what will change, or even how it
> will change. It's a matter of recognizing that certain things will
> change at a different rate than others. For example, report formats
> will change at a different rate than business rules.

I am not sure what you mean. Both change, often at different
unpredictable rates.

> GUI layout
> will change at a different rate than database schemae.

Yes, but one cannot say in *advance* what will change faster.

>
> You don't even have to know whether one will change more frequently
> than the other. You just have to be able to make a reasoned guess
> that they will change at different rates and for different reasons.

Do you mean knowing the actual reason and rate? Or just knowing they
will be different?

>
> We try not to couple report formats to business rules because it would
> be a shame to inadvertently break the business rules by moving a
> column on a report.

Please clarify. I can think of situations where relating them may save
time and situations where relating them would cause headaches.

> We try to decouple the GUI layout from the
> database schemae because it would be a shame to crash the GUI when
> adding a new column to the database.

On the flip side it is sometimes nice to have a column *automatically*
appear in the CRUD (edit screens) realm so that we don't have to make
the same column addition in two or more different places. There is no
One Right Coupling decision here. Adding new columns and having to sift
through code to manually make that addition to multiple spots can be
time-consuming.

This is one reason I like data dictionaries: describe columns in one
and only one place and have whatever needs that info use it. I agree it
is not always that simple because it is subject to the 80-20 or 90-10
rule where 10% of the time we need a custom, local tweak that deviates
from the "standard" behavior. If we add a new data dictionary
attribute/flag for each exception (deviation), we have a big mess
(large interface) after a while. This is true with any "generic"
framework whether it be via OO, procedural, FP, etc.

(The OO equiv. of data dictionaries is a Field Object, by the way.)
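[Editor's sketch of the data-dictionary idea above: column metadata defined in one and only one place, consumed by both a report and an edit screen. All names and fields are illustrative.]

```python
# Data dictionary: each column is described exactly once.
DATA_DICTIONARY = {
    "name":  {"label": "Customer Name", "type": str,   "width": 30},
    "phone": {"label": "Phone",         "type": str,   "width": 12},
    "limit": {"label": "Credit Limit",  "type": float, "width": 10},
}

def report_header():
    """A report consumes the dictionary for its column headings."""
    return " | ".join(
        spec["label"].ljust(spec["width"]) for spec in DATA_DICTIONARY.values()
    )

def crud_fields():
    """An edit screen consumes the same dictionary, so a column added to
    the dictionary appears in both places without touching either one."""
    return [(col, spec["label"], spec["type"].__name__)
            for col, spec in DATA_DICTIONARY.items()]
```

The 80-20 caveat in the post applies: each local deviation from the dictionary's defaults tends to grow the per-column metadata.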

BTW2, I would like to see you rework your "coupling" concept into a
change analysis and change pattern analysis focus. I think many would
find that more useful and subject to more objective observation and
metrics. One may disagree about the frequencies of certain changes (as
we seem to do), but the impact on code per change pattern is fairly
objective. Thus, the discipline can be divided into frequency analysis
and impact analysis, with the latter being more documentable thus far.

You may be well-positioned for this because most OO authors seem to
focus on "mental models" of the real world while you focus more on code
structure, which is more based on concrete,
western-reductionalist-style analysis instead of the prevalent OO fuzzy
eastern style that drives me up the wall. (Not that eastern style is
"bad", just less analyzable at this stage in history.)

>
>
> -----
> Robert C. Martin (Uncle Bob) | email: uncl...@objectmentor.com

-T-

Thomas Gagne

unread,
Jul 6, 2005, 7:29:28 PM7/6/05
to
Phlip wrote:
>
> OO is the shadows of things on the cave wall.

:-)

Robert C. Martin

unread,
Jul 6, 2005, 8:11:38 PM7/6/05
to
On 6 Jul 2005 16:11:38 -0700, "topmind" <top...@technologist.com>
wrote:

>
>
>Robert C. Martin wrote:
>> On 2 Jul 2005 10:55:05 -0700, "topmind" <top...@technologist.com>
>> wrote:
>>
>> >In my domain one often cannot know ahead of time what will change.
>>
>> It's not so much a matter of knowing what will change, or even how it
>> will change. It's a matter of recognizing that certain things will
>> change at a different rate than others. For example, report formats
>> will change at a different rate than business rules.
>
>I am not sure what you mean. Both change, often at different
>unpredictable rates.

Exactly. They change at different rates. If we couple them, we will
be forced to make changes to one because of the other.

>> GUI layout
>> will change at a different rate than database schemae.
>
>Yes, but one cannot say in *advance* what will change faster.

Sometimes you can, and sometimes you can't; but it doesn't matter.
The issue is that they change for different reasons.

>Do you mean knowing the actual reason and rate? Or just knowing they
>will be different?

The latter.

>> We try not to couple report formats to business rules because it would
>> be a shame to inadvertently break the business rules by moving a
>> column on a report.
>
>Please clarify. I can think of situations where relating them may save
>time and situations where relating them would cause headaches.

I think that was pretty clear. If we couple the report format to the
business rules, (for example, by doing computations at the same time
that we are generating the report) then a change to the format of the
report will break the business rules. Or rather, when you change the
report format you'll have to make similar changes to the structure of
the business rule algorithm.
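[Editor's sketch of the coupling being described: in the first version the business rule (totaling) is computed while formatting the report, so a format change touches the rule; in the second the rule stands alone. Names are hypothetical.]

```python
# Coupled: the business rule (summing line items) is buried inside the
# report formatting, so restructuring the format risks the rule.
def coupled_report(line_items):
    lines, total = [], 0.0
    for desc, amount in line_items:
        total += amount                              # business rule
        lines.append(f"{desc:<20}{amount:>8.2f}")    # formatting
    lines.append(f"{'TOTAL':<20}{total:>8.2f}")
    return "\n".join(lines)

# Decoupled: the rule lives on its own; any number of formats reuse it.
def order_total(line_items):
    return sum(amount for _, amount in line_items)

def text_report(line_items):
    body = "\n".join(f"{d:<20}{a:>8.2f}" for d, a in line_items)
    return body + f"\n{'TOTAL':<20}{order_total(line_items):>8.2f}"
```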


>
>> We try to decouple the GUI layout from the
>> database schemae because it would be a shame to crash the GUI when
>> adding a new column to the database.
>
>On the flip side it is sometimes nice to have a column *automatically*
>appear in the CRUD (edit screens) realm so that we don't have to make
>the same column addition in two or more different places. There is no
>One Right Coupling decision here. Adding new columns and having to sift
>through code to manually make that addition to multiple spots can be
>time-consuming.

Yes, it depends. If we are writing a program to throw away in a day
or two, then we might take a short-cut like that. On the other hand,
if we are developing a system that must survive through years of
changing requirements, then coupling the GUI to the Schema is suicide;
and the time you might save in so doing is a false economy.
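[Editor's sketch of the shortcut under debate: edit-screen fields generated straight from the schema, so a new column shows up on the CRUD screen automatically. Names are illustrative; whether this coupling saves time or is a false economy is exactly the disagreement above.]

```python
# A toy schema; in a real system this would be read from the database.
SCHEMA = [("name", "TEXT"), ("phone", "TEXT"), ("limit", "REAL")]

def generate_edit_form(schema):
    """Derive CRUD edit-screen fields directly from the schema, so a
    new column appears on the screen without touching GUI code."""
    widget_for = {"TEXT": "textbox", "REAL": "numberbox"}
    return [(col, widget_for.get(sql_type, "textbox"))
            for col, sql_type in schema]
```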

-----
Robert C. Martin (Uncle Bob) | email: uncl...@objectmentor.com

krasicki

unread,
Jul 6, 2005, 11:25:26 PM7/6/05
to
Robert C. Martin wrote:
> On 5 Jul 2005 18:26:36 -0700, "krasicki" <Kras...@gmail.com> wrote:
>
> >Robert C. Martin wrote:
>
> >> As for the audience for XP evaporating, I think you need to actually
> >> check your facts instead of stating your opinion AS fact. There is
> >> still a large and growing audience for XP.
> >
> >There is still a large and growing audience for the movie, Plan 9 from
> >Outer Space. I'm not holding my breath that it will be reconsidered
> >for an Oscar.
>
> You are watching too much "House".
-snip-

>
> Wrong fact. The fact I was disputing was that the audience for XP had
> evaporated. It has not.

Nor has it significantly grown. And you will note that I made no
assertion that it had evaporated. I was making the point that no matter
how bad something might be, even bad has its audience.

>
> > Tell me you go into
> >conservative insurance, banking, and financial services meetings
> >emphasizing the extreme nature of your methodology or the
> >revolutionary (enterprise) culture shock they entail.
>
> Sometimes, it depends on whether the folks there have developed an
> unreasoned fear of the word "extreme".
> >
> >Who's being deceitful here?
>
> Nobody except those who offer opinions as fact.

And what about those who are deceptive with facts?

>
> >> Facts would be useful here. My experience has shown that agile
> >> techniques strongly prepare software for change.
> >
> >My problem is that they should strongly prepare software for long-term
> >production activity because the software conforms to spec and QA.
>
> XP demands that, every week, the software pass tests written by QA and
> Business. These tests specify the system both functionally and
> non-functionally. They establish exactly the criterion you mention
> above; and they do so unambiguously and repeatably.

Twaddle. One of the well-known problems with OOD in general is that
keeping complex system functional specs up to date is a full-time job.
I do not believe for a minute that comprehensive re-analysis of
interim designs can be or is being performed, let alone analysis of
what the system is NOT doing.

Tell me this is a fact.

>
> >There is no reason to predict change if the job is done right.
>
> That is the silliest thing I've seen you write. Even perfect systems
> must change as the world changes around them. Indeed, perfection
> itself is a moving target. A system that meets all stated
> requirements today, will be sub-optimal tomorrow because the world
> will change around it. And you know it. Nobody could be in this
> business without the truth of that being ground into his or her bones.

You may find this hard to believe but systems written thirty years ago
are still in production working fine. The systems are occasionally
upgraded and maintained but 2 + 2 is still 4. And lots of business
functionality is just that straightforward.

>
> >> I have seen
> >> significant changes easily propagate through systems that were built
> >> using agile techniques. I have also seen non-agile projects falter
> >> and stall when changes needed to be applied.
> >
> >Is this mildly misleading or are we pretending the audience for this
> >discussion are idiots?
>
> Neither. It's simply the truth. I suppose that the statement could
> be considered misleading because I did not say that I have seen the
> opposite. I did not feel the need to be balanced because I was
> simply refuting your Dean-isms that XP leads to Rube Goldberg,
> seat-of-the-pants, non-rigorous systems.

Oh, let me get this straight, *I* made you be misleading. Your honor,
he made me do it - him and Dean - not to mention, uh hm, Rube.

>
> >> >The agile practices are no more
> >> >adept at change than anything else.
> >>
> >> Again, facts would be useful.
> >
> >Years ago I chased all of you around the proverbial 'fact-checking'
> >bush and got nowhere.
>
> "All of us"? Anyway, I'm not sure I understand your point. Let's for
> the moment say that your statement is accurate, and "all of us" did,
> in fact, avoid the facts.

Finally, some fresh air.

> Is it your argument that you are therefore
> relieved of the standard you tried to hold us to?

Are you asking that *moi* should provide facts despite your ability to
cleverly dodge the -cough- minor issue yourself? (satire, lest I be
quoted literally) I'll have you know that I am not beneath being agile,
slippery, and dodgy all on my own! HARRUPH! The nerve of some people.
(end satire)

> In any case, I
> don't know what your "years ago" reference is about. As for facts,
> there are plenty out there (both positive and negative), if you are
> willing to do some due diligence prior to debate.

If you google hard enough you'll note that I was one of the original
critics who provided those facts oh so many years ago. So
discomforting did some of your colleagues become that they moved their
traveling sideshow to private Yahoo group discussions.

> >
> >I agree facts would be useful. After so many years, where are your
> >facts?
>
> What would you like to know? I can point you to both successful and
> failed XP projects. I can point you to articles written by companies
> claiming huge productivity and quality benefits. I can point you to
> research studies, both positive and negative.
>
> And, in fact, with a little tiny bit of elbow grease you could find
> them yourself, because they are all freely available on the net, and
> respond nicely to Google searches.
>
> >> I can provide a simple counter fact.
> >> Having a large batch of unit tests and acceptance tests that can be
> >> run against the system in a matter of minutes, makes it much easier to
> >> make changes to that system simply because it's easier to verify that
> >> the change hasn't broken anything.
> >
> >And what design do you contrast the running system to to know that it
> >is doing what it is intended to do? Or is your assertion that the
> >running design is infallible?
>
> I presume you are referring to the functional design; i.e. the design
> of the requirements. We contrast the running system against the
> design specified by the acceptance tests and unit tests.

And you know perfectly well that that avoids the point of the question.
The acceptance tests and unit tests are little more than the narrative
design of what's taken place right or wrong. How do you distinguish
right from wrong? What's the metric unit?

> >>
> >> And here's an opinion, backed by a lot of observation and experience:
> >> Writing tests first forces a design viewpoint that strongly encourages
> >> decoupling, and that therefore fosters change.
> >
> >My goal is not to foster change. And bad tests, testing faulty
> >assumptions yield successful test results. Without a well documented
> >design that exposes such flaws you have no metric to evaluate the
> >quality of what you are doing. But let's not dwell on quality.
>
> Tests *are* documents. Bad tests are bad documents. Bad documents
> improperly specify the system. Executable tests, written in two forms
> (unit and acceptance) are a very effective way to eliminate most bad
> specifications.

Tea leaves strained at the bottom of tea cups can be considered
documentation as well. The rest of your assertion makes no sense at
all. Bad things happen. The way to avoid bad things is to make sure you
write good things.

Gotcha.

> >> Finally, here are some other observations from various teams that I
> >> have coached. Customers are very happy that their input is heard
> >> early and often. Executives love the fact that real progress is
> >> measured on a regular (weekly) basis, and that stakeholders are
> >> providing feedback in real time. All these things promote change
> >> IMHO.
> >
> >What promotes confidence that the thing works correctly? Bells,
> >whistles, and favorite colors?
>
> Users observing and using the system from iteration to iteration, and
> release to release, while providing continuous feedback.

How do they ensure their evaluations are correct? And if their bonuses
depend on the expeditious delivery of a system, how reliable is that
feedback?

> >> >What agile sells as response to
> >> >change is really immediate gratification.
> >>
> >> There is nothing wrong with immediate gratification so long as nothing
> >> else is lost. Indeed, so long as nothing else is lost, immediate
> >> gratification is better than deferred gratification. The evidence
> >> suggests that nothing else is lost. Indeed, the evidence suggests
> >> that the systems turn out *better*.
> >
> >What evidence and how was this evidence accumulated? All billable
> >hours accounted for?
>
> I was thinking specifically of the work we've been doing on FitNesse,
> and the work I've seen on JUnit, and Eclipse, as well as the systems
> that I have seen in my role as a consultant and coach.

Yes. Software written for developers, not commercial applications with
budgets, deadlines, interoperability issues with legacy systems still
using screen scraping techniques.

You live an enchanted life.

> >> This shouldn't be a big surprise. Any control system works better
> >> when you shorten the feedback loops.
> >
> >Only if the feedback makes sense.
>
> That statement gives you an opportunity to say something concrete as
> opposed to amorphous disparagements. What, in particular, about the
> feedback loops in XP, doesn't make sense? I suggest you do a bit of
> research on just what those feedback loops are, and what control
> mechanisms XP employs around those feedback loops.
>
> >> Agile Methods are NOT a mad rush to functionality. They are not
> >> dotcom stupidity. Indeed, the agile methods value high quality code
> >> and high quality designs more than any other methods I know of.

High quality *code* design that is.


> >> Consider rules such as "no duplicate code", "write tests before code",
> >> "Don't let the sun set on bad code", etc, etc. There are very strong
> >> values that are backed up by disciplines.
> >
> >Parse the sentence. Everything you value is code-centric. Open your
> >mind to OOD.
>
> I agree that I put a lot of value on code. Code is the medium in
> which I work, and in which all software systems are eventually built
> (by definition). As such, I think that it is very appropriate to
> value code. Not that I don't value requirements, I do. Indeed, I put
> a lot of research effort into finding better ways to gather, express,
> and refine requirements (e.g. www.fitnesse.org).

You spelled fitness wrong.

The medium I work in is thinking. Remember the IBM reminder: THINK.
Design is about formalizing thought, not code. And that thought is
applied to problem solving, not code generation.

Maybe it's a difference between you and me.

>
> Open my mind to OOD? I've been writing books and articles about OOD
> for over ten years. I was an early adopter, and have worked hard to
> advance the state of the art. I think my mind is open to OOD; though
> I am always ready to learn something new.

I was introduced to OOD by Shelly and Cashman in the late seventies. I
write no books and give no lectures.

>
> But I'll turn this around on you. Open your mind to XP. For a very
> long time you have posted negative statements on this newsgroup that
> show that you know very little about it.

I know a lot about it and my posts have contributed to its maturity and
rebranding. I have no use for XP but I practice good code development
anyway using many of the gut techniques and practices XP claims as its
own.

XP is not the first or last word in good software development and it
holds no magic or power for me. I don't hate it or its proponents I
just don't advocate it. Does that make me close minded?

>
> >Your argument is not compelling today any more than it was many years
> >ago.
>
> Unfortunately yours are no more informed than they were years ago.
>

And, as usual, you've brought nothing to feed the hungry.

krasicki

unread,
Jul 7, 2005, 12:00:59 AM7/7/05
to
Robert C. Martin wrote:
> On 5 Jul 2005 07:01:00 -0700, "krasicki" <Kras...@gmail.com> wrote:
>
> >Given that *XP practices* are largely rebranded existing good practice
>
> Wait, I thought you said that XP was seat-of-the-pants, Rube Goldberg,
> undisciplined, etc. Now it's existing good practice?

XP is a lightweight methodology of practices.

Practices within that methodology can be good or bad. The absence of
better practices within the methodology can be good or bad.

Many of the worthwhile practices co-opted into XP existed before,
during, and after the XP gold rush sometimes unbeknownst to the
advocates.

XP was not promoting testing years ago. XP retreated to an emphasis on
testing because few people argue against testing. But testing advocacy
doesn't validate XP as a good methodology per se.

>
> >I will assert that no such push exists. I submit to you that XP
> >advocates are simply looking outside their shell and discovering that
> >good programming practices existed despite their claims.
>
> OK, so is your complaint is more about XP *advocates* and less about
> XP itself?

I care precious little for either. One has to acknowledge the noise of
XP nonetheless and the muddying of the waters having to do with
Object-oriented anything.

>
> >But failing
> >to acknowledge that, XP advocates, as usual, run ahead of the parade
> >claiming credit for the celebration.
>
> I think that's interesting since, from the very start, "XP Advocates"
> have said that the practices of XP are based on previous best
> practices.

I can remember weeks of such debates in which XP advocates were in
denial of this.

> >
> >Project Mercury would have never gotten off the ground if it had been
> >developed in a financial services, insurance, banking, or commercial
> >enterprise using *agile* methodologies.
>
> The software for the Mercury Space Capsule was written iteratively.
> The iterations were a day long. Unit tests were written in the
> morning, and made to pass in the afternoon.

And where did the design come from? Was that fabricated on a daily
basis as well?

>
> >Let's not sell kool-aid here. There are plenty of places wallowing in
> >their own fecal ideas that will get on these newsgroups and testify how
> >good it feels - come join us. Be careful not to sound like them.
>
> By the same token, I advise you to try to sound a bit less like Howard
> Dean. If you want to engage in a civilized debate over XP, then get
> your facts together, and have at it. But spewing emotional baggage
> around benefits nobody.

Every time I begin to feel bad at how personal some of these exchanges
sound you remind me that they are.

krasicki

unread,
Jul 7, 2005, 12:18:35 AM7/7/05
to

Au contraire. The bricks all passed unit tests. As did the cement,
steel, and so on. And the plans all had feedback. And the customer
surely showed up with a glowing smile watching the obvious progress.
And progress happened every day.

The elevator worked fine. Up. Down. Ring, ring. All positive
feedback.

The workers sweated. The execs wore suits and went golfing.

>
> > These are true stories. Shouldn't all of the architects of these
> > buildings have expected change to happen as well. Same with the
> > builders. maybe build with everything loose so that it can be
> > reassembled when the next minor detail arises? Aren't we being told
> > this is the way things work?
>
> Well, the fact is software is malleable. In fact it is too malleable.
> It isn't hard to change software at all. All you have to do is type a
> couple of characters in any program and you can break it. Because that
> is the way that software is, we need tests to give it backbone.

It's not that malleable. Once in production, software is very hard to
change for all kinds of political reasons.

In fact a big problem for architects and designers is having
programmers undermine design activity with too much dog and pony
prototyping. Bad ideas become adopted before any discussion of the
larger picture can be formulated.

Of course you need tests. We aren't a bunch of ninnies here.

> > Even carpenters measure before they cut. Yet, in computer science we
> > are being told that we should operate as though we are all alcoholics
> > and take things one day at a time.
>
> The problem is: misunderstanding the material you are working with.
> Code is not wood or concrete.

But spent resources are. Nobody fixes anything for free. And bad code
applied to millions of daily transactions can cost companies or
customers lots and lots of money when wrong.

Testing is tricky stuff and complex logic errors don't get discussed
when daily iterations are the norm because there is no time.

Design and OOD are not code or code design.

krasicki

unread,
Jul 7, 2005, 12:49:10 AM7/7/05
to
Laurent Bossavit wrote:
> > So sixteen of the rooms were incorrectly built for handicapped access,
> > an error of three inches per room. Where to get three inches? Push
> > the rooms into the hall and the hall fails. You can see how this goes.
>
> We can see how this goes for buildings. But we're not buildings experts,
> and unless I missed something neither are you. You're asking us to draw
> insight from a foreign domain about which we know *less* than the domain
> we're supposed to apply it to - software. That's the wrong direction to
> operate an intuition pump.

And coders are not system architects or system designers, yet XP sells
that idea.

Anyone who has had to try to fit yet another piece of software between
the cracks of systems boundaries understands the problem. To increase
security you slow transaction times to levels unsatisfactory to system
specifications, these are typical conundrums in systems today. Every
tweak has a tradeoff. You cannot tradeoff one thing for another if
you're handed coding minutia.

And you, Laurent, miss the point. This thread is about design, not
software. Design is not a daily feel-good touchstone with Skippy.
Design involves hard work and great responsibility above and beyond
coding group hugs.

>
> Your rant, while entertaining, is at the level of caricature rather than
> real insight; for insight, read something like Stewart Brand's /How
> Buildings Learn/, which has real stories about architecture.

I thought you didn't like the intuition pump thing.

And, it's not a rant.

>
> Most software has the property that if you twiddle one small bit
> incorrectly, an entire system might crash with disastrous consequences.
> (Think off-by-one errors.) As far as I can tell, no building has this
> property - if you remove one brick, even a brick at the very bottom, the
> building stays up.

Civic center roof collapses during the seventies are examples of just
that phenomenon. Miscalculations of local snowfall and so on were the
culprits.

The space shuttle O-ring disaster is another example.

Or the NASA Mars probe that performed math conversions incorrectly.

During the nineties, Gingrich (I think) thought pennies were too
expensive to make. So they stopped using copper for a short period of
time for another metal. Trouble was that babies swallowed copper
pennies that passed through their systems without incident. The new
pennies decomposed in babies' stomachs, making them severely ill.
Pennies are copper once again.

All kinds of things can cause disasters.

>
> You're saying that software design should be based in theory. You can't
> have your cake and eat it too - the *process* whereby something is
> designed should certainly take into account the characteristic
> properties of the thing being designed. (Call that meta-design.)

No we won't call it meta-design. The design of OOD objects is
meta-design.

OOD does take software development into consideration as one of many
factors.

>
> Buildings are brick. Software is text. Brick and text probably call for
> different design processes.

Solving problems can all be done using design notation for the ideas
expressed.

Les Cargill

unread,
Jul 7, 2005, 1:19:29 AM7/7/05
to

Not the right feedback. It's a failure of requirements capture,
pure and simple. No methodology nor any other thing, other
than collecting all the relevant requirements and checklisting
them, would have made a bit of difference.

> And the customer
> surely showed up with a glowing smile watching the obvious progress.
> And progress happened every day.
>
> The elevator worked fine. Up. Down. Ring, ring. All positive
> feedback.
>
> The workers sweated. The execs wore suits and went golfing.
>
>
>> > These are true stories. Shouldn't all of the architects of these
>> > buildings have expected change to happen as well. Same with the
>> > builders. maybe build with everything loose so that it can be
>> > reassembled when the next minor detail arises? Aren't we being told
>> > this is the way things work?
>>
>>Well, the fact is software is malleable. In fact it is too malleable.
>>It isn't hard to change software at all. All you have to do is type a
>>couple of characters in any program and you can break it. Because that
>>is the way that software is, we need tests to give it backbone.
>
>
> It's not that malleable. Once in production software is very hard to
> change for all kinds of political reasons.
>

It shouldn't go into production with defects that are gonna
cost people money, at least without an enforceable plan
to get the defects out, upfront.

Once in production, somebody has to make the decisions of
when, how and why to deploy upgrades.

> In fact a big problem for architects and designers is having
> programmers undermine design activity with too much dog and pony
> prototyping.

How is that possible? Other than time being wasted, prototyping
is harmless. Prototypes should not even be attempted until
there's a specific question or suite of questions they are
to answer. If it's a sandboxed prototype, just to let the
programmers play, then chuck it, or put it away. You
still need specific deliverables from the prototyping.

> Bad ideas become adopted before any discussion of the
> larger picture can be formulated.
>

Then they have to get rooted out and killed, or at least
triaged and weighed for "badness". Bad ideas that don't get
shot are a sign of complacency, not methodology.

> Of course you need tests. We aren't a bunch of ninnies here.
>
>
>> > Even carpenters measure before they cut. Yet, in computer science we
>> > are being told that we should operate as though we are all alcoholics
>> > and take things one day at a time.
>>
>>The problem is: misunderstanding the material you are working with.
>>Code is not wood or concrete.
>
>
> But spent resources are. Nobody fixes anything for free. And bad code
> applied to millions of daily transactions can cost companies or
> customers lots and lots of money when wrong.
>

So somebody has to do a cost-benefit analysis of when to do what.
Good code isn't free, either. This is logistics, not particularly
even software logistics.

> Testing is tricky stuff and complex logic errors don't get discussed
> when daily iterations are the norm because there is no time.
>
> Design and OOD are not code or code design.
>

You can't fix culture with tools, in other words. Mostly, yes :)

--
Les Cargill

topmind

unread,
Jul 7, 2005, 1:59:14 AM7/7/05
to
> >>
> >> It's not so much a matter of knowing what will change, or even how it
> >> will change. It's a matter of recognizing that certain things will
> >> change at a different rate than others. For example report formats
> >> are will change at a different rate than business rules.
> >
> >I am not sure what you mean. Both change, often at different
> >unpredictable rates.
>
> Exactly. They change at different rates. If we couple them, we will
> be forced to make changes to one because of the other.

Sometimes changes are related, sometimes they are not. thing-A may
change twice as fast as thing-B, but maybe 70% of all changes to
thing-B affect thing-A also. Changing faster does not necessarily mean
something is unrelated to something that changes slower. Change speed
is only one of many factors controlling relationships.

>
> >> GUI layout
> >> will change at a different rate than database schemae.
> >
> >Yes, but one cannot say in *advance* what will change faster.
>
> Sometimes you can, and sometimes you can't; but it doesn't matter.
> The issue is that they change for different reasons.

No! Sometimes they change for the same reasons.

>
> >Do you mean knowing the actual reason and rate? Or just knowing they
> >will be different?
>
> The latter.
>
> >> We try not to couple report formats to business rules because it would
> >> be a shame to inadvertently break the business rules by moving a
> >> column on a report.
> >
> >Please clarify. I can think of situations where relating them may save
> >time and situations where relating them would cause headaches.
>
> I think that was pretty clear. If we couple the report format to the
> business rules, (for example, by doing computations at the same time
> that we are generating the report) then a change to the format of the
> report will break the business rules. Or rather, when you change the
> report format you'll have to make similar changes to the structure of
> the business rule algorithm.

I would like to explore specific scenarios rather than accept broad
generalizations.

> >
> >> We try to decouple the GUI layout from the
> >> database schemae because it would be a shame to crash the GUI when
> >> adding a new column to the database.
> >
> >On the flip side it is sometimes nice to have a column *automatically*
> >appear in the CRUD (edit screens) realm so that we don't have to make
> >the same column addition in two or more different places. There is no
> >One Right Coupling decision here. Adding new columns and having to sift
> >through code to manually make that addition to multiple spots can be
> >time-consuming.
>
> Yes, it depends. If we are writing a program to throw away in a day
> or two, then we might take a short-cut like that. On the other hand,
> if we are developing a system that must survive through years of
> changing requirements, then coupling the GUI to the Schema is suicide;
> and the time you might save in so doing is a false economy.

Why is it a "shortcut"? About 50% to 90% of the time new table columns
result in corresponding report and screen columns. If we keep the two
independent, then we do almost double the work every time we add or
change a column.

I am not saying a data dictionary is always the way to go, but you seem
to dismiss it without specific-enough reasoning here. Data dictionaries
can be good once-and-only-once (non-duplication). Are you against the
factoring of duplication? If we have to mention a given column name and
related attributes in 10 different places, then we are NOT factoring;
we are copying-and-pasting the same or similar information all over the
place.

Note that factoring tends to *increase* coupling because it makes
multiple spots reference (be coupled to) the same thing.

A -------> A1

B -------> B1

After factoring:

A -------> C
           ^
           |
B ---------*
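The data-dictionary idea above can be sketched in a few lines. This is a minimal illustration, not any particular tool: all names (`DICTIONARY`, `ddl`, `screen_columns`) are invented. One list of column attributes drives both the schema DDL and the edit screen, so a new column is declared exactly once:

```python
# Hypothetical data dictionary: each column's attributes are listed once.
DICTIONARY = [
    {"name": "id",     "type": "INTEGER", "label": "ID",     "show": True},
    {"name": "amount", "type": "REAL",    "label": "Amount", "show": True},
]

def ddl(table):
    # The schema DDL is generated from the dictionary...
    cols = ", ".join(f"{c['name']} {c['type']}" for c in DICTIONARY)
    return f"CREATE TABLE {table} ({cols})"

def screen_columns():
    # ...and so are the edit-screen headers, so both stay in sync.
    return [c["label"] for c in DICTIONARY if c["show"]]
```

Appending one entry to `DICTIONARY` changes both the generated table and the generated screen; that is the coupling-by-factoring being argued for.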

>
>
>
> -----
> Robert C. Martin (Uncle Bob) | email: uncl...@objectmentor.com

-T-

frebe

unread,
Jul 7, 2005, 2:43:46 AM7/7/05
to
> We try to decouple the GUI layout from the
> database schemae because it would be a shame to crash the GUI when
> adding a new column to the database.

Adding a new column to the database would in no way crash the GUI. It
is the same as adding a new method to a class. Old code using the class
will not be affected at all.

But if you you add a new column to the database, it is very likely that
the you want to show this column in the GUI too. If your GUI is
decoupled from the database schema, you would have to add a lot of
extra code in your application to be able to show the new column. If
you used a data-aware GUI component (low decoupling between GUI and
database), the only thing you would have to do is to tell the GUI
component to show this new column (or nothing at all if the GUI
component shows every column in the table).
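A data-aware component of the kind described can be sketched with sqlite3's introspection; the table and column names here are invented for illustration. The "grid" asks the database for its columns instead of hard-coding them, so a new column shows up with no GUI change:

```python
import sqlite3

def grid_headers(conn, table):
    # A data-aware grid in miniature: derive the headers from the
    # schema at runtime rather than hard-coding them in the GUI.
    cur = conn.execute(f"SELECT * FROM {table} LIMIT 0")
    return [d[0] for d in cur.description]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (name TEXT, amount INTEGER)")
headers_before = grid_headers(conn, "item")

# Add a column to the database; the grid picks it up automatically.
conn.execute("ALTER TABLE item ADD COLUMN zip TEXT")
headers_after = grid_headers(conn, "item")
```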

Fredrik Bertilsson
http://butler.sourceforge.net

frebe

unread,
Jul 7, 2005, 2:55:18 AM7/7/05
to
> if we are developing a system that must survive through years of
> changing requirements, then coupling the GUI to the Schema is suicide;
> and the time you might save in so doing is a false economy.

Why would it be suicide to have a coupling between the GUI and database
schema? As pointed out before, a simple column adding would cause you a
lot of extra coding using a decoupled approach. With the coupled
approach, the new column could appear automatically in the GUI after it
is added to the database table. It is easy to prove that the coupled
approach makes maintenance much easier.

How would it be easier to maintain a "system that must survive through
years of changing requirements" by making simple things harder?

Fredrik Bertilsson
http://butler.sourceforge.net

Laurent Bossavit

unread,
Jul 7, 2005, 4:12:06 AM7/7/05
to
> > and unless I missed something neither are you. You're asking us to draw
> > insight from a foreign domain about which we know *less* than the domain
> > we're supposed to apply it to - software. That's the wrong direction to
> > operate an intuition pump.
>
> And coders are not system architects or system designers yet XP sells
> that idea.

Not so fast, mister. Coders *are* experts about the failure modes of
software systems. There are many more things besides that it is
necessary to know when running a software development effort, but this
is certainly among the foremost.

> And you Laurent miss the point. This thread is about design not
> software. design is not a daily feelgood touchstone with Skippy.

Then provide specific (theory-backed) guidance about design, not warm
fuzzies plucked out of a worthless analogy.

> Design involves hard work and great responsibility above and beyond
> coding group hugs.

You've got me hooked and curious. *What* kind of hard work, and can you
prove that it meets the criterion for "design": it will eliminate bad
solutions from the running at a lower cost than actually going ahead and
building the thing. (Design is primarily a matter of economics.)

Laurent

hans...@hotmail.com

unread,
Jul 7, 2005, 4:08:14 AM7/7/05
to
I've read the papers and understand the discussion. However, I do not
agree with the rather myopic view of software development as being
mostly a coding activity. Neither do I see any point in defining OO as
was done in RCM's comment.
Regards,
Hans

Laurent Bossavit

unread,
Jul 7, 2005, 4:50:02 AM7/7/05
to
Krasicki,

> XP was not promoting testing years ago.

I think 1998 qualifies as "years ago". The first article ever published
about XP started, "Extreme Programming rests on the values of
simplicity, communication, *testing*, and aggressiveness." (Emphasis
mine.) It went on to describe C3's thousands of unit tests.

> XP retreated to testing emphasis because few people argue that it's good.

I would think XP might "retreat to testing" because *many* (not few)
people argue that testing is good. I wonder what you meant exactly - not
that it matters, as we've established there was no "retreat to testing".

Laurent

hans...@hotmail.com

unread,
Jul 7, 2005, 5:33:50 AM7/7/05
to
Robert C. Martin wrote:

> Absolutely! Though this has little to do with OO. I am much more
> interested in the work of Ward Cunningham and Rick Mugridge in
> expressing requirements as Tests. (See www.fitnesse.org)

By testing, I'm not sure whether you are referring to validation or
verification. I assume (maybe incorrectly) that 'test' means
execution of code, i.e. validation. I have difficulty in understanding
how all requirements can be expressed as 'tests'. I could see
problems with conditions that occur randomly or rarely. I'll have to
take a look at www.fitnesse.org.

> I disagree. I think there is a lot of value in precise definitions
> that can act as a metric against which to measure software designs.
> Given my definition, you can quickly ascertain whether a particular
> design is OO or not. Moreover, there are many benefits that are
> associated with this structure. Dependency Inversion is the primary
> mechanism behind independently deployable binary components.

I don't disagree with the value of precise definitions and metrics.
However, I do disagree with hijacking the term OO with a meaning that
is very counter intuitive - at least to me. There are strong
relationships between concepts and objects and both concepts and
objects have lots of interesting aspects. You have reduced 'OO' to
mean something very specific about how a programming language models
one single operation on concepts (generalization) and how that
operation should be used.

Yes, you can ascertain if some software is 'OO' or not with your
definition. So what? There are lots of other more important aspects of
software than being DIP compatible (see my last comment in this reply).

> I agree that describing problems is very important, and it is an
> active area of my own research (www.fitnesse.org). On the other hand,
> I think that we don't put enough effort into techniques for how to
> code problems. Far too many problems are related to poor coding
> structure. Whole systems, and whole development teams, are brought to
> their knees because their code has become so tangled and impenetrable
> that it cannot be cost-effectively maintained.

In the majority of cases where I have seen this (code mess) happen, it
has been because of a poor understanding of the problem being solved.
It has rarely been because of poor programming practices. I don't
believe it is possible to code effective solutions without having a
clear understanding of the problem at the level of the 'business'.

It is a mystery to me that most developers believe they can code a
solution to a problem when they don't understand the problem. DIP,
polymorphism (OO?) and any other technical wizardry will not help in
avoiding messy code when the problem is non-trivial. Only an
understanding of the problem or an understanding of how to deal with
the class of problems that the problem belongs to, will avoid code
disasters.

In summary: I do not believe poor coding practices are the main problem
in IT development. Knowing, understanding and describing problems are
at the heart of developing good software. Maybe the situation is
different in other domains, but in the financial domain I am convinced
that I am correct :-)

Regards,
Hans Ewetz

Daniel Parker

unread,
Jul 7, 2005, 6:13:19 AM7/7/05
to
"Robert C. Martin" <uncl...@objectmentor.com> wrote in message
news:8c5oc15o20asviqcr...@4ax.com...

> On Wed, 6 Jul 2005 08:00:08 -0400, "Daniel Parker"
> <danielaparker@spam?nothanks.windupbird.com> wrote:
>
>
>>It's hard to
>>find blogs that dissect both successful and unsuccessful XP projects and
>>systematically discuss the consequences of the various practices, which is
>>what you'd expect if the author wanted to be taken seriously.
>
> I'm astounded. There has been a rather large amount ...

Oh, good. Can you provide a link to a site that does the above?

Thanks,
Daniel


Michael Feathers

unread,
Jul 7, 2005, 8:31:58 AM7/7/05
to
krasicki wrote:
> XP was not promoting testing years ago. XP retreated to testing
> emphasis because few people argue that it's good. But testing advocacy
> doesn't validate XP as a good methodology per se.

Nope. Testing was a core practice of XP from the beginning. It was in
the white book, as test-first and functional testing. If you go to Ron
Jeffries' site you'll find the original writeups of the C3 project's
practices:

http://xprogramming.com/Practices/xpractices.htm

And, if I remember correctly, the paper submitted to OOPSLA by C3 in the
late 90s, the one that spurred interest in XP, emphasized testing as well.

Where are you getting all of these odd ideas?


Michael Feathers
author, Working Effectively with Legacy Code (Prentice Hall 2005)
www.objectmentor.com

Michael Feathers

unread,
Jul 7, 2005, 8:59:43 AM7/7/05
to

And here's where your analogy crumples into a ball and falls down. In
software, we can try to "roll the bed down the hall" whenever we want
to. We can have an automated test that attempts that even before there
is a hallway. Running the test is free, and we can always see whether
we are done or not. In this sense, we have an advantage over many other
disciplines, owing mainly to the fact that we have very malleable
material and we have very good ways of working with it.

>> > These are true stories. Shouldn't all of the architects of these
>> > buildings have expected change to happen as well. Same with the
>> > builders. maybe build with everything loose so that it can be
>> > reassembled when the next minor detail arises? Aren't we being told
>> > this is the way things work?

It works that way, if your material allows it.

>>Well, the fact is software is malleable. In fact it is too malleable.
>>It isn't hard to change software at all. All you have to do is type a
>>couple of characters in any program and you can break it. Because that
>>is the way that software is, we need tests to give it backbone.
>
> It's not that malleable. Once in production software is very hard to
> change for all kinds of political reasons.

Not necessarily. It depends on how confidently you can make changes
and what your track record is. There are teams that incorporate new
features and deploy every day.

> In fact a big problem for architects and designers is having
> programmers undermine design activity with too much dog and pony
> prototyping. Bad ideas become adopted before any discussion of the
> larger picture can be formulated.

Programming is a design activity. There are no bricklayers in software
development.

> Of course you need tests. We aren't a bunch of ninnies here.
>
>> > Even carpenters measure before they cut. Yet, in computer science we
>> > are being told that we should operate as though we are all alchoholics
>> > and take things one day at a time.
>>
>>The problem is: misunderstanding the material you are working with.
>>Code is not wood or concrete.
>
>
> But spent resources are. Nobody fixes anything for free. And bad code
> applied to millions of daily transactions can cost companies or
> customers lots and lots of money when wrong.

So true. That's why we test continuously and adopt practices which
decrease the chance of defects. You "roll the bed down the hall" before
production, many more times than you imagine.

> Testing is tricky stuff and complex logic errors don't get discussed
> when daily iterations are the norm because there is no time.

Why isn't there? I think this is another case where you misunderstand
agile processes. Read up on "The Planning Game" in XP when you get a
chance. Plans are recalibrated continuously to allow quality work.

> Design and OOD are not code or code design.

Yes, they are.
http://www.developerdotstar.com/mag/articles/reeves_design_main.html

krasicki

unread,
Jul 7, 2005, 11:32:57 AM7/7/05
to

I will infer that you're saying that more planning and preparation
time might have comprehensively accumulated and accounted for these
missing design considerations. In other words, a heavier-weight
methodology could have avoided the headaches, all things being equal.

>
> > And the customer
> > surely showed up with a glowing smile watching the obvious progress.
> > And progress happened every day.
> >
> > The elevator worked fine. Up. Down. Ring, ring. All positive
> > feedback.
> >
> > The workers sweated. The execs wore suits and went golfing.
> >
> >
> >> > These are true stories. Shouldn't all of the architects of these
> >> > buildings have expected change to happen as well. Same with the
> >> > builders. maybe build with everything loose so that it can be
> >> > reassembled when the next minor detail arises? Aren't we being told
> >> > this is the way things work?
> >>
> >>Well, the fact is software is malleable. In fact it is too malleable.
> >>It isn't hard to change software at all. All you have to do is type a
> >>couple of characters in any program and you can break it. Because that
> >>is the way that software is, we need tests to give it backbone.
> >
> >
> > It's not that malleable. Once in production software is very hard to
> > change for all kinds of political reasons.
> >
>
> It shouldn't go into production with defects that are gonna
> cost people money, at least without an enforceable plan
> to get the defects out, upfront.
>
> Once in production, somebody has to make the decisions of
> when, how and why to deploy upgrades.

Well, my point is that if something goes through the XP methodology,
with all of the hot air and hubris about having already performed a
bazillion tests on it, but defects still exist, who will know and how
could they prove it?

Would any of us argue for long with these people? I lose heart just
trying to get a straight answer out of them in something as
straightforward as a newsgroup. Haven't you heard, XP is absolutely
right because they've tested everything every which way.

Once in commercial production, software that is mission critical is not
easily changed because, as someone said elsewhere, tweaking the wrong
bit could cause system calamities. Reintroducing code in these
environments could take six months to a year of expensive rework or
total shutdown. It's no longer a question of tweaking code but of
questioning all assumptions. With BDUF, you can isolate the problem
and hypothetically run the system without software, trying to understand
the overall implications.

>
> > In fact a big problem for architects and designers is having
> > programmers undermine design activity with too much dog and pony
> > prototyping.
>
> How is that possible? Other than time being wasted, prototyping
> is harmless. Prototypes should not even be attempted until
> there's a specific question or suite of questions they are
> to answer. If it's a sandboxed protpype, just to let the
> programmers play, then chunk it, or put it away. You
> still need specific deliverables from the prototyping.

Prototyping is political dynamite in many organizations. Software
designers and architects are usually discussing issues that are not
near and dear to the hearts of the local application domain princess
who wants to have someone to talk to. Enter, any number of local
characters who begin prototyping their idea of what should happen.
Before long the architects and designers are entangled in favorite
color discussions and presentation fashion shows.

Add to this mix, any number of programmers who believe they know better
than the people always talking about abstract ideas and you enter the
realm of random, esoteric, and uncontrollable development.

>
> > Bad ideas become adopted before any discussion of the
> > larger picture can be formulated.
> >
>
> Then they have to get rooted out and killed, or at least
> triaged and weighed for "badness". Bad ideas that don't get
> shot are a sign of complacency, not methodology.

There is no budget to root things out and bad software is often
sponsored internally by incompetent people who control your paycheck.
XP adds authenticity to the problems involved.

Because software development is so tightly coupled to the individuals,
it is no longer a matter of correcting or eliminating problematic code.
The XP crowd has a vociferous ego stake in what's being done. They've
got stories and tests and feedback loops that will insist it's right.
They all feel good about it. And there is no impartial design
document you can point to to say otherwise because the whole ball of
wax is personal, intimate, immediate, and a treadmill of exhaustion for
everyone involved.

>
> > Of course you need tests. We aren't a bunch of ninnies here.
> >
> >
> >> > Even carpenters measure before they cut. Yet, in computer science we
> >> > are being told that we should operate as though we are all alchoholics
> >> > and take things one day at a time.
> >>
> >>The problem is: misunderstanding the material you are working with.
> >>Code is not wood or concrete.
> >
> >
> > But spent resources are. Nobody fixes anything for free. And bad code
> > applied to millions of daily transactions can cost companies or
> > customers lots and lots of money when wrong.
> >
>
> So somebody has to do a cost-benefeit analysis of when to do what.
> Good code isn't free, either. This is logistics, not particularly
> even software logistics.

The key term is "has to".

>
> > Testing is tricky stuff and complex logic errors don't get discussed
> > when daily iterations are the norm because there is no time.
> >
> > Design and OOD are not code or code design.
> >
>
> You can't fix culture with tools, in other words. Mostly, yes :)
>

Thanks Les. Arguing XP is as thankless a task as I've ever
encountered. The proponents swarm on critics like hornets, so I try to
avoid this stuff more often than not. I sincerely was trying to give
the OP a fair assessment of what's out there, but this quagmire blocks
all light from shining through.

krasicki

unread,
Jul 7, 2005, 12:16:44 PM7/7/05
to

Daniel,

You will wait forever. Every valid question you raise will be ignored
and you will exhaust yourself disputing one nonsensical assertion about
XP after another. You need a strong sense of humor engaging this
crowd.

I do wish you luck.

Daniel Parker

unread,
Jul 7, 2005, 1:40:11 PM7/7/05
to
krasicki wrote:
> Robert C. Martin wrote:
> > e.g. www.fitnesse.org
>
> You spelled fitness wrong.
>

On the contrary ...

"Thou chang'd and selfe-couerd thing, for shame
Be-monster not thy feature, wer't my fitnesse"

Shakespeare, King Lear

krasicki

unread,
Jul 7, 2005, 3:44:26 PM7/7/05
to

So Shakespeare needed a spelling checker as well...

Robert C. Martin

unread,
Jul 7, 2005, 7:11:03 PM7/7/05
to
On 6 Jul 2005 23:55:18 -0700, "frebe" <fredrik_b...@passagen.se>
wrote:

>> if we are developing a system that must survive through years of
>> changing requirements, then coupling the GUI to the Schema is suicide;
>> and the time you might save in so doing is a false economy.
>
>Why would it be suicide to have a coupling between the GUI and database
>schema?

Consider the following pseudocode:

    Item maxItem = new Item(0, "junk");
    int totalAmt = 0;
    foreach item in product {
        print item.name;
        print item.amount;
        maxItem = max(maxItem, item);
        totalAmt += item.amount;
    }

The calculation of 'maxItem' and 'totalAmt' is mixed in with the
printing of the items. Other parts of the program eventually use
totalAmt and maxItem for other calculations.

Later, the folks who use the report decide that they don't want to see
the ZIGGY item printed on the report. This item is mostly just a
placeholder and just confuses the report. This is strictly a cosmetic
issue and should not affect any of the business rules.

A programmer makes the following change:

    Item maxItem = new Item(0, "junk");
    int totalAmt = 0;
    foreach item in product {
        if (item.name == "ZIGGY")
            continue;
        print item.name;
        print item.amount;
        maxItem = max(maxItem, item);
        totalAmt += item.amount;
    }

This fixes the report as required, but now the totalAmt and maxItem
variables are silently incorrect for any product that happens to
include a ZIGGY item.
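The separation being argued for can be sketched as follows; this is an illustration in Python with invented names, not Martin's own code. The business rules compute over every item, while the cosmetic ZIGGY filter lives only in the presentation function, so changing the report cannot silently change the numbers:

```python
def totals(items):
    # Business rule: totals and max include every item, cosmetic
    # filtering notwithstanding.
    total = sum(i["amount"] for i in items)
    max_item = max(items, key=lambda i: i["amount"])
    return total, max_item

def report_lines(items):
    # Presentation rule lives here only: hide the ZIGGY placeholder.
    return [f"{i['name']} {i['amount']}"
            for i in items if i["name"] != "ZIGGY"]
```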

Robert C. Martin

unread,
Jul 7, 2005, 7:11:57 PM7/7/05
to
On 6 Jul 2005 23:43:46 -0700, "frebe" <fredrik_b...@passagen.se>
wrote:

>Adding a new column to the database would in no way crash the GUI.

Use your imagination!

Robert C. Martin

unread,
Jul 7, 2005, 7:28:16 PM7/7/05
to
On 7 Jul 2005 02:33:50 -0700, hans...@hotmail.com wrote:

>Robert C. Martin wrote:
>
>> Absolutely! Though this has little to do with OO. I am much more
>> interested in the work of Ward Cunningham and Rick Mugridge in
>> expressing requirements as Tests. (See www.fitnesse.org)
>
>By testing, I'm not sure you are referring to validation or
>verification.

Neither. I'm referring to *specification*. We specify the
requirements of a system by writing tests that pass if those
requirements are implemented correctly.

>I have difficulty in understanding
>how all requirements can be expressed as 'tests'.

Any requirement that cannot be expressed as a test, is not really a
requirement.
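The idea can be sketched outside any particular tool; the discount rule and numbers below are invented purely for illustration. The requirement "orders over 100 get a 5% discount" is stated as executable checks that pass if and only if it is implemented correctly:

```python
def discounted_price(amount):
    # Implementation under specification: 5% off above 100.
    return amount * 0.95 if amount > 100 else amount

# The requirement *is* these checks, not a prose paraphrase of them.
assert discounted_price(200) == 190.0
assert discounted_price(100) == 100
```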

>> I disagree. I think there is a lot of value in precise definitions
>> that can act as a metric against which to measure software designs.
>> Given my definition, you can quickly ascertain whether a particular
>> design is OO or not. Moreover, there are many benefits that are
>> associated with this structure. Dependency Inversion is the primary
>> mechanism behind independently deployable binary components.
>
>I don't disagree with the value of precise definitions and metrics.
>However, I do disagree with hijacking the term OO with a meaning that
>is very counter intuitive - at least to me. There are strong
>relationships between concepts and objects and both concepts and
>objects have lots of interesting aspects. You have reduced 'OO' to
>mean something very specific about how a programming language models
>one single operation on concepts (generalization) and how that
>operation should be used.

Yes, I have done that. I think that is much closer to the original
inception of OO. Nygaard, Dahl, et al. noticed that the block
structure of Algol was constrained. The function stack frame was on
the stack, and therefore was destroyed when the initializing function
returned. They realized they could move this stack frame to the heap,
and keep it alive even after the function returned. Voila! The
object was born in the Simula language.

OO was born in a coding environment. OO was about ways to make code
more expressive and have better structure. Indeed, Dahl described
their insight in the 1972 book "Structured Programming" which was all
about coding structures.

>In the majority of cases where I have seen this (code mess) happen, it
>has been because of a poor understanding of the problem being solved.
>It has rarely been because of poor programming practices.

There is a difference between a requirements mess, and a code mess.
Messy requirements lead to systems that are difficult to use. Messy
code leads to systems that are difficult to change for very technical
reasons. e.g. they take forever to build, they break in strange and
unexpected places when changes are made, changes cannot be made in one
place, but must be made in many places throughout the code. There is
massive duplication and interdependency in the code.

Code messes are an incredible problem that have brought teams and
companies to their knees. The developers eventually militate for "the
grand redesign in the sky". This is not a redesign at the
requirements level, it is a redesign at the code level. Such grand
redesigns almost always fail spectacularly.

>I don't
>believe it is possible to code effective solutions without having a
>clear understanding of the problem at the level of the 'business'.

Agreed. However, this has nothing to do with OO. OO is not a scheme
for better understanding business requirements IMHO.

>It is a mystery to me that most developers believe they can code a
>solution to a problem when they don't understand the problem.

Agreed, it is folly.

>DIP,
>polymorphism (OO?) and any other technical wizardry will not help in
>avoiding messy code when the problem is non-trivial.

DIP will not help you understand the problem better; but DIP *will*
help you structure the code better once you *do* understand the
problem.
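A minimal sketch of what DIP buys structurally, with invented names: the high-level policy owns an abstraction, and concrete outputs depend on it rather than the other way around, so the policy can be exercised and changed without touching any concrete report format:

```python
from abc import ABC, abstractmethod

class ReportSink(ABC):
    # Abstraction owned by the policy side.
    @abstractmethod
    def line(self, text): ...

class InMemorySink(ReportSink):
    # One concrete detail; a printer or file sink would be others.
    def __init__(self):
        self.lines = []
    def line(self, text):
        self.lines.append(text)

def summarize(items, sink):
    # High-level policy: compute the total, delegate presentation.
    for i in items:
        sink.line(f"{i['name']}: {i['amount']}")
    return sum(i["amount"] for i in items)
```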

>Only an
>understanding of the problem or an understanding of how to deal with
>the class of problems that the problem belongs to, will avoid code
>disasters.

It also requires good coding skills and design disciplines. It is
entirely possible for programmers to make a horrible mess in the code,
even when they understand the problem perfectly.


>
>In summary: I do not believe poor coding practices are the main problem
>in IT development.

Poor coding practices are a major issue. They are not the only issue.

>Knowing, understanding and describing problems are
>at the heart of developing good software.

Understanding the problem is necessary, but not sufficient.

Robert C. Martin

unread,
Jul 7, 2005, 7:29:43 PM7/7/05
to
On Wed, 06 Jul 2005 16:20:44 GMT, "H. S. Lahman"
<h.la...@verizon.net> wrote:

>You should try attending a translation model review involving
>experienced developers. Whether there is implementation pollution
>present is usually quite clear. Authors may have blind spots as
>individuals, but they are quick to recognize the problem when it is
>pointed out. The tricky part lies is eliminating implementation
>pollution, not recognizing it.

I could say the same about a good design review, or a good pair
programming session. Individual authors may miss certain
partitionings that would better separate implementation from policy;
but the team is pretty good at getting it right.

Phlip

unread,
Jul 7, 2005, 7:51:16 PM7/7/05
to
Shakespeare wrote:

> "Thou chang'd and selfe-couerd thing, for shame
> Be-monster not thy feature, wer't my fitnesse"

Be-monster not thy _feature_??

You changed and self-covered thing, for shame
be-monster not your _feature_, were it my fitnesse??

Could someone put that on http://fitnesse.org 's homepage?!

--
Phlip
http://www.c2.com/cgi/wiki?ZeekLand


topmind

unread,
Jul 7, 2005, 8:03:41 PM7/7/05
to


But the flip-side could also happen. It may be that the
requirements are for Ziggy to also be excluded from the
totals. We cannot know that *in advance*. If we separate
them, then we may forget to make the same filtering
to both loops. I have seen such issues play out both ways.

It is all back to *probability* again, just like the
last topic. You cannot say "always" in this example.
You cannot say that any such loop filtering change
will always or never be applied to the other stuff.
Summing and displaying may or may not be related
in any given filtering change. It is situational, not
absolute.

Coupling seems coupled to probability (pun intended).
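The flip-side case above can be sketched too (Python, names invented): when the requirement really is "exclude ZIGGY everywhere," factoring the filter into one shared predicate keeps the report and the totals from drifting apart:

```python
def visible(item):
    # One shared predicate: the exclusion rule is stated exactly once.
    return item["name"] != "ZIGGY"

def total_amount(items):
    # Totals apply the same rule as the report.
    return sum(i["amount"] for i in items if visible(i))

def report_lines(items):
    return [i["name"] for i in items if visible(i)]
```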

>
>
> -----
> Robert C. Martin (Uncle Bob) | email: uncl...@objectmentor.com

-T-
