I think this is an excellent corollary, particularly for people
working in difficult corporate environments. And it works for people
like Uncle Bob as well as an inexperienced yet passionate young
developer.
How do Boy Scouts define what it means for a campground to be "cleaner"?
Lance,
What you're describing seems to be what the C2 wiki calls Ravioli
Code: http://c2.com/cgi/wiki?RavioliCode
This can be a good thing as long as we're not abiding by arbitrary
line length limits and we're conscious of the fact that in some
domains breaking things into very small methods makes it harder for
others to understand your intentions. Another situation where very
small methods make code harder to understand is when you're
implementing well-known algorithms or processes.
I tend to use the heuristic: make things as small as possible by
continually extracting meaningful chunks and giving them
intention-revealing names. If making something smaller obscures my
intention (for instance because I can't think of a good name for the
small chunk) I'll leave it alone. Sometimes that may involve leaving
behind a very long comment explaining the issues. Sometimes the act of
writing that comment will clarify my intention and I can actually
extract the chunk. When that doesn't happen I'm quite willing to trust
that someone in my team will see what I've missed and improve on my
work.
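For what it's worth, a minimal Python sketch of that heuristic; the order-processing names here are hypothetical, not from anyone's real code:

```python
# The inline condition below is legal but opaque, so we extract it
# behind an intention-revealing name. If no good name had come to mind,
# per the heuristic above, we'd leave it inline with a comment instead.

def is_eligible_for_discount(order):
    """Extracted chunk: the name states the intent the condition obscured."""
    return order["total"] > 100 and order["customer_years"] >= 2

def final_price(order):
    if is_eligible_for_discount(order):
        return order["total"] * 0.9
    return order["total"]

print(final_price({"total": 200, "customer_years": 3}))  # 180.0
```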
I'm also quite happy with the notion that a team converges on a sense
of "better" that makes sense for them. I suppose that's because I'm a
little suspicious of ideals like "cleaner" that don't acknowledge that
there are trade-offs involved. For instance in Python you can
sometimes get into situations where the overhead of polymorphic method
dispatch makes your code unacceptably slow and you have to do
something uglier instead. Or the overhead of object creation means
your code can't handle the dataset you have and you're forced to use
things like named tuples which obscure your intent but radically
reduce your memory footprint.
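A rough illustration of that second trade-off; the exact byte counts vary by interpreter version, and `Point` is a made-up example:

```python
import sys
from collections import namedtuple

class PointObj:
    """A plain class pays for a per-instance __dict__."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

# The namedtuple stores its fields in a tuple, with no per-instance dict.
PointNT = namedtuple("PointNT", ["x", "y"])

p_obj = PointObj(1, 2)
p_nt = PointNT(1, 2)

# Shallow sizes only; still enough to show the relative overhead.
obj_size = sys.getsizeof(p_obj) + sys.getsizeof(p_obj.__dict__)
nt_size = sys.getsizeof(p_nt)
print(obj_size, nt_size)
```

(Defining `__slots__` on the class is another way to get most of the same saving while keeping attribute intent clearer.)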
I'd like to hear more from people on situations where they've had to
make these kinds of trade-offs or discovered that their
rules/metarules were actually heuristics.
it could be curious fun if there were a poll set up with a bunch of
multiple choice questions along these lines. not that i'm likely to
get around to doing that; i'm smart enough to know i won't, since i
have little free time :-}
sincerely.
I think the way from bad code to better code is no straight line but a big territory with many paths, pitfalls, canyons,
shortcuts, etc.
But resonating with the values above, I'd like to cite the values that Kent Beck based his "Implementation Patterns" on
(although I'm very ambivalent about the whole book).
Code quality is measured in (in this order): understandability/readability, simplicity, flexibility.
That makes any code that improves in one of the areas above (preferably in the first two) better than before.
Michael
--
Michael Hunger
Independent Consultant
Web: http://www.jexp.de
Email: michael...@jexp.de
Enthusiastic Evangelist for Better Software Development
Don't stop where you are: http://creating.passionate-developers.org
Sign the Software Craftsmanship Manifesto at: http://manifesto.softwarecraftsmanship.org
We support Software Engineering Radio (http://se-radio.net)
I personally prefer a very concise coding style with guard clauses which return early, single level of abstraction
within a method. Almost no nested blocks (returning as soon as it becomes clear this path is done), to keep the ballast
of the mental model as small as possible when understanding code (that is also quite nicely explained in Implementation
patterns), extracting methods with descriptive names rather than writing comments, narrowing the scope of variables and
methods as much as possible, having final method params, local and instance variables, etc. So keeping it as concise and
"simple (for me)" as possible. I'm also very fond of refactoring code as I have the best tools available for the job
(experience and IntelliJ).
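As a hedged illustration of that style (a hypothetical shipping-cost example, not from a real codebase):

```python
# Guard clauses returning early: each path exits as soon as its fate is
# clear, so there are almost no nested blocks to keep in your head, and
# the method stays at a single level of abstraction.

def shipping_cost(order):
    if order is None:
        raise ValueError("no order")
    if order["total"] >= 100:   # guard: free shipping, done
        return 0.0
    if order["express"]:        # guard: express flat rate, done
        return 15.0
    return 5.0                  # the ordinary case, at top level
```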
He argued that he moved away from this concise style some years ago as he evolved his development skills. The problem is
when you're writing this kind of code, many developers have a harder time understanding it, and even worse when copying
it they make a lot of mistakes and produce crap (e.g. by leaving/forgetting out important alternative paths). So he
simplified his coding style by:
* declare every local variable you'd like to use upfront
* have only one exit point from a method with a commonly named variable (result)
* each if needs an else / or comment // no else, why
* no final keywords as it clutters the code
His point is to find the common denominator for a team where the majority of people can understand, maintain and extend
the code and are able to keep developing sound code with a single style.
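Applied to a small hypothetical example, those rules might produce something like this (again a sketch, not anyone's actual code):

```python
# The single-exit style described above: locals declared up front, one
# commonly named result variable, and each if paired with an else (or a
# comment saying why there is none). Hypothetical shipping-cost example.

def shipping_cost(order):
    result = 5.0               # declared up front: default ordinary rate
    if order["total"] >= 100:
        result = 0.0           # free shipping
    else:
        if order["express"]:
            result = 15.0      # express flat rate
        # no else: the default rate was already assigned above
    return result              # single exit point
```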
I can understand his motivation and the problems it solves, and have also sometimes encountered people who complained
about my many little methods, classes and scopes (they would rather have big classes and big methods where they can see
everything at a single glance).
What do you all think about that?
Michael
A while ago I found an interesting approach on Code Reviews called "tick the code" at
http://www.tick-the-code.com/en/index.php
It is a quite fast but imho limited application of fixed rules (one at a time) to a printout of the code to be reviewed.
There I also found the "single exit point" and "no if without else" rules which made me discuss that with the author and
one of my friends and the pragprog mailing list. It seemed I was quite alone with my approach back then :)
But I have found that minimizing the mental ballast when reading code (not having to keep in mind all the open scopes,
paths, used or to-be-used variables, conditions that were evaluated for one path, etc.) is more
beneficial than the consistency of having the same structure over and over again.
http://jexp.de/index.php?n=Info.Demotivators
We print them out and hang them in the office to remind people to do better and to give us a pointer when finding stuff
like that again - just point to the poster and either say "remember?" or "read this".
HTH
Michael
* smaller is better when possible
* less nested is better when possible
* names are important and really hard to get right (a very good thing
for pairing or code review)
* i prefer one return over test-return-test-return style, since it
makes maintenance easier
* languages that don't support "let" or "where" really really irk me
because they have lost a really nice optional middle ground between
one-long-method vs. lots-of-little-methods.
* i would like to see user defined types used more, vs. primitives
since i feel the latter are insufficiently constraining / expressive
(obviously i'm talking about typed languages :-)
* i prefer typed languages that have type inference
* inheritance is, generally, evil
* i prefer pure fp over other things, but realize (see CTM by Van Roy
+ Haridi) that state change will be inevitable, and so things like OO
or STM or Agents will be needed -- but i think one should use
shared-mutable-state as a last worst resort in some sense (not that
STM etc. are perfect themselves by any means).
* on the whole i don't like whitespace sensitive languages for bigger
code if only because my IDE can't automatically reformat it
* c++ is evil ;-)
* java sucks ;-)
* ok, ok, really, "it depends".
p.s.: my on-going brain dump of concerns:
http://therightabstractions.wikispaces.com, in particular
http://therightabstractions.wikispaces.com/ToolsAndProcess.
sincerely.
hysterical. excellent!
I am new to this list. I found this discussion interesting so far,
just wanted to add my 2 cents.
First I want to say that generally I agree with Lance in that I think
methods should be small, as small as possible. Maybe I should add "but
no smaller". I've found that some developers are often shocked when I
tell them that I think the ideal for a method is about five lines,
including the declaration.
However, I note that I say "ideal". Ideals are human impossibilities.
No human is ideal. I don't really want to get bogged down in that.
Suffice to say that holding humans (and their artefacts) up to some
impossible ideal only leads one to reject the failing humans. So, is
this ideal impossible? In some cases. But that does not mean we
shouldn't strive to get there. I can't be a perfect human in any
respect but I can try to be a *better* one. And so can my code.
But enough of the motherhood and generalities;
On 11/04/2009, at 07:50 , Lance Walton wrote:
> Let's take the first one on that list. Does everybody here always
> remove all duplicate code? Or is it sometimes 'pragmatic' or a sign of
> superior craftsmanship to leave some duplication?
Yes, I would always strive to remove duplicate code where I found it.
The only thing that would stop me is if I was ordered not to by some
superior force, e.g. as a consultant on a one-week or other similarly
short engagement to get something specific done, and the client told
me not to bother (assuming they found out before I did anything). I
would then spend the time I would have spent doing the fix writing the
client an email asking them to confirm in writing that they don't want
me to attend to any major code-quality issues.
> Where's the boundary? All knobs to 10, or does a craftsman know that 9
> is best sometimes? Do we all have the same definition of 'sometimes'?
In a previous life - in between electrical engineering and starting
computer science - I was an audio engineer. I still have some interest
in that area. I can fully extend the analogy.
I can certainly tell you that "10" is definitely a "bad setting". At
the end of the knobs there lies non-linearity. Unless you deliberately
choose it as an aesthetic choice (which is perfectly valid, even good
old analogue tape distortion can be very sweet, on some types of
music). Volume knobs usually shouldn't go past about 70% in
a "mainstream" application. I try never to set them beyond 50%. For
equalisation ... "it is better to cut than boost". In fact, going as
far back toward the original source as you can is best. Bad guitar
sound? Go into the recording studio. Listen to the guitar. Does it
sound bad in there? Move the microphones, change the microphones,
change the guitar, investigate the guitar's signal chain to the
amplifier, change the amplifier, move everything to a new room or
maybe try the stairwell, consider whether you need to change the
player (defer to the producer), and only then do I look at the mixing
desk, the outboard, and other signal processing tools. People think
that since I'm sitting behind the tracking desk, it should be my first
area of responsibility. Yet it is my last!
In some ways, and I'm going to make a probably unsustainable analogy
here, when I'm sitting in front of my IDE looking at adding
functionality (implementing a story) I'm trying to see what "further
back" in the signal chain needs to be changed, what has to be taken
away, and so on before I even begin to start inserting my new code
into the chain. If I don't find anything that immediately jumps out at
me, then I figure I'm ready to start writing the unit test for my new
code. In many ways this process is part of the "what existing unit
tests are there?" and "where does my new unit test belong?"
thinking time that goes into my design decision-making, before I even
start typing on the keyboard.
So what is the definition of "sometimes"? Well maybe it's part of the
ineffable art of craft, that we all learn when that "sometimes"
occurs? Certainly even with all the "rules" of audio engineering I
stated above there are many creative engineers (much better than me)
who break all those rules and still manage to make beautiful sounding
recordings, because in some part it is about the quality of your 'ear'
more than anything else. But I guess that is not being helpful in this
particular context.
> I'm not trying to be difficult. I am not looking for prescriptions; I
> know we have to balance some stuff (but I do not believe that
> everything is subject to context).
Perhaps in this case it's better if we enumerate what we believe those
things that are not contextual to be. Certainly I would say that cut
and paste code duplicates are one such context-free occasion. Also I
would add, any method longer than "a screen" must be made shorter than
"a screen" and ideally, five or six lines if possible. For me it's the
screenful rule (I take that to be 15 to 20 lines) that has the smell
of Inviolate Principle about it. Shorter than that is better, but only
if it *is* "better", i.e. increases code readability /
understandability.
I've written way too much for such little signal of doubtful quality
that I will terminate my ramblings here.
scot.
It's Pareto again, as so often.
Cheers
Michael
--
You may be well aware of the trade-offs but our discussions here seem
prone to wandering into discussions of ideals and rules rather than
heuristics. In that spirit here are some of my heuristics that I use
when working with object oriented and garbage collected languages:
- Guard blocks
- Multiple returns
- Default assumption that everything is immutable until proven otherwise
- Domain modelling with an understanding that one's understanding of
the domain will change and that certain parts of the domain will
'attract' features and we'll have to watch carefully to make sure they
don't become God-classes just because the ubiquitous language we're
using places them at the heart of everything.
- Using DDD but watching out for the point when an entity should
become an aggregate
- Solving problems with data structures rather than conditional logic
(this can range from using polymorphism rather than conditions all the
way to using a priority deque in order to reduce apparent complexity
in a system. I say apparent complexity because in reality we've
just moved the complexity into the implementation of the data
structure)
- Eliminate cyclic dependencies
- Use Whole Values (from Ward Cunningham's Checks pattern language)
- Treat domain objects as state machines so that the set of legitimate
transitions is carefully controlled/tested/restricted
- Use constructors rather than Spring/Pico/Guice/etc.
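As a hedged sketch of two of those heuristics together - a transition table as data rather than conditional logic, and a domain object as a state machine with controlled transitions - using a made-up Order:

```python
# The legal transitions live in a table, so the state machine is data
# you can inspect and test, not an if/elif chain scattered through
# methods. (Order and its states are hypothetical examples.)

LEGAL_TRANSITIONS = {
    "new": {"paid", "cancelled"},
    "paid": {"shipped", "refunded"},
    "shipped": {"delivered"},
}

class Order:
    def __init__(self):
        self.state = "new"

    def transition(self, new_state):
        # States absent from the table (e.g. "delivered") are terminal.
        if new_state not in LEGAL_TRANSITIONS.get(self.state, set()):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

order = Order()
order.transition("paid")
order.transition("shipped")
print(order.state)  # shipped
```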
Most of my heuristics are derived from a combination of Beck's two
Smalltalk books, Arthur Riel's Object-Oriented Design Heuristics,
Meilir Page-Jones, and the Zen of Python.
I don't buy the distinction between theory and practice. In
programming, there are statements made at one level of abstraction
that need refining at a more concrete level, but this is not a
difference between theory and practice.
For example,
Now some people would say something like 'in theory, I remove all
duplication. In practice the clutter that Java introduces would
prevent me from doing so here.' But that simply avoids a learning
opportunity.
> What would happen if we don't try to find a better metaphor and instead just
> state what we think 'cleaner' code means?
Didn't someone do this a decade ago? I find I get great results from
1. Passes tests
2. Minimizes duplication
3. Maximizes clarity
4. Is small
--
J. B. (Joe) Rainsberger :: http://www.jbrains.ca
Diaspar Software Services :: http://www.diasparsoftware.com
Author, JUnit Recipes
2005 Gordon Pask Award for contribution to Agile practice
Register for Agile 2009 at http://www.agileregistration.org
> However, I've found that other people I work with follow rules like
> 'if a method is only used by one other method, inline it', which I find
> astonishing. Clearly this creates tensions because their idea of
> 'cleaner' is not the same as mine.
We don't need to define good design.
http://www.jbrains.ca/permalink/220
My goal, as a developer, is to write the absolute best code I can, at
the time. William Strunk of the /Elements of Style/ states:
"Vigorous writing is concise. A sentence should contain no
unnecessary words, a paragraph no unnecessary sentences, for the same
reason that a drawing should have no unnecessary lines and a machine
no unnecessary parts. This requires not that the writer make all
sentences short or avoid all detail and treat subjects only in
outline, but that every word tell."
Our choice of words should ultimately depend on our audience. I'm not
going to write identically for elementary school students and
university graduates. To substitute unnecessary syntax or structure
when a more telling option is available seems wasteful.
To curtly describe my first reaction, the above sounds like dumbing
down the curriculum because the students don't want to take the effort
to learn. This might be the correct thing to do, for him. If an
employer can't keep employees long enough to adequately train them, or
doesn't hire those who will take the effort to learn, then it may be
his best option. But if that were the case, I'd hope the company
would evaluate its tenets and practices and make necessary changes.
I'm a developer, and developing is a craft. I'm also a writer. I
write code. Unless my audience deserves otherwise, I'm going to use
the best and most descriptive syntax, structure, adjectives and
adverbs I can muster. I want my kids to ask questions. I want other
developers to ask questions. I want to learn.
--Kaleb
> My view is that a professional simply does the best job possible in the
> environment where they work.
> In any case, secessionism is the wrong attitude. Our goal ought to be
> "change from within" not "revolution".
huh, i don't see somebody leaving to work at a different (be it bigger
or smaller or whatever) place as anything inherently wrong. it rather
shocks me that anybody could consider it appropriate to tell another
person where they should or shouldn't stay for their job.
i found the wording of the original post quite strange overall!
imhumbleo, revolution would be walking into the CEO's office and
pointing a gun at their head; leaving for something else that would
simply be competition is not revolution. it might be 'secessionism' of
some form, but i don't see it as mattering.
sincerely.
I created a "boutique shop" within Obtiva's existing business of
consulting at corporate IT shops. We rotate people into our Software
Studio (aka "boutique shop") and out to larger corporate environments.
This allows us to bring up apprentices in a safe, friendly
environment and then give them the opportunity to experience corporate
environments. In this way, we get to grow individuals and also
change organizations.
I don't view what we do as secessionist. We are still actively
engaged in several corporate IT consulting gigs and have no plans to
pull out of that business.
Best,
Dave Hoover
//obtiva: Agility applied. Software delivered.
At the risk of going off-topic, I'll point out that I have had similar
thoughts about working with a tough, legacy codebase. Which is the
greater craftsman: the one who can build a feature into a non-existent
codebase, or the one who can build a feature into an extremely chaotic
codebase? I'm reminded of the houses that are built into a
mountainside vs. houses that are built on the prairie. There, now
I've officially taken us off-topic. :)
To me the OP was about some of the elitist, or close to it, comments
about software craftsmanship. There have been numerous 'must be our
way' and 'your way can't work' posts on this list, in blogs and on
twitter. And if that is not what it was about I think it should have
been.
For me craftsmanship boils down to improving myself and the community around me.
--
David
blog: http://www.traceback.org
twitter: http://twitter.com/dstanek
there's a whole bunch of people who work on / have experience with
mapping Lean to software. i'm not one of them but... my understanding
is that "cost" is viewed appropriately as e.g. technical debt or
what-have-you, not so much just $ where they'd recommend "lay off
everybody!"
sincerely.
I agree that now is a great opportunity to introduce change into
existing environments because people are feeling pain. One of the
ways that Software Craftsmanship proposes to cut cost is by using
small, tight-knit teams of generalists rather than larger teams of
specialists. We've found success with this approach at Obtiva with
our teams of generalists.
50 Years of Stupid Grammar Advice -By GEOFFREY K. PULLUM
http://chronicle.com/free/v55/i32/32b01501.htm
Michael
I'd like to cite another "elements of style": that of Christopher Alexander, where he lists 15 elements of architectural
style. Interestingly, many of them can be applied to code as well.
(http://www.weepingash.co.uk/tools/tool02.html)
My favorites:
14. Simplicity & Inner Calm
Use only essentials; avoid extraneous elements.
7. Local Symmetries
Organic, small-scale symmetry works better than precise, overall symmetry.
12. Echoes
Similarities should repeat throughout a design.
Michael
Here are all of them:
1. Levels of Scale
A balanced range of sizes is pleasing and beautiful.
2. Strong Centers
Good design offers areas of focus or weight.
3. Boundaries
Outlines focus attention on the center.
4. Alternating Repetition
Repeating elements creates a sense of order and harmony.
5. Positive Space
The background should reinforce rather than detract from the center.
6. Good Shape
Simple forms create an intense, powerful center.
7. Local Symmetries
Organic, small-scale symmetry works better than precise, overall symmetry.
8. Deep Interlock & Ambiguity
Looping, connected elements promote unity and grace.
9. Contrast
Unity is achieved with visible opposites.
10. Gradients
The proportional use of space and pattern creates harmony.
11. Roughness
Texture and imperfections convey uniqueness and life.
12. Echoes
Similarities should repeat throughout a design.
13. The Void
Empty spaces offer calm and contrast.
14. Simplicity & Inner Calm
Use only essentials; avoid extraneous elements.
15. Not-Separateness
Designs should be connected and complementary, not egocentric and isolated.
Is this perhaps why some believe that there's an air of elitism in this
group? Corporate IT isn't "hostile", it's just dealing (incorrectly)
with the fact that IT itself is a support unit and not a fundamental
driver of the business. It may be an important support unit that can
have direct impacts on revenues, but it's a support unit nonetheless.
In a corporate environment, we need to demonstrate that the ROI of a
high-quality "craft" approach to building software is achieved
relatively quickly. We need to make it so clearly evident that it's a
better approach than "just get it done, and we'll clean it up later",
that the people at or near the top of the corporate food chain buy in.
They don't give a flying you-know-what about a "craftsmanship movement"
- they do, however, give one about better project throughput and lower
cost (as opposed to "price"). We have to make it real for *those*
people, not just for us.
--
Dave Rooney
Mayford Technologies
"Helping you become AGILE... to SURVIVE and THRIVE!"
http://www.mayford.ca
http://practicalagility.blogspot.com
Twitter: daverooneyca
Paul
I agree. How do you propose to do that?
> They don't give a flying you-know-what about a "craftsmanship
> movement"
Why do corporations go 'agile'? Do they care about these movements in
development to some extent?
>
> - they do, however, give one about better project throughput and lower
> cost (as opposed to "price"). We have to make it real for *those*
> people, not just for us.
This is the big question to me. How do we convince an organization
that quality = cheaper? That the quality of code will have a direct
effect on the price of the project?
Paul
Similarly, how do we convince an organization that attracting and
growing 3-4 craftsmen and paying them $500k per year is both cheaper
and more productive than managing 16 specialists and paying them $1M
per year?
The way that I'm currently convincing organizations of this is by
creating small teams of craftsmen who deliver value consistently to
customers.
> My view is that a professional simply does the best job possible in the
> environment where they work. If you are in corporate IT, you do the best
> job you can there. If you are in a boutique, you do the best job you can do
> there. Moving from one to the other does not necessarily make you
> a craftsman. Indeed, there is something to be said for doing it where its
> *hard* as opposed to where it's easy. Which is the greater craftsman: the
> one who can hold his/her disciplines under adversity, or the one who faces
> no adversity?
A fine question. Perhaps this group doesn't include a large number of
people willing to act as pioneers in this regard -- or perhaps they
have been suffering in relative silence and want a break to spread
their wings and fly.
> In any case, secessionism is the wrong attitude. Our goal ought to be
> "change from within" not "revolution".
I don't think people here have "the wrong attitude". Why can't we do
both? Maybe we have been doing both.
--
J. B. (Joe) Rainsberger :: http://www.jbrains.ca
Diaspar Software Services :: http://www.diasparsoftware.com
Author, JUnit Recipes
2005 Gordon Pask Award for contribution to Agile practice
Register for Agile 2009 at http://www.agileregistration.org
> My colleagues in Switzerland who do Lean consultancy (I mean proper
> lean Lean, in factories that make things with production lines) say:
> "cut costs to increase quality". Only they say it in German, which is
> much more impressive.
But they say it in Swiss German, which makes it less impressive.
Austrian German would sound better.
> Similarly, how do we convince an organization that attracting and
> growing 3-4 craftsmen and paying them $500k per year is both cheaper
> and more productive than managing 16 specialists and paying them $1M
> per year?
Forget convincing. Just do it and still charge them $16M. Pocket the profit.
> The way that I'm currently convincing organizations of this is by
> creating small teams of craftsmen who deliver value consistently to
> customers.
Aha. You already get it.
UB,

If you are working in an environment hostile to craftsmanship, I can see how it can be tempting to leave and find work in an environment that rewards good code and personal relationships (some small boutique software shops fit that bill better than a larger organization when it comes to developing as craftsmen).

Is it the professionalism of the individual to stick around and make the situation better for writing high-quality code? Or is the corporation responsible for retaining and empowering those who care about quality?