RESOLVED: Using ORMs leads lazy programmers
to make bad database designs. It's better to
carefully design your database with no invisible
means of support, and there is no reason not to
use SQL directly for this purpose.
FOR EXAMPLE: Consider blogging. The most
successful blog software is WORDPRESS. Here
is the WordPress data model:
Beautiful, isn't it? It was designed by people who
thought about what they were doing and did it carefully.
Now let's look at the Sakai Blogger tool data model
(as reverse engineered by someone who had to
fix a bug -- there actually was no data design created
by the implementers):
How did the above happen? I suspect someone opened
up Eclipse and started typing, relying on the Hibernate
ORM to handle all the database stuff automagically. The
result is a massive headache for people like me. Another
one. I routinely open up the mysql prompt and start typing
"show tables", "describe table blah", "select * from blah limit 10"...
trying to figure out WTF Hibernate did with my data.
Occasionally I fantasize about making a non-trivial change
to one of these programs, but I strongly resist going further
than that because the ORM meatgrinder makes it somewhere
between extremely unpleasant and impossible to make any
non-trivial changes to a non-trivial program, especially after
it has been populated with data.
Ok, I feel a little better now. Maybe I should get back to work.
-- Aaron Watters
It has been said that democracy is the worst form of government except
all the others that have been tried.
-- Sir Winston Churchill
What I am mostly allergic to is manipulating SQL queries as strings and
result sets as lists of tuples. I strongly prefer a higher level
representation of both the queries and the resultsets. From this POV,
SQLAlchemy or Django's ORM are better than the primitive scheme above -
and don't have that much impact on db design.
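The difference is easy to see in a sketch. The following toy query builder is a hypothetical illustration (it is NOT SQLAlchemy's or Django's actual API) of what "queries as objects rather than strings" buys you: conditions can be appended dynamically, and values never get pasted into the SQL text.

```python
# Illustrative sketch only: a minimal query builder showing queries as
# composable objects instead of hand-concatenated strings.

class Select:
    """Builds a SELECT statement from composable parts."""
    def __init__(self, table, columns=("*",)):
        self.table = table
        self.columns = list(columns)
        self.conditions = []  # list of (column, operator, value)

    def where(self, column, op, value):
        # Returns self so conditions can be chained or added at runtime.
        self.conditions.append((column, op, value))
        return self

    def sql(self):
        # Render to SQL with placeholders; values stay separate, so no
        # string interpolation and no injection risk from values.
        query = "SELECT %s FROM %s" % (", ".join(self.columns), self.table)
        if self.conditions:
            clauses = ["%s %s ?" % (c, op) for c, op, _ in self.conditions]
            query += " WHERE " + " AND ".join(clauses)
        params = [v for _, _, v in self.conditions]
        return query, params

q = Select("posts", ["id", "title"]).where("status", "=", "published")
q.where("author_id", "=", 42)  # appended based on runtime logic
print(q.sql())
```

With raw strings, the same runtime-conditional WHERE clause means fiddly `" AND "` bookkeeping by hand; here it falls out of the data structure.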
> RESOLVED: Using ORMs leads lazy programmers
> to make bad database designs.
to me, a "lazy" programmer is someone who is mostly concerned with not
repeating themselves. As such, the "lazy" usually end up with better
designs (db or whatever) !-)
> It's better to
> carefully design your database with no invisible
> means of support and there is no reason to not
> use SQL directly for this purpose.
I don't use SQL to *design* my databases, I use SQL (directly or
indirectly) to create and manage my schemas and data.
For the design part, I usually start with a pencil and some paper.
(snip remaining rant).
I think your example is nonsense. Just comparing the two models based on
"they both are for blogging" is like comparing a Cessna to a 747 - yes,
both fly, but that's pretty much all they have in common.
It is pretty clear that sakai's data-model caters to a very
sophisticated *user*-model, with roles and permissions, and whatnot.
Plus other features. So it appears to be more in the CMS-league.
Now it's obviously easy to argue that this isn't needed for a simple
blog. Nonetheless, it doesn't make a point about ORM. With any ORM I
know (both in Java and Python) you could design the simple and straight
model WP uses.
And I've seen my fair share of convoluted, trashy pure-SQL-based DBs as
well, where the whole thing looks like a dog's breakfast.
So I think to make your point you need some more convincing arguments.
> Now it's obviously easy to argue that this isn't needed for a simple
> blog. Nonetheless, it doesn't make a point about ORM. With any ORM I
> know (both in Java and Python) you could design the simple and straight
> model WP uses.
> And I've seen my fair share of convoluted, trashy pure-SQL-based DBs as well.
Yup, me too. Many such designs make mistakes like using multiple columns
(or, even worse, comma-separated values) instead of many-to-many
relationships. Sad how such elegant tools are so badly abused so often.
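As a sketch of the difference (table and column names invented for illustration), the stdlib sqlite3 module is enough to show why a junction table beats a comma-separated "tags" column:

```python
# Hedged sketch: a proper many-to-many relationship via a junction table,
# contrasted (in the comment at the end) with CSV-in-a-field.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The many-to-many way: a junction table relates posts to tags.
cur.executescript("""
    CREATE TABLE post (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE tag  (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE post_tag (
        post_id INTEGER REFERENCES post(id),
        tag_id  INTEGER REFERENCES tag(id),
        PRIMARY KEY (post_id, tag_id)
    );
""")
cur.execute("INSERT INTO post VALUES (1, 'Hello')")
cur.executemany("INSERT INTO tag(name) VALUES (?)", [("python",), ("sql",)])
cur.executemany("INSERT INTO post_tag VALUES (?, ?)", [(1, 1), (1, 2)])

# "Find posts tagged 'sql'" is now a plain join the planner can optimize:
rows = cur.execute("""
    SELECT post.title FROM post
    JOIN post_tag ON post_tag.post_id = post.id
    JOIN tag ON tag.id = post_tag.tag_id
    WHERE tag.name = 'sql'
""").fetchall()
print(rows)  # [('Hello',)]

# A tags column holding 'python,sql' would instead force LIKE hacks that
# can't use indexes and misfire on substrings ('sql' matching 'nosql').
```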
> So I think to make your point you need some more convincing arguments.
It seems to me that the biggest sin in databases is a failure to use
rigorous design techniques. If somebody doesn't understand relational
theory then they will probably not design database representations that
are easy to work with. If they do understand then the designs they
produce will operate just as easily with object relational mappers as
they do with direct SQL.
I suspect Aaron was really complaining about ignorance rather than
laziness (though the two do sometimes go together, and the former is
often but by no means invariably caused by the latter). But he'd have to
confirm or deny that.
Where does one find out about this? I've somehow managed to avoid
it for an awfully long time, but it begins to look useful. Thanks.
The works of C. J. Date, especially (in my view) “Database In Depth”,
and the further reading recommended at the Wikipedia article on the
relational model, provide solid groundings in the topic for programming
practitioners.
In learning about the relational model, you'll inevitably learn that SQL
is rather flawed in its implementation of that model. Nevertheless, it
turns out to be the best (largely because most widespread) language for
describing a relational database, *if* used with knowledge of the
relational model to avoid those implementation flaws.
\ “As scarce as truth is, the supply has always been in excess of |
`\ the demand.” —Josh Billings |
Go to the source: "The relational model for database management" by E. F. Codd
Yeah sure, whatever. I'm sure a good programmer could use sql
directly and produce a tighter, faster, better-performing application
than an ORM-solution, same as you could use C to produce a tighter,
faster, better-performing application than a pure Python one.
No thanks, you can write your databases in database assembly language
if you want, I'll stick to my ORM, tyvm.
Isn't WordPress written in PHP? Are ORMs even possible in PHP? I can
almost rationalize use of direct sql if the alternative is some
hellspawn PHP ORM.
It's certainly way easier to write good SQL than to write a good C
program, and even when using an ORM, it's sometimes worth "going down"
to hand-written SQL queries for some more or less complex cases. But I
guess this falls in the same category as recoding a couple of core
functions in C for improved performance.
> Isn't WordPress written in PHP? Are ORMs even possible in PHP?
There are (alas) some attempts. Nothing close to SQLAlchemy nor even
Django's - PHP is just not hi-level enough.
> I can
> almost rationalize use of direct sql if the alternative is some
> hellspawn PHP ORM.
What _could_ possibly be done in PHP would at least be a higher-level
representation of SQL expressions, which would make it easier to
dynamically build / modify / combine queries. Working with "SQL as raw
strings" is both tiresome and error-prone when it comes to dynamically
generating complex queries. That part is IMHO *much* more important than
mapping tuples to objects.
FWIW, sakai seems to be a wiki, not a blog. And I would add that not
having a nice graphical *representation* of a schema doesn't imply the
schema is wrong from a relational design POV.
I recently had the uncomfortable task of picking an ORM to use in a
PHP project for a startup.
There's something called RedBean http://www.redbeanphp.com/ that's
actually pretty effin sweet (IMHO). Sweet enough that I'm considering
porting it to python.
It's more of a RAD tool than something you'd want to use in production.
As far as the OP rant goes, my $0.02: bad programmers will write bad
code in any language, with any tool or system or environment they're
given. If you want to avoid bad code there's (apparently) no
substitute for smrt programmers who are familiar with the tools
they're using, not just the syntax but the underlying conceptual
models as well.
That said, "I feel ya bro'"
BTW, the comma-separated-values-in-a-field is officially called the First
Anormal Form. There *has to be* some value to it, since I've seen it used
quite a few times...
Just because you've seen something, doesn't mean it has value; just
because something has value, doesn't mean the value outweighs the cost.
I'm not saying you're wrong, just that you may not be right. ;-)
The setting of my irony poti must've been too low... What I really meant to
say was: First Anormal Form is despicable, and anyone who uses it
should Rot In Hell For All Eternities To Come! Really! :-)
The proliferation of tables required for the pure relational approach
is problematic enough that postgresql supports array-valued columns.
That comes across to me as an acknowledgement that the benefits of
"CSV in a field" are worth supporting in a somewhat less kludgy way.
If I had known what First Anormal Form was, I (hope!) would have seen the
irony. But it was still fun to wax philosophical for a moment. ;-)
It appears to be a made-up term.
This refers to the Normal Forms one goes through when normalizing a
database schema.
The First Anormal Form (FAN) means just lumpin' data together in a comma
separated string that has to be un-lumped in a program. When there is
more than one such program operating on the same FAN-field you can bet
they interpret "comma separated" differently, or one wants the comma
separated values sorted and gets confused when the other program inserts
values in arbitrary order, or, if the one program expects ints, and the
other inserts floats, maybe not even using dotted notation, but
commas... you get the picture. In summa: FAN is evil.
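A minimal sketch of that failure mode (the field contents are invented for illustration): two readers of the same comma-separated field, written with inconsistent spacing and types, disagree about what it contains.

```python
# Two consumers of the same "FAN" field interpret "comma separated"
# differently, exactly the hazard described above.
raw = "3, 14,1.5"  # one writer added a space, another wrote a float

# Program A: naive split on ',' and expects ints.
values_a = raw.split(",")  # ['3', ' 14', '1.5']
try:
    ints = [int(v) for v in values_a]  # int('1.5') raises ValueError
except ValueError:
    ints = None  # Program A just broke

# Program B: splits on ', ' instead -- no error, silently wrong data.
values_b = raw.split(", ")  # ['3', '14,1.5']

print(values_a, values_b, ints)
```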
I read it somewhere once, I just can't find or even remember the source.
I definitely didn't make it up, though I wish I had.
I found exactly one google hit for it, which is this clpy thread.
I believe he mistyped it. Try "First Abnormal Form":
"I have come to believe that the whole world is an enigma, a harmless enigma
that is made terrible by our own mad attempt to interpret it as though it had
an underlying truth."
-- Umberto Eco
Ah, that makes sense. Article is subscriber-only but would seem to
explain the situation.
The word “anormal” appears to have been made up by you.
The negation of the word “normal” is “abnormal”, perhaps you meant
“First Abnormal Form”? That term was in use when E. F. Codd was
originally describing the normal forms, and seemed to imply a database
that was in even worse shape than 0NF.
More recently, this witty commenter at Daily WTF has defined a plausible
(though certainly not universally-agreed) set of abnormal forms
\ “Nothing so needs reforming as other people's habits.” —Mark |
`\ Twain, _Pudd'n'head Wilson_ |
Maybe my English (and my memory) is just not so good. I'm German, and
here "abnormal" and "anormal" are both negations of "normal", but with a
slight difference in meaning. "anormal" means just "not normal", whereas
the meaning of "abnormal" is more like "perverted". That's of course the
better word for the case at hand.
> That term was in use when E. F. Codd was originally describing the normal
> forms, and seemed to imply a database that was in even worse shape than 0NF.
> More recently, this witty commenter at Daily WTF has defined a plausible
> (though certainly not universally-agreed) set of abnormal forms
That's better. I'll keep googling though, because I distinctly remember,
if nothing else, that I have read "First Abnormal Form" being used for
the comma separation stuff.
From my understanding, the prefix "ab-" comes from Latin (away / away
from), which can also be shortened to "a-" in some usages.
The prefix "an-", which is more commonly shortened to "a-", comes from the
Greek, meaning "without" (e.g. anaerobic).
I agree that "abnormal" would be the better term here, as the field does/can
carry normalisation information, just not in a standardised manner.
> Occasionally I fantasize about making a non-trivial change
> to one of these programs, but I strongly resist going further
> than that because the ORM meatgrinder makes it somewhere
> between extremely unpleasant and impossible to make any
> non-trivial changes to a non-trivial program, especially after
> it has been populated with data.
Object-relational mappers are like putting lipstick on a pig:
Cute, but wrong. Using ORMs is better than using "Object databases".
In my case I use Python to un**** data created by java/hibernate.
If I was using a java based "object database" I would be simply stuck.
At least if you use an ORM you have a way to access the information
without writing a program in the programming language that the
ORM was defined in. Anyway, thanks for all the great comments on
this thread from all you Sarcopterygii and Haplorrhini out there.
-- Aaron Watters
SQL is the worst possible data interface language
except for all the others. -- Churchill (paraphrased)
> On Oct 16, 10:35 am, mario ruggier <mario.rugg...@gmail.com> wrote:
>> On Oct 5, 4:25 pm, Aaron Watters <aaron.watt...@gmail.com> wrote:
>> > Occasionally I fantasize about making a non-trivial change
>> > to one of these programs, but I strongly resist going further
>> > than that because the ORM meatgrinder makes it somewhere
>> > between extremely unpleasant and impossible to make any
>> > non-trivial changes to a non-trivial program, especially after
>> > it has been populated with data.
>> Object-relational mappers are like putting lipstick on a pig:
>> http://gizmoweblog.blogspot.com/2006/10/putting-lipstick-on-pig.html
>> m ;-)
> Cute, but wrong. Using ORMs is better than using "Object databases".
> In my case I use Python to un**** data created by java/hibernate.
> If I was using a java based "object database" I would be simply stuck.
> At least if you use an ORM you have a way to access the information
> without writing a program in the programming language that the
> ORM was defined in. Anyway, thanks for all the great comments on
> this thread from all you Sarcopterygii and Haplorrhini out there.
Data persistence isn't a "one-size fits all" problem. It really depends
on the needs of the system. Object databases solve the problem of
storing complex object graphs, deeply nested data structures, and
serializing language-specific objects like continuations or
what-have-you (but I think that last one is yet unsolved). We all know
what RDBMSs are good for. Neither is perfect for solving every data
persistence problem.
It's a pun on First Normal Form. To transform a schema into First Normal
Form you remove repeating groups from the entity and place them in a
newly-created entity, leaving a copy of the identifier column behind to
express a relationship between the two.
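For instance (schema invented for illustration), that 1NF step looks like this with the stdlib sqlite3 module: a repeating phone1/phone2/phone3 group moves into its own table, with the person's identifier copied down to express the relationship.

```python
# Hedged sketch of the 1NF transformation described above.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Before (not 1NF): the repeating group lived as numbered columns,
#   person(id, name, phone1, phone2, phone3)
# After (1NF): one row per phone, keyed back by the person's identifier.
cur.executescript("""
    CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE phone  (
        person_id INTEGER REFERENCES person(id),
        number    TEXT
    );
""")
cur.execute("INSERT INTO person VALUES (1, 'Ada')")
cur.executemany("INSERT INTO phone VALUES (1, ?)",
                [("555-0100",), ("555-0101",)])

# Any number of phones per person now, with no schema change and no NULLs.
count = cur.execute(
    "SELECT COUNT(*) FROM phone WHERE person_id = 1").fetchone()[0]
print(count)  # 2
```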
> Simon Forman wrote:
>> As far as the OP rant goes, my $0.02: bad programmers will write bad
>> code in any language, with any tool or system or environment they're
>> given. If you want to avoid bad code there's (apparently) no
>> substitute for smrt programmers who are familiar with the tools they're
>> using, not just the syntax but the underlying conceptual models as
> Hear, hear!
That's all very well, but some languages and techniques encourage the
programmer to write bad code.
That's just BS.
Bad code doesn't just write itself. Programmers write bad code. And
ignorance is not an excuse.
Just because a language allows a programmer to write sloppy code doesn't
put the language at fault for the bad code programmers write with it.
Any half-way decent programmer should be cognisant of when they're
writing bad code and when they're writing good code. They should be
able to admit that they don't know enough about a language to be writing
programs for money in it. They should be able to see anti-patterns and
areas of their code that should be re-factored or re-written.
The real underlying problem is the human characteristic that allows us
to let ourselves believe that we're better than everyone else or, more
simply, better than we really are.
>>> Hear, hear!
>> That's all very well, but some languages and techniques encourage the
>> programmer to write bad code.
> That's just BS.
> Bad code doesn't just write itself. Programmers write bad code. And
> ignorance is not an excuse.
> Just because a language allows a programmer to write sloppy code doesn't
> put the language at fault for the bad code programmers write with it.
Okay, as long as you realize the corollary of your argument is:
It is impossible for a language to encourage programmers to write good
code and promote good programming practices by design.
I'm not sure that's entirely true either.
I think python's "one way to do something" design philosophy goes some
way toward that, as does Smalltalk's enforced message passing. I think
PHP's superglobals and namespacing encourage bad practices (or used to
back in the day), as do Basic's GOTO and Ecmascript's prototype overriding.
Surely a language CAN be said to encourage kludges and sloppiness if it
allows a good way and a bad way and makes the bad way much easier to
implement or understand for noobs.
> J Kenneth King wrote:
>> Steven D'Aprano <st...@REMOVE-THIS-cybersource.com.au> writes:
>>> On Fri, 11 Dec 2009 19:20:21 -0500, Steve Holden wrote:
>>>> Hear, hear!
>>> That's all very well, but some languages and techniques encourage the
>>> programmer to write bad code.
>> That's just BS.
>> Bad code doesn't just write itself. Programmers write bad code. And
>> ignorance is not an excuse.
>> Just because a language allows a programmer to write sloppy code doesn't
>> put the language at fault for the bad code programmers write with it.
> Okay, as long as you realize the corollary of your argument is:
> It is impossible for a language to encourage programmers to write good
> code and promote good programming practices by design.
> I'm not sure that's entirely true either.
> I think python's "one way to do something" design philosophy goes some
> way toward that, as does Smalltalk's enforced message passing. I think
> PHP's superglobals and namespacing encourage bad practices (or used to
> back in the day), as do Basic's GOTO and Ecmascript's prototype overriding.
I think your corollary is slightly misleading.
It would be more apt to say, "Just because a language allows a
programmer to write good code doesn't mean that the language is
responsible for the good code programmers write with it."
It is the responsibility of the programmer to recognize the advantages
and flaws of their tools. PHP doesn't encourage a programmer to be a
bad programmer because it lacks name-spaces, any more than BASIC does
because it has GOTO statements. A bad programmer will be a bad programmer
because they don't understand what makes these features distinct, useful,
or dangerous.
The language doesn't encourage anything. It's just a medium, like oil
paints and canvas. A painting can be good or bad regardless of the medium
it is painted on. The skill of the painter is what matters.
> Surely a language CAN be said to encourage kludges and sloppiness if it
> allows a good way and a bad way and makes the bad way much easier to
> implement or understand for noobs.
The programmer can be encouraged to use kludges and produce sloppy
code. Whether by ignorance or inflated ego. Languages with more choice
just give them more rope to hang themselves with.
Technically, oil paints do encourage a certain kind of painting.
They can be layered on top of old paint easily, and they dry
slowly, allowing you to slowly "build up" a painting in layers,
and create effects with texture. If you try doing these things
with watercolors, you'll probably be discouraged.
I think a programming language does encourage a certain kind of
code. Good code in one language can be poor in another.
It's a weak analogy on my part, but I think I do understand what you
mean.
way to express it:
A language is a thing. It may have syntax and semantics that bias it
towards the conventions and philosophies of its designers. But in the
end, a language by itself would have a hard time convincing a human
being to adopt bad practises.
I believe it's the fault of the programmer who adopts those poor
practises. Surely their acceptance of GOTO statements and
prototype-overloading are signs of their own preferences and ignorance?
It suggests to me that they learnt enough of one language to get by and
stopped thinking critically as soon as they sat in front of their keyboard.
Throw an idiot behind a Python interpreter and it won't teach them a
damn thing unless they're capable of learning it on their own. No
matter how well you manage to hard code your conventions into the
language. Bad code is written by bad programmers, not bad programming
languages.
Perhaps someone should research whether, if you teach a language to kids,
where one group is taught the language filtered of "bad words" and another
group is taught all the language's "bad words" on purpose, one group will
have more behavioral problems than the other.
I would be curious to know, but the test is likely impossible without
trespassing on ethical boundaries. ;)
I would hypothesize that you would not find an increase in behavioural
problems, because:
a) Without cultural context "bad words" have little meaning
b) Behavioural issues can be attributed to several factors such as
physiology, health, environment, etc.
c) This has nothing to do with programming languages. A programmer that
lacks critical thinking is a bad programmer. The language they use has
no bearing on such human facilities.
The language may well have a bearing on the quality of the programs
generated though, which is what most people care about. A dolt writing
in python is far less likely to write a program that bluescreens the
user's machine than a comparative dolt writing the same program in C or
C++.
Of course two gurus writing in different languages would produce equally
good results, but gurus are considered gurus by virtue of their scarcity.
Back in the real world, the further into dolthood you venture, the more
important the design of the language becomes to the quality of the
outputs you can expect to get from your code monkeys.
Take 100 perfectly average programmers and give them the same programs
to write in a variety of languages, you will get higher quality results
from some languages than others i.e. not all languages are equal. I
think it's fair to say the ones that give the best results encourage
good coding and the ones that give the worst results encourage bad coding.
If you don't believe it's possible to have a language that encourages
bad coding practices, consider this one I just made up. I call it Diethon.
It's entirely the same as Python 2.6 except that any syntax errors that
happen within class definitions cause the interpreter to send offensive
emails to everyone in your contacts list and then delete your master boot
record.
Unsurprisingly, users of this language are reluctant to try to create
object oriented code and resort to ugly struct- and list-based paradigms
instead.
> A programmer that
> lacks critical thinking is a bad programmer. The language they use has
> no bearing on such human facilities.
That's nonsense, and I can demonstrate it by reference to a single
programming language, namely Python.
For many years, Python had no ternary if operator:
result = x if condition else y
Instead, the accepted, idiomatic Python way of writing this was to use
short-circuit booleans:
result = condition and x or y
However this idiom is buggy! If x is a false-value (say, 0) then result
gets set to y no matter what the value of condition.
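The failure is easy to reproduce; a minimal demonstration:

```python
# The old idiom vs. the real conditional expression. When x is falsey
# (here 0), `condition and x` evaluates to 0, so `or` falls through to y.
def buggy_pick(condition, x, y):
    return condition and x or y       # the pre-2.5 idiom

def correct_pick(condition, x, y):
    return x if condition else y      # ternary added in Python 2.5

print(buggy_pick(True, 0, 99))    # 99 -- wrong: we asked for x
print(correct_pick(True, 0, 99))  # 0  -- right
```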
This buggy idiom survived many years of Python development, missed by
virtually everyone. Even coders of the calibre of Raymond Hettinger (who
neither lacks critical thinking nor is a bad programmer) have been bitten
by it:
"The construct can be error-prone. When an error occurs it can be
invisible to the person who wrote it. I got bitten in published code
that had survived testing and code review: ..."
This is a clear and obvious case where a language feature (in this case,
the lack of a feature) encouraged an otherwise excellent coder to make an
error. It was a very subtle error, which was not picked up by the author,
the tests, or the code reviewer(s). Had Python been different (either by
including a ternary if statement, or by forcing and/or to return bools
only) then this bug never would have occurred.
Of course awful programmers will be awful programmers in any language,
and excellent programmers will be excellent programmers in many languages.
(I say "many" rather than any deliberately. There's a reason why nobody
uses languages like Brainf*ck, Whitespace, Ook or Intercal for real work.)
But most coders are neither awful nor excellent. The language DOES make a
difference: the quality of a technician depends partly on the quality of
his tools, and programmers are no different.
If you don't believe me, imagine writing code in a language without
functions or loops, so you have to use GOTO for everything.
All very true.
But did the lack of ternary encourage Raymond to become a bad
programmer?
That is what I believe the core of the argument is. Sure, the misfeature
was overlooked by Raymond, but it took him (and perhaps the help of
others) to recognize it and fix it. That's because he's human and the
language is inert. He is smart and obviously has the cognitive
capabilities to recognize that the language has to change in order to be
a better tool.
It would be a different story if he just assumed that the misfeature was
actually a feature and that it was a good thing. In such a case would
Python the language be at fault, or the people who write programs with it?
Good tools make all the difference in the world, I'm not arguing that.
Just that the tools don't use us; we use them. Programming in Python
doesn't instantly make me a better programmer. It can certainly make me
think of myself as a good programmer though... ;)
> Steven D'Aprano <st...@REMOVE-THIS-cybersource.com.au> writes:
>> On Mon, 21 Dec 2009 11:44:29 -0500, J Kenneth King wrote:
>>> A programmer that
>>> lacks critical thinking is a bad programmer. The language they use
>>> has no bearing on such human facilities.
>> That's nonsense, and I can demonstrate it by reference to a single
>> programming language, namely Python.
>> For many years, Python had no ternary if operator:
> But did the lack of ternary encourage Raymond to become a bad programmer?
No, but Raymond started off in a position of being an excellent
programmer. A single buggy idiom led him to be slightly-less excellent
than he otherwise would have been. How many buggy idioms would it take to
lead him to become a mediocre coder, if he was unable to change languages?
Because Python is generally an excellent language, the harm done by one
or two misfeatures is minor. But less excellent languages encourage
coding styles, techniques and idioms that encourage the programmer to
write poor code: either complicated, baroque, unreadable code; or slow
inefficient code; or buggy code. To avoid starting a flame war, I will
avoid mentioning PHP. *cough*
Sometimes you know what you need to do to write non-buggy code, but
because covering all the corners is just Too Damn Hard in a certain
language, you simply lower your expectations. Error checking is tedious
and hard to get right in some languages, like C and Pascal, and hence
even good programmers can miss some errors.
Different languages encourage different mind-sets in the programmer: C
encourages the coder to think at the low level of pointers and addresses,
and primarily about machine efficiency; Java encourages the use of big
object hierarchies and design patterns (it's hard to write lightweight
code in Java, so everything turns into heavyweight code); Perl encourages
cleverness and code-golf (writing a program in as few lines or characters
as possible); Haskell and Lisp encourage a heavily abstract approach that
often requires an elite coder to follow; Forth encourages you to think
in terms of the stack.
> Good tools make all the difference in the world, I'm not arguing that.
You appear to be arguing against that.
> Just that the tools don't use us; we use them.
Nobody said that tools use us.
> Programming in Python
> doesn't instantly make me a better programmer.
No, not instantly, but I would argue that after many years of coding in
Python you will be a better programmer than after the same number of
years of coding in PHP or Basic.
It also depends on what you mean by "better programmer". Some languages
value cleverness above all else. Python is not a language for writing
amazing, awe-inspiring hacks that work where nobody but the author can
work out why. This is why there is an Obfuscated C contest and an
Obfuscated Perl contest but no Obfuscated Python contest -- it wouldn't
be anywhere near as awe-inspiring.
So one might argue that the best C and Perl coders are better than the
best Python coders, but the average Python coder is better than the
average C and Perl coder.
(I suggest this as a hypothetical, and do not wish to defend it.)
This is only a bug if one expects otherwise.
>> This buggy idiom survived many years of Python development, missed by
>> virtually everyone.
The last statement is false. The hazard of using and/or was well-known
back in '97 or so when I discovered or learned it and I believe it was
mentioned in the FAQ entry on the subject. The new alternative has the
hazard that the condition and if-branch must be written and read in a
backwards order. I consider that buggy and do not use it for that reason.
Terry Jan Reedy
>>> Instead the accepted, idiomatic Python way of writing this was to use
>>> short-circuit booleans:
>>> result = condition and x or y
>>> However this idiom is buggy! If x is a false-value (say, 0) then
>>> result gets set to y no matter what the value of condition.
> This is only a bug if one expects otherwise.
I'm not saying the behaviour of `a and x or y` is buggy, but that its
use as a replacement for a ternary conditional expression is buggy; the
*idiom* is buggy, not the behaviour of and/or.
If I say "you can make perfect hard boiled eggs by putting the egg in a
glass of water in the microwave on high for eight minutes", and the egg
explodes, that's not a bug in the microwave, that's a bug in the recipe.
>>> This buggy idiom survived many years of Python development, missed by
>>> virtually everyone.
> The last statement is false. The hazard of using and/or was well-known
> back in '97 or so when I discovered or learned it and I believe it was
> mentioned in the FAQ entry on the subject.
We can argue about how well-known it was for somebody like Raymond
Hettinger to miss it, and for whoever did a code-review of his
application to also miss it.
> The new alternative has the
> hazard that the condition and if-branch must be written and read in a
> backwards order.
If you had asked me a couple of years ago, I would have agreed, but I've
now come to the conclusion that `x if condition else y` is not only
perfectly natural, but at least as natural as the conventional order of
`if condition then x else y` (at least for expressions, not for if
statements):
"Steven, what are you doing on Monday night?"
"Going to the movies if I can get away from work on time, otherwise
sitting at home answering questions on comp.lang.python."
Oh really? I thought putting the conditional in the middle was
ingenious; whoever thought of that deserves the next Turing award!
I always feel there's something wrong with the (condition ? true : false)
or (if condition then true else false) expressions found in other
languages; and I just realized it was because of their unnatural ordering.
I have to admit the reversed ordering initially confounded me, but
afterward it felt even more natural than the traditional one.
If anyone continues to follow bad idioms without questioning their
usefulness from time to time, I'd question their ability as a
programmer. Critical thinking is important. Which is why good programs
can be written in PHP, Forth, Lisp, Perl, and anything else. However,
if a programmer thinks the only language they will ever need to know is
BF, they have a serious screw loose. ;)
>> Good tools make all the difference in the world, I'm not arguing that.
> You appear to be arguing against that.
Maybe you need to reconsider my arguments.
It takes a good programmer to recognize the values and trade-offs of the
tools they work with.
Ignorance is not an excuse to blame the language. It's too easy to say,
"Well Perl sucks because it encourages you to be a bad programmer
because it has all these features that let you shoot yourself in the
foot." In reality, lots of really great programs are written in Perl
all the time, and some very smart people write them. It just so happens
that in the hands of the educated, those very features are useful in certain situations.
Python doesn't "encourage" you to be a better programmer. It just
enforces particular idioms and conventions in its design. As long as
the ignorant programmer follows them they should be better off. Yet if
they are ignorant, no amount of encouragement will get them to think
critically about Python and find bugs in it. They will have to rely on
the community of developers to do that thinking for them.
>> Just that the tools don't use us; we use them.
> Nobody said that tools use us.
But it is being suggested that they influence our thinking.
Pretty smart thing for a language to be able to do.
>> Programming in Python
>> doesn't instantly make me a better programmer.
> No, not instantly, but I would argue that after many years of coding in
> Python you will be a better programmer than after the same number of
> years of coding in PHP or Basic.
And my argument is that the human element is what will determine who is the better programmer.
There are good programmers who can program in PHP. Some of the biggest
websites on the Internet are programmed in it. And like any language
I'm sure it has a good number of inefficiencies and bad design decisions
that the programmers using it had to work around. Yet despite it being
a poor language in your opinion, they built successful programs with
it. I wouldn't feel right calling them bad programmers.
(large portions of Facebook and Flickr, for example, are written in
PHP. They used to be written entirely in PHP before migrating the
bottlenecks out to lower-level languages as they scaled up... as is
common in most high-level languages)
> It also depends on what you mean by "better programmer". Some languages
> value cleverness above all else. Python is not a language for writing
> amazing, awe-inspiring hacks that work where nobody but the author can
> work out why. This is why there is an Obfuscated C contest and an
> Obfuscated Perl contest but no Obfuscated Python contest -- it wouldn't
> be anywhere near as awe-inspiring.
> So one might argue that the best C and Perl coders are better than the
> best Python coders, but the average Python coder is better than the
> average C and Perl coder.
> (I suggest this as a hypothetical, and do not wish to defend it.)
I should hope not. ;)
Particularly because people often go out of their way to write clear,
concise, and maintainable Perl and C code every day.
In many contexts I'm sure there is reason to use Perl instead of Python
just as there are situations where C is more appropriate than either.
However, the mark of a poor programmer in my line of reasoning is one
who cannot recognize such distinctions.
One must be aware of the benefits and short-comings of their tools. If
your tools influence the way you think then you are being ignorant of
this principle. And I would suggest that makes you a poor programmer.
Perhaps "influence the way you think" is not the right way to phrase
it... how about "be the tool" ;)
We have all seen the struggles that newcomers to a language go through
as they try (or don't try) to adjust their thinking to the tool at hand
-- programming Java, BASIC, FORTRAN, or xyz in Python. Even now Phlip
is raging against exceptions and the very Zen of Python.
Converting FoxPro to Python is an interesting exercise for me --
version 6 at least doesn't have many of the cool things that Python
does, and consequently thinking in Python while writing FoxPro (when I
have to) is extremely frustrating; going the other way is a bit of a
challenge also, although much more rewarding.
For a more concrete example, take sail-boats and speed-boats: you get
used to the speed boat, its quick handling and sharp turns... then you
take a sail boat out for a cruise -- if you don't adjust your thinking
from speed to sail, you could very well end up on the rocks.
To sum up: I agree with your "poor programmer" line of reasoning in the
second paragraph above, but not with the "tools influencing the way we
think" line of reasoning -- while I am cognizant of Python's
shortcomings, I am very much enjoying the changes in my thinking the
more I use it.