Which are the classic books in computer science which one should
peruse?
I have (a) Code Complete (b) GOF (c) The Art of Computer Programming.
The Art of Computer Programming was too tough for me - I couldn't
understand much. The other two were good books - I understood and
implemented quite a bit from both.
What are the other books I should peruse?
Regards
K
Software Tools by Kernighan and Plauger
Mythical Man Month by Brooks
Malcolm
Code Complete and GOF are software engineering books, not really
CS books. TAOCP is a CS book, but a bit old-fashioned. Other classics:
Introduction to Algorithms by Thomas H. Cormen, Charles E. Leiserson,
Ronald L. Rivest, and Clifford Stein.
Structure and Interpretation of Computer Programs by Harold Abelson
and Gerald Jay Sussman (online at mitpress.mit.edu/sicp)
I understand and concur. Since I am a software engineer - coming into
software from a different background - what I am looking for is self-
improvement books for a software engineer. These can include both CS
and software books - even though I found CS books much less
understandable to me :-)
> Hi all,
> I do understand that this is not a python question and I apologize
> for that straight up.
> But I am a full time follower of this group and I have seen very
> very brilliant programmers and solutions.
> I also want to be a good programmer
The best way to become a good programmer is to program. Write a lot of
code; work on some large projects. This will improve your skill more than
anything else. It's also important to learn new languages regularly. I
recommend learning C, Python, and Lisp first.
> Which are the classic books in computer science which one should
> peruse?
A list of some good books is at
steve.yegge.googlepages.com/ten-great-books. Also read programming blogs.
--
Nathan Stoddard, http://nathanstoddard.com
>
>> Code Complete and GOF are software engineering books but not really
>> CS books.
>
> I understand and concur. Since I am a software engineer - coming into
> software from a different background - what I am looking for is self-
> improvement books for a software engineer.
In that case The Mythical Man-Month (Brooks) is a must.
--
Rhodri James *-* Wildebeest Herder to the Masses
> The Mythical Man-Month (Brooks) is a must.
What's amazing about this book is just how relevant it is today, 35 years
after it was written. Some of the technical details have changed (how many
of us still keep our project notes on microfiche?), but cross out
"microfiche" and write in "wiki" and what he's saying is just as valid
today. It's not about computer science. It's not really even about
software engineering. It's more about general project management than
anything else.
In the same vein, Death March, by Ed Yourdon.
I've been wanting to read "Antipatterns".
Thank you Rhodri.
I do have Mythical Man-Month - a great book indeed.
I was looking for more technical books -
I have now got a good set - putting it down here so that others can
also use it:
Code Complete,
GOF,
Mythical Man-Month,
SICP - Thank you Paul - I have downloaded it from the web site,
Introduction to Algorithms - I have placed an order for it,
The Pragmatic Programmer - Planning to buy,
Refactoring: Improving the Design of Existing Code - again planning to
buy,
The C Programming Language - I had this, lost it, now I will buy it
again,
The Little Schemer - I am not sure about buying this - I don't know
Scheme,
Software Tools - Seems to be a classic - not sure whether I will buy.
Regards
K
Dijkstra's writings -- http://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html
-- are not kind to software engineers (his definition of software
engineering: "How to program if you cannot").
Seemingly irrelevant: good programmers are very good with their
editors. Someone mentioned Yegge -- read him for inspiration on
Emacs. Of course you can use something else, but it's important to get
good at it.
> I do have Mythical Man-Month - a great book indeed.
> I was looking for more technical books ...
No-one has mentioned Andrew Tanenbaum's "Computer Networks". So much of
programming seems to involve networking these days, I think the current
(4th) edition is a great introduction to how to think in network terms,
particularly about security issues.
I didn't think that was so great. It had a lot of hype, which led me to
believe it would be something wonderful, but I wasn't so impressed.
Hmm, good to know. Thanks.
> The best way to become a good programmer is to program. Write a lot of
> code; work on some large projects. This will improve your skill more than
> anything else.
I think there are about 100 million VB code-monkeys who prove that theory
wrong.
Seriously, and without denigrating any specific language, you can program by
(almost) mindlessly following a fixed number of recipes and patterns. This
will get the job done, but it won't make you a good programmer.
--
Steven
I bought it but couldn't get into it. Light on meat, heavy on boredom
(for me - these things are always somewhat subjective).
These are my top two recommendations for people who can already code a
bit, but who want to get really really good. The first few chapters of
Refactoring hold the key insights, the rest is examples.
> The Little Schemer - I am not sure about buying this - I don't know
> Scheme
If you want to learn functional programming, that's excellent.
> Nathan Stoddard wrote:
>
>> The best way to become a good programmer is to program. Write a lot of
>> code; work on some large projects. This will improve your skill more than
>> anything else.
>
> I think there are about 100 million VB code-monkeys who prove that theory
> wrong.
Really? So you don't think that the best way to get good at something
is to practice? I think I'm paraphrasing Richard Feynman here, but the
only way to truly understand something is to do it.
Obviously a bit of guided learning is a major boon, but you can't beat practice.
> Which are the classic books in computer science which one should
> peruse?
Having read this discussion up to now, I'd recommend you read code
written by good programmers.
Christof
> I think there are about 100 million VB code-monkeys who prove that theory
> wrong.
>
> Seriously, and without denigrating any specific language, you can program by
> (almost) mindlessly following a fixed number of recipes and patterns. This
> will get the job done, but it won't make you a good programmer.
When Dijkstra was asked which programming language to learn next, he
would typically recommend Latin :-)
> Really? So you don't think that the best way to get good at something
> is to practice? I think I'm paraphrasing Richard Feynman here, but the
> only way to truly understand something is to do it.
> Obviously a bit of guided learning is a major boon, but you can't beat practice.
For every one Horowitz there are a thousand wannabes thumping on the
piano trying to become Horowitz.
The traction that practice gives is maximal only in the beginning.
> On 2009-06-14 14:04:02 +0100, Steven D'Aprano
> <st...@REMOVETHIS.cybersource.com.au> said:
>
>> Nathan Stoddard wrote:
>>
>>> The best way to become a good programmer is to program. Write a lot of
>>> code; work on some large projects. This will improve your skill more
>>> than anything else.
>>
>> I think there are about 100 million VB code-monkeys who prove that theory
>> wrong.
>
> Really? So you don't think that the best way to get good at something
> is to practice?
Shame on you for deliberately cutting out my more serious and nuanced answer
while leaving a silly quip. As I went on to say:
"... you can program by (almost) mindlessly following a fixed number of
recipes and patterns. This will get the job done, but it won't make you a
good programmer."
There are huge numbers (millions?) of lousy programmers who program every
single day and never become good programmers. "Practice makes perfect" only
works for mechanical skills and rote learning, neither of which are
especially applicable to good programming. (Although rote learning is
helpful for reducing the time taken to look up syntax and library
functions.) Without some level of understanding and creativity, as soon as
you hit a problem that can't be efficiently solved by one of the patterns
or recipes you've learned, you're in trouble.
All the practice in the world won't give you the discipline to write
appropriate comments, or to test your code thoroughly. Practice won't
*necessarily* make you creative -- you can't be creative in a field you
know nothing about, but having learned the language and the libraries
doesn't necessarily mean you can apply the tools to solve novel problems.
Many programmers know a few good tricks, and try to hammer every problem
into a form that can be solved by one of the few tricks they know, no
matter whether it is appropriate or not. Witness how many people try to
write regexes to parse bracketed expressions, a problem which requires a
proper parser.
(This is not necessarily a bad thing, but it often is.)
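For instance (a throwaway sketch of mine, not anyone's library code): a
simple counter already handles arbitrary nesting, which no single plain
regex can do:

    def balanced(text):
        # Track nesting depth -- the state a plain regex doesn't have.
        depth = 0
        for ch in text:
            if ch == '(':
                depth += 1
            elif ch == ')':
                depth -= 1
                if depth < 0:   # a ')' with no matching '('
                    return False
        return depth == 0

    assert balanced("(a (b) c)")
    assert not balanced("(a (b c")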
You could write a piece of code like:

    s = ""
    for word in some_data:
        s += " " + word
a thousand times a day, and *never* learn that this is Bad Code, because you
never profile it with more than a few thousand words and so never discover
that it's O(n**2). Eventually when it gets released into the real world,
somebody reports that it takes eight hours to process 100MB of words, and
then *some other guy* re-writes your code to use s = " ".join(words), and
you remain in blissful ignorance, happily writing your bad code every
single time.
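A quick-and-dirty timeit comparison (my sketch; numbers will vary by
machine and Python version, and CPython can sometimes optimise the +=
case away):

    import timeit

    def concat(words):
        s = ""
        for word in words:
            s += " " + word    # each += may copy the whole string so far
        return s

    words = ["spam"] * 20000
    print(timeit.timeit(lambda: concat(words), number=10))
    print(timeit.timeit(lambda: " ".join(words), number=10))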
> I think I'm paraphrasing Richard Feynman here, but the
> only way to truly understand something is to do it.
An amazingly inappropriate quote for a *theoretical* physicist to have said.
Whether Feynman did or didn't say that, it's clearly untrue: many people do
without understanding. Many people can cook, some people are expert cooks,
but few people understand precisely what takes place when you cook food.
People can catch and throw balls, and have little or no understanding of
gravity, air-resistance, and the mechanics of their own bodies.
In fact... just before you hit Delete on this post, how about you explain
*how* you make your finger reach out and press the Delete key? You probably
move bits of your body a million times a day, and the chances are very high
that until now, you've never once noticed that you have no idea how you do
it. I think that simple fact blows out of the water the concept that doing
implies understanding.
> Obviously a bit of guided learning is a major boon, but you can't beat
> practice.
I didn't say that practice was useless. Arguably, it may even be necessary
to be a good programmer. (Although less so for an unsurprising language
like Python, where it is very common to write code which works correctly
the first time.) But practice is clearly not *sufficient* to be a good
programmer.
--
Steven
I think I can attest to that.
I was a programmer (in a low-level language) in a huge MNC code-monkey
shop for > 7 years.
I consider myself to be OK - not great, but not very poor either.
I had written a lot of code in those 7 years, but due to lack of
exposure and laziness, I never knew that I had to read books.
As I mentioned before, I come in from a non-computing engineering
degree, so I did not even know which books to read etc.
I had seen many of the frameworks written by others, and was extremely
impressed. I considered the people who wrote those frameworks to be
geniuses - until I accidentally came across a site where I read about
GOF.
I bought it and read it - and straight away understood that whatever I
had learned in the last 7 years, I could have learned most of it in 6
months, provided I had read the right books. All the frameworks were
just amalgamations of these patterns.
Now I voraciously purchase and read books - both in the domain of my
work and on computer science in general. Even though I am not going to
be recruited by Google any time soon :-), I think I have become a much
better programmer over the last year. I look at my code from a year ago
and I am horrified :-).
Practice makes perfect only if you push yourself - and you need
exposure to know that.
Vivaldi vs. Mozart
And the latter especially had definitely mastered his editor. Just think
of the sheer volume of the coding he managed during his short life.
Not many bugs either…
CJ
It depends on the book. Also, keep in mind that quite a few good
books these days are online where you can read them for free. SICP
(mentioned in my earlier post) is one of them.
> On 2009-06-14 14:04:02 +0100, Steven D'Aprano
> <st...@REMOVETHIS.cybersource.com.au> said:
>
>> Nathan Stoddard wrote:
>>
>>> The best way to become a good programmer is to program. Write a lot of
>>> code; work on some large projects. This will improve your skill more
>>> than
>>> anything else.
>> I think there are about 100 million VB code-monkeys who prove that
>> theory
>> wrong.
>
> Really? So you don't think that the best way to get good at something is
> to practice?
Self-evidently. If what you practice is bad practice, it doesn't matter
how much you practice it you'll still be no good at good practice in
practice. Practically speaking, that is :-)
> Graham Ashton wrote:
>
>> On 2009-06-14 14:04:02 +0100, Steven D'Aprano
>> <st...@REMOVETHIS.cybersource.com.au> said:
>>
>>> Nathan Stoddard wrote:
>>>
>>>> The best way to become a good programmer is to program. Write a lot of
>>>> code; work on some large projects. This will improve your skill more
>>>> than anything else.
>>>
>>> I think there are about 100 million VB code-monkeys who prove that
>>> theory wrong.
>>
>> Really? So you don't think that the best way to get good at something
>> is to practice?
>
> Shame on you for deliberately cutting out my more serious and nuanced
> answer while leaving a silly quip.
Can't have been very "serious and nuanced" if it could be summed up by such
a "silly quip" though, could it?
> Vivaldi vs. Mozart
>
> And the latter especially had definitely mastered his editor. Just think
> of the sheer volume of the coding he managed during his short life.
>
> Not many bugs either…
I thought Vivaldi did more. The style of music was such that they could
virtually sketch it out in shorthand, and leave it to the copyists to expand
to proper notation for the musicians to play. I imagine that it was also the
job of copyists to fix the typos.
In other words, high productivity was a direct consequence of adoption of a
cookie-cutter style.
If you are given to depression then programming is possibly not for you.
Keep tackling problems that are beyond your current capabilities.
Think about what you are doing... this is more subtle than you might grasp at first.
Know there is always a better way and seek it.
Prime your thinking by reading Edsger Dijkstra. Dijkstra's "Notes on Structured Programming" is a good read and is probably available on the 'Net by now. You will see that his concerns had practically nothing to do with "Goto Considered Harmful"; that was a later observation made as a result of checking students' programs. His notes were knocking around in the UK back in 1966 and earlier.
His co-authored book "Structured Programming" (O-J Dahl, EW Dijkstra, and CAR Hoare) is good reading even now. See http://www.cs.utexas.edu/users/EWD/transcriptions/transcriptions.html for his famous EWD series of notes.
Gain access to one of the IEEE or ACM web sites and their resources. I used to sneak into my local university library before the 'Net to read this stuff.
Beyond that I check up on the reading lists for CS students from time to time. This often throws up real gems and prevents me from being blind-sided.
Beware any "programmer" who only knows one computer language and only one OS, especially if this is either windows or a version of UNIX.
My 2c worth
>> Shame on you for deliberately cutting out my more serious and nuanced
>> answer while leaving a silly quip.
>
> Can't have been very "serious and nuanced" if it could be summed up by
> such a "silly quip" though, could it?
But it can't be summed up by the silly quip, which is why I'm complaining
that the silly quip on its own fails to include the more serious and
nuanced elements of my post.
--
Steven
Lots of references to "good programmer" but no attempt to define the term.
Who is the better programmer - one who writes lousy code but produces good programs
or one who obeys all the rules of coding but whose programs break all the time?
(Yes, I know there are two other categories!)
In almost 50 years of programming I have met all types, but I tended to judge them
by the end results, not by their style.
A programmer who just follows the recipes of the so-called "rules of
coding" is, as Steven says, a bad programmer. Unless you write a
program that works, you are not a programmer; once you've written one
that works, we'll see whether you're good or bad by your style.
Points are taken off when the so-called rules are followed mindlessly.
Bonus points if you can justify breaking the rules.
No points for a program that doesn't work.
For those who are not rich, MIT has put a lot of courseware on the web,
including in particular, CS, for free. And there is lots more put up by
professors and departments elsewhere. There are free language manuals
and interpreters/compilers for those who want to stretch their brain
that way.
Gries is a dyed-in-the-wool iterationist.
His cursory discussion of recursion is not worth much,
but he really knows iteration with while.
http://feeds2.feedburner.com/E-booksDirectory
http://www.freetechbooks.com/rss.php
warmest regards,
Aldo
The MIT Online Course Ware starts here:
http://ocw.mit.edu/OcwWeb/web/help/start/index.htm
I downloaded the Mathematics for Computer Science course: ~9MB. Looks
to be excellent!
Do you have links to share for the other materials?
Quoting Terry Reedy <tjr...@udel.edu>:
> Phil Runciman wrote:
>>
>> Gain access to one of the IEEE or ACM web sites and their resources.
>> I used to sneak into my local university library before the 'Net to
>> read this stuff.
>>
>> Beyond that I check up on the reading lists for CS students from time
>> to time. This often throws up real gems and prevents me from being
>> blind-sided.
>
> For those who are not rich, MIT has put a lot of courseware on the web,
> including in particular, CS, for free. And there is lots more put up
> by professors and departments elsewhere. There are free language
> manuals and interpreters/compilers for those who want to stretch their
> brain that way.
>
Much of the material is excellent, but IBM got into a huge mess with the 360. Brooks observed the failure from the inside and IMHO did a great job of describing it.
Project managers can never rescue stuffed concepts, especially if a lot of money has been spent! Such projects have momentum and roll over anyone who gets in the way.
Brilliant architects are worth their weight in gold. I believe that ICL's VME/B OS began as a skunk-works project.* It had such an architect. The latter was the official OS and was pretty good too. I think Warboys took over later, once VME/B became official... if anyone out there knows better then please let us know, and correct Wikipedia too. The Wikipedia item on VME is too sanitised for my taste. The "truth" is generally far more interesting.
If the software you are developing is going to be used by many people then remaining sharp and on top of your game is so important. Do not program if you are tired or you will spend your life debugging. ;-) I stop coding at 3pm for this reason. I come right again around 10pm!
Yes, despite the above, do read the book, but remember that among the content is a cautionary tale!
Ooops, the above is a bit away from Python. ;-)
Phil
*I was told this by the leader of an ICL research team, no less than Alan Sutcliffe himself... many years ago now (c. May/June 1970).
Aloha!
When people want to know what (good) Python code looks like, I usually
point them to Trac:
Trac is not only a good tool for development written in Python. Trac
also uses Trac to develop Trac (kudos for eating your own dog food), and
Trac allows easy browsing of the source code.
I still consider myself a Python n00b and my judgement might be all
wrong, but I believe that the Trac developers follow Pythonic code
rules, and the result is a prime example of what well-written,
well-documented Python code looks like.
Check for yourself though at:
http://trac.edgewall.org/browser/trunk/trac
--
With kind regards, Yours
Joachim Strömbergson - Always in harmonic oscillation.
========================================================================
Kryptoblog - IT security in Swedish
http://www.strombergson.com/kryptoblog
========================================================================
This is the best book ever written on computer science
and the first edition is free.
http://www.math.upenn.edu/~wilf/AlgComp3.html
-- Aaron Watters
http://aaron.oirt.rutgers.edu/myapp/amcharts/doc
===
less is more.
Thanks!
--
Aahz (aa...@pythoncraft.com) <*> http://www.pythoncraft.com/
"Many customs in this life persist because they ease friction and promote
productivity as a result of universal agreement, and whether they are
precisely the optimal choices is much less important." --Henry Spencer
> FWIW I actually dislike this book!
Why?
Because it reminds me of when things went badly wrong: IBM 360, von Neumann architecture, no hardware stacks ...
IMHO Burroughs and ICL had better approaches to OS design back then, but had fewer resources to develop their ideas.
However, mainly this period marked a transition from the excitement-and-discovery phase of computing to commercial power plays and take-overs. The best ideas in a field tend to get lost in the melee of competition. Early computers were rooted in academia and there was a lot of cross-fertilisation of ideas and approaches. IMHO commerce affected layers of the stack where it had no useful contribution to make. Vertical integration warred against sound architecture.
The book has an important message and I recommend that people read it. The book is to me, and possibly only me, an icon representing when things went wrong.
To become very good at a practical activity (like programming, or
writing stories, or playing piano) you have to do many things for a
long time.
You have to practice it a lot, but that's not enough. You also must
keep pushing forward the limit of your skills, doing things hard for
you.
Reading smart books and learning from the experts in the field is
usually necessary. Quite often it's also useful to read good books not
much related to the activity you are doing, because the human mind
works better this way.
Another thing you have to do is keep your eyes open, for example to
be able to notice when your learning is stuck in some slow corner: now
and then you will have to meta-learn, that is, change the way you learn
and train yourself. This is a hard and often painful thing to do, but
it's probably necessary if you want to become very good, because very
often you learn in the wrong way, or in an inefficient way.
Howard Gardner too has written about this topic.
Bye,
bearophile
Well, it's an opinion, but certainly not one I would agree with!
AFAIAC the IBM 360 got everything right, which is why the instruction set is still
going strong 45 years later (I've used it for every one of those 45 years).
Funny, but I was watching an interview/conversation between an older
composer and a young up-and-coming WFCP (World Famous Concert Pianist)
the other day. The composer had been teaching the pianist to
understand the works he was playing ... anyway, the old guy remarked
that when he was younger he wanted to be a WFCP too, but that he
lacked a crucial ability that the young star had. What was it? "I
lack the ability to make myself practise as well and for as long as
you do."
> In 117815 20090617 221804 Phil Runciman <ph...@aspexconsulting.co.nz> wrote:
>> Because it reminds me of when things went badly wrong: IBM 360, von
>> Neumann architecture, no hardware stacks ...
>>
>> <SNIP>
>>
>> The book has an important message and I recommend that people read it.
>> The book is to me, and possibly only me, an icon representing when
>> things went wrong.
>
> Well, it's an opinion, but certainly not one I would agree with!
> AFAIAC the IBM 360 got everything right, which is why the instruction
> set is still going strong 45 years later (I've used it for every one of
> those 45 years).
Yes, I was afraid someone would use that sort of argument. Sadly, having the best
instruction set does not lead to commercial success. If it did then Interdata would
still be with us. They used IBM 360 instructions.
How many instruction sets have you used? I have used at least 9 (I nearly missed
the DG Nova). The KDF9 had the best set for general computing that I had the privilege
of using, but that is not to say it was the best. The Burroughs B series or PDP-11 may
have been better... and doubtless there are many more candidates.
What I can say is that for scientific/engineering calculations the RPN of the KDF9 was
great, because assembler was no harder than using Algol 60 for the calculation parts of
the problems I worked on.
Oh yes, I even used assembler on the IBM 360 series. It was a 360/50. The experience
did impact on the force of my observations! FWIW I learned it using the training material
for the ICL System 4, which was superior to IBM's. The ICL System 4 was not a success...
despite its instruction set. ;-)

> AFAIAC the IBM 360 got everything right

How many known bugs did the OS end up with? I know it hit 50,000+ and counting. LOL
Suffice to say we are on a journey and Python is part of the scenery.
Phil
> What I can say is that for scientific/engineering calculations the RPN of
> KDF9 was Great because assembler was no harder than using algol60 for the
> calculations part of the problems I worked on.
Unfortunately, we had to learn the hard way that machine instruction sets
must be designed for efficiency of execution, not ease of use by humans.
Stack-based architectures, for all their charm, cannot match register-based
ones in this regard.
> > Vivaldi vs. Mozart
> >
> > And the latter especially had definitely mastered his editor. Just
> > think of the sheer volume of the coding he managed during his short
> > life.
> >
> > Not many bugs either…
>
> I thought Vivaldi did more. The style of music was such that they
> could virtually sketch it out in shorthand, and leave it to the
> copyists to expand to proper notation for the musicians to play. I
> imagine that it was also the job of copyists to fix the typos.
100 years before Frederick W. Taylor was born..?
Vivaldi ran a school for musically-minded young women, I heard, so his
alumni may have pitched in. Mozart on the other hand, pretty much must
have spent his days coding. It has been estimated that the fastest
copyist would need years to manually reproduce the sum total of his
manuscripts.
Mind you, that's only stuff I read years ago, and even though I looked
around a bit, I have no evidence to corroborate.
> In other words, high productivity was a direct consequence of adoption
> of a cookie-cutter style.
It looks like we pretty much agree.
You make it sound like it was Vivaldi who invented Pacbase. :-)
Maybe I'm nitpicking, but the one thing I don't understand is how you
practice programming.
The term makes obvious sense when you're talking about your golf swing,
acquiring competitive driving skills, playing tetris..
But programming..??
CJ
>How many instruction sets have you used? I have used at least 9.
IBM 1401
IBM 1410
IBM 7090/7094
IBM 1620
IBM 360
IBM System/7
IBM 1130
IBM 1800
IBM Series/1
Intel 8080 etc
Motorola 6800 etc
Texas 9900 (my second favourite)
plus a bunch of IBM microprocessor cards (eg Woodstock).
I also recommend Eric S. Roberts' "Thinking Recursively". I don't know
if it can be considered a classic, but a good programmer needs to be
able to understand and do recursion, and I found this book a very
readable introduction.
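To give a flavour of what the book trains you to do (my own toy example,
not taken from Roberts): once you trust the recursive call on the smaller
input, the code almost writes itself:

    def flatten(item):
        # Base case: a non-list is a leaf.
        if not isinstance(item, list):
            return [item]
        result = []
        for sub in item:
            result.extend(flatten(sub))  # trust the recursion
        return result

    assert flatten([1, [2, [3, 4]], 5]) == [1, 2, 3, 4, 5]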
It may also help if you bring a tighter focus to your search. The
domain of programming can be divided up into large subdomains, each
with its own specialized types of problems, techniques and classics.
Here are some subdomains that I can think of off the top of my head:
system programming -- dealing with interacting with the computer at
the bits and bytes level
scientific programming -- dealing with algorithms
business programming -- dealing with data structures and the events
that change them
embedded & real-time programming -- dealing with controlling machines
... and there are probably others, such as writing compilers/
interpreters, and robotics programming.
It is practice in the same way as learning to write well requires
practice. Writing good code is a writing skill, as well as a
precision of thought exercise. The basics of Computer Science are
well-covered in TAOCP Volumes 1-5 (not all yet available in stores :-).
You _must_ know data structures and fundamental algorithms, but after
that what you write is a way of expressing clearly what you learn in a
field. The field may be as narrow as "the field of Ink-Jet Printer
automated testing for models XXX through XYZ of manufacturer Z," but
in some sense the programs should clearly express that knowledge.
If you read books on learning to write clearly, even if they are
oriented to (non-fiction) writing in English, none of them advocate
intensive study of a theory with little practice. You can follow
the advice in those books (with a "loose" interpretation of the
instructions) and improve your code. What the best of them teach you is
to be succinct, clear, and unambiguous, and to try new things regularly.
It is only this variation that can help you get better.
Read what others write about how to write code, but remember you
will have your own style. Take what others write about how to code
as a cook does a recipe: you should understand what is being
attempted, try it the author's way to see what might surprise you,
and carry away only what you find you can incorporate into your
own process. How we pull stuff from our brains is as varied as the
brains themselves. We bring a host of experiences to our writing,
and we should similarly bring that to the programs we write.
--Scott David Daniels
Scott....@Acm.Org
For programming practice I do the problems at http://projecteuler.net/
I'm on the Eulerians page (best performers on the last 25 problems).
There is not a single VB programmer in the top 100.
(Lots of Python programmers, C-family, also Haskell, APL, Lisp,
Algol, Forth, Perl - and I repeat, not a single VB programmer.)
Currently the top place is held by a Python programmer.
These problems can be very demanding - minutes on very fast systems.
Bad algorithms take days, weeks or literally forever.
Interestingly, the factor of 5 between Python and C is apparently
irrelevant compared to a good algorithm.
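To make that concrete, here is a rough sketch of my own (not an Euler
solution): the better algorithm wins by far more than any language's
constant factor:

    def primes_trial(n):
        # Trial division by smaller primes: roughly O(n * sqrt(n)).
        found = []
        for k in range(2, n):
            if all(k % p for p in found if p * p <= k):
                found.append(k)
        return found

    def primes_sieve(n):
        # Sieve of Eratosthenes: O(n log log n).
        flags = [True] * n
        flags[0] = flags[1] = False
        for k in range(2, int(n ** 0.5) + 1):
            if flags[k]:
                for m in range(k * k, n, k):
                    flags[m] = False
        return [k for k, is_p in enumerate(flags) if is_p]

    assert primes_trial(1000) == primes_sieve(1000)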
Groetjes Albert
--
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
The remark of Feynman goes to the heart of science and mathematics.
(Try understanding some number theory or string theory by just
reading about it.)
>
>Whether Feynman did or didn't say that, it's clearly untrue: many people do
>without understanding. Many people can cook, some people are expert cooks,
This is a classic lack of logic skills: Feynman says a -> b, and you
attack b -> a.
<SNIP>
Thanks. I lost that title a while ago, must buy.
Also "Numerical Recipe's in FORTRAN/Pascal/C"
(Have they done Python yet?)
>
>Structure and Interpretation of Computer Programs by Harold Abelson
>and Gerald Jay Sussman (online at mitpress.mit.edu/sicp)
Wait a few months, a third edition is in the works.
> Also "Numerical Recipe's in FORTRAN/Pascal/C"
> (Have they done Python yet?)
They haven't done Python AFAIK. I liked the C version but the
licensing of the software is pretty evil and so I'm a bit turned off
to the series these days. I think the hardcore numerics crowd never
liked the book anyway.
Who got his start *doing* calculations for the Manhattan (atomic bomb)
project, and checking them against real results. Like it or not, they
'did' it in a big way.
He got his Nobel Prize for finding out how to *do* calculations that
matched quantum mechanics experiments (or something like that).
His early hobby was picking locks and cracking safes - mostly as a way
to understand them. It was not enough for him to just read about them.
tjr
My opinion is that the text itself is a pretty good introduction to the workings
of a broad variety of numerical algorithms. For any particular area, there are
probably better books that go into more depth and are closer to the state of the
art, but I don't think there are any books that cover the wide swath numerical
algorithms that NR does. In that regard, I treat it like Wikipedia: a good place
to start, not the best place to stop.
I think the code succeeds reasonably well for teaching the algorithms, but I
don't think they are well-engineered for production use. There are usually
better libraries with better licenses.
--
Robert Kern
"I have come to believe that the whole world is an enigma, a harmless enigma
that is made terrible by our own mad attempt to interpret it as though it had
an underlying truth."
-- Umberto Eco
Some time ago I solved some of them with Python, D and C (some of them
are quite hard for me), and I tried to produce very fast code (like a
D generator for prime numbers that's about 100 times faster than some
'fast' C# prime generators I've seen on that forum).
But then I stopped, because to me they seem a waste of time: they
look too academic and don't exercise the right muscles of the
mind. They may be good if you want to become good at performing
numerical number theory, and bad for everyone else.
Seeing how many people like to do those Project Euler puzzles, I
presume my ideas aren't shared by most people.
I am now using some of my solutions to those problems to spot
"performance bugs" in a new D compiler (and even in ShedSkin), so they
are somewhat useful again :-)
Bye,
bearophile
Knuth.
I.e. "The Art of Computer Programming" by Prof. Knuth
Your library should have a copy (it's a multi-volume opus); if not,
consider donating yours after you read them.
And both these 'grounds' seem to cause more argument and fewer
suggestions for good books.
Let me therefore try to find a middle ground and make a suggestion
that I used to make to my students when I taught them programming:
Read the Python Manual -- specifically the library reference. It
contains a fairly good conspectus of modern-day IT/CS.
Some examples of what I mean:
Want to study TDD? Read unittest and doctest (see the sketch at the
end of this post) and then go on to reading (and practising) Kent Beck etc.
Want to get into Unix system programming? Nothing like playing around
with os.path and stat before burning your hands with C.
Networking protocols? smtplib, urllib, ftplib etc
Low level networking? socket, select etc
Algorithms? Good to get your feet on the ground with timeit
I've found Twisted is a good excuse to study a lot of CS arcana,
ranging from the laziness of lambdas to event-driven programming.
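As the promised example (a minimal doctest sketch of mine, not from any
particular tutorial): the test lives in the docstring and the doctest
module checks it, which is a gentle first step towards TDD:

    def mean(values):
        """Return the arithmetic mean of a non-empty sequence.

        >>> mean([1.0, 2.0, 3.0, 4.0])
        2.5
        """
        return sum(values) / len(values)

    if __name__ == "__main__":
        import doctest
        doctest.testmod()   # prints nothing when all examples pass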