
Ask for book for efficient coding in C


MBALOVER

Feb 20, 2010, 6:02:35 PM
Hi all,

Do you know of a book that discusses what we should and should not do
in C programming? I mean not a book that teaches syntax, etc., but a
book that passes on experience about how to optimize our code.

For example, I read somewhere that we should avoid function calls to
make the program run fast. Calling (and returning from) a function is
time-consuming, and not because the content of the function takes time
to execute — just getting into the function, and back out of it, uses
processor time.

I want to find more advice like this, drawn from experience.

Do you know any book discussing this?

Thank you.
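
A concrete way to check that claim, rather than take it on faith, is
to time it. The sketch below is only an illustration (the function
name and iteration count are invented); it compares the same
arithmetic done through a call and done inline:

#include <stdio.h>
#include <time.h>

/* A deliberately tiny function, the kind the "avoid calls" advice
   worries about. */
static long add_one(long x)
{
    return x + 1;
}

int main(void)
{
    const long iterations = 100000000L;  /* arbitrary, assumed count */
    long sum = 0;
    long i;
    clock_t start, end;

    start = clock();
    for (i = 0; i < iterations; i++)
        sum = add_one(sum);              /* through a function call */
    end = clock();
    printf("call:   %f s (sum=%ld)\n",
           (double)(end - start) / CLOCKS_PER_SEC, sum);

    sum = 0;
    start = clock();
    for (i = 0; i < iterations; i++)
        sum = sum + 1;                   /* the same work, inline */
    end = clock();
    printf("inline: %f s (sum=%ld)\n",
           (double)(end - start) / CLOCKS_PER_SEC, sum);

    return 0;
}

(With optimisation enabled, many compilers will inline add_one anyway,
at which point the two loops compile to essentially the same code; the
replies below make the more general point.)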

Richard Heathfield

Feb 20, 2010, 6:07:52 PM
MBALOVER wrote:
> Hi all,
>
> Do you know what book that discusses what we should and should not do
> in C programming? It is not a book teaching about syntax, etc. but a
> book teaches us the experience to optimize our code.
>
> For example, I read somewhere that we should avoid function calls to
> make the program run fast. Calling (and returning from) a function is
> time-consuming, and not because the content of the function takes time
> to execute — just getting into the function, and back out of it, uses
> processor time.
>
> I want to find some experiences like this.
>
> Do you know any book discussing this?

I do, but I'm not going to tell you what it is - for two reasons.
Firstly, because the book I have in mind discusses lots of other things
as well; it's not dedicated solely to the subject of optimisation. And
the second reason is that you can get most or maybe all of what it has
to say, from the Web:

http://leto.net/docs/C-optimization.php

Not sure if that's the owner's link, but it looks good.

--
Richard Heathfield <http://www.cpax.org.uk>
Email: -http://www. +rjh@
"Usenet is a strange place" - dmr 29 July 1999
Sig line vacant - apply within

MBALOVER

Feb 20, 2010, 6:47:29 PM
Thanks Richard,

I am reading the web site and it is really helpful.

Thanks a lot.


On Feb 20, 6:07 pm, Richard Heathfield <r...@see.sig.invalid> wrote:
> MBALOVER wrote:
> > Hi all,
>
> > Do you know what book that discusses what we should and should not do
> > in C programming? It is not a book teaching about syntax, etc. but a
> > book teaches us the experience to optimize our code.
>
> > For example, I read somewhere that we should avoid function calls to
> > make the program run fast.  Calling (and returning from) a function is
> > time-consuming, and not because the content of the function takes time
> > to execute — just getting into the function, and back out of it, uses
> > processor time.

Hamiral

Feb 20, 2010, 11:53:51 PM
Richard Heathfield wrote:
> http://leto.net/docs/C-optimization.php
>
> Not sure if that's the owner's link, but it looks good.
>

To answer the OP:
"Part of the clarity is making hunks of code into functions when
appropriate. The cost of a function call is extremely small on modern
machines, so optimization is NOT a valid excuse for writing ten-page
functions."
This is an extract from the website you gave, and every programmer (in
any language) should stick to that.

Ham

Anand Hariharan

Feb 21, 2010, 12:00:36 AM
On Feb 20, 5:47 pm, MBALOVER <mbalov...@gmail.com> wrote:
> Thanks Richard,
>
> I am reading the web site and it is really helpful.
>
> Thanks a lot.
>
> On Feb 20, 6:07 pm, Richard Heathfield <r...@see.sig.invalid> wrote:
>
>
>
> > MBALOVER wrote:
> >

Please read up on Usenet etiquette; in particular, please do not top-
post or quote signatures.


> > > For example, I read somewhere that we should avoid function calls to
> > > make the program run fast.  Calling (and returning from) a function is
> > > time-consuming, and not because the content of the function takes time
> > > to execute — just getting into the function, and back out of it, uses
> > > processor time.
>


Depending upon what "avoid function calls" means, that is either very
bad advice (it results in spaghetti code) or, if it is taken to mean
inlining (implementations provide it as an optimisation option), advice
that may or may not lead to faster programs:

http://www.parashift.com/c++-faq-lite/inline-functions.html#faq-9.3

(Note that the link is about a C++ keyword, but the specific answer is
meaningful for C as well.)

- Anand
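
To make the inlining option concrete: since C99 the language has an
inline keyword, and most implementations will also inline small static
functions on their own at higher optimisation levels, so you keep the
clarity of a function without paying for a call. A minimal sketch (the
struct and accessor here are made up purely for illustration):

#include <stddef.h>

struct point { double x, y; };

/* A tiny accessor: a natural inlining candidate.  Declared
   "static inline" (C99), the compiler is free to expand each call in
   place, so the readability of a function need not cost anything at
   run time. */
static inline double point_x(const struct point *p)
{
    return p->x;
}

double sum_x(const struct point *pts, size_t n)
{
    double total = 0.0;
    size_t i;
    for (i = 0; i < n; i++)
        total += point_x(&pts[i]);  /* usually compiled as pts[i].x */
    return total;
}

Whether this actually speeds anything up is exactly what the FAQ entry
above discusses: inlining removes call overhead but can also enlarge
the code and hurt the instruction cache, so it may cut either way.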

Malcolm McLean

Feb 21, 2010, 3:09:19 AM
On Feb 21, 6:53 am, Hamiral <hami...@hamham.com> wrote:
>
> "Part of the clarity is making hunks of code into functions when
> appropriate. The cost of a function call is extremely small on modern
> machines, so optimization is NOT a valid excuse for writing ten-page
> functions."
>
Normally you're right. However there might be situations where the
program doesn't have a few tight inner loops (which are usually the
only things worth optimising). In this case, the function call
overhead may be significant.

santosh

Feb 21, 2010, 3:30:00 AM
Malcolm McLean <malcolm...@btinternet.com> writes:

Did you mean to say that function call overhead may be significant in
the case of programs having inner loops?

One side benefit of keeping functions relatively small is that it
becomes easier for the compiler to inline calls to such functions,
rather than massive monolithic ones. However, as in everything else,
you can take this strategy too far.


Malcolm McLean

Feb 21, 2010, 7:49:25 AM
On Feb 21, 10:30 am, santosh <santosh....@gmail.com> wrote:
>
> Did you mean to say that function call overhead may be significant in
> the case of programs having inner loops?
>
Every program that takes a long time to execute will perform a large
number of operations, which means it must either have one big loop, so
that most of the code is executed many, many times, or several inner
loops, such that the inner loops are executed many, many times but the
rest of the code only a few times.
The second situation is by far the most common, but the first is
possible. In the first case, you may find that breaking the program
down into functions has an appreciable effect on execution time. In the
second case, only adding or removing function calls from the inner
loops will have any appreciable effect.
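
To make that concrete with a hypothetical example: the function calls
worth worrying about are the ones sitting inside a hot loop, and the
usual fix is not to avoid writing functions but to hoist an invariant
call out of the loop. The classic strlen() case:

#include <string.h>

/* Before: strlen(s) is re-evaluated in the loop condition on every
   iteration, so the call (and the scan it performs) is paid n times. */
static size_t count_spaces_slow(const char *s)
{
    size_t count = 0;
    size_t i;
    for (i = 0; i < strlen(s); i++)
        if (s[i] == ' ')
            count++;
    return count;
}

/* After: the invariant call is hoisted out and made once. */
static size_t count_spaces_fast(const char *s)
{
    size_t count = 0;
    size_t len = strlen(s);
    size_t i;
    for (i = 0; i < len; i++)
        if (s[i] == ' ')
            count++;
    return count;
}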

Eric Sosman

Feb 21, 2010, 9:39:44 AM
On 2/20/2010 6:02 PM, MBALOVER wrote:
> Hi all,
>
> Do you know what book that discusses what we should and should not do
> in C programming? It is not a book teaching about syntax, etc. but a
> book teaches us the experience to optimize our code.
>
> For example, I read somewhere that we should avoid function calls to
> make the program run fast. Calling (and returning from) a function is
> time-consuming, and not because the content of the function takes time
> to execute — just getting into the function, and back out of it, uses
> processor time.
>
> I want to find some experiences like this.
>
> Do you know any book discussing this?

If you find such a book, burn it.

Don't burn the author, not yet, because it's just barely
possible he might be saying something useful. However, your
understanding of C -- of programming in general, I'd guess --
is not yet advanced enough to let you distinguish folly from
useful advice.

For example, the advice that you "read somewhere" is folly,
at least in the form you've repeated it. This means either that
the writer was a fool, or (more likely) that you've omitted the
important context that might -- might! -- make the advice not
be foolhardy. Why would you leave such things out? Because,
probably, you're not yet equipped to recognize their importance.
Advice is situational; advice that is good in one situation may
be terrible in another, and you cannot use the advice wisely
until you've learned to recognize the situations.

Here's my bit of advice: First, the most important thing
about a program is that it does what it is supposed to -- if
it fails to do that, or does it wrong, the speed seldom matters.
So, concentrate first on getting your code to work, because a
fast failure is of no use to anybody.

Second, once the code is working you may -- may! -- find
that it's too slow for your purposes. If so, you must *measure*
its behavior to discover where the time is going. You must also
realize that a lot of what you measure will be specific to the
particular compiler and machine and so on that you're using, and
may not be transferable to other compilers, other machines. Make
changes as your *measurements* suggest, and *measure* the speedup;
if a change doesn't produce a speedup, rescind it. Once the
program is "fast enough," stop!

I'll leave you with Michael Jackson's oft-quoted advice:

FIRST LAW OF PROGRAM OPTIMIZATION: Don't do it.

SECOND LAW OF PROGRAM OPTIMIZATION (for experts only):
Don't do it yet.

He's smarter than both of us put together; heed his message.

--
Eric Sosman
eso...@ieee-dot-org.invalid

Richard

Feb 21, 2010, 10:25:56 AM
Eric Sosman <eso...@ieee-dot-org.invalid> writes:

> On 2/20/2010 6:02 PM, MBALOVER wrote:
>> Hi all,
>>
>> Do you know what book that discusses what we should and should not do
>> in C programming? It is not a book teaching about syntax, etc. but a
>> book teaches us the experience to optimize our code.
>>
>> For example, I read somewhere that we should avoid function calls to
>> make the program run fast. Calling (and returning from) a function is
>> time-consuming, and not because the content of the function takes time
>> to execute — just getting into the function, and back out of it, uses
>> processor time.
>>
>> I want to find some experiences like this.
>>
>> Do you know any book discussing this?
>
> If you find such a book, burn it.
>
> Don't burn the author, not yet, because it's just barely
> possible he might be saying something useful. However, your
> understanding of C -- of programming in general, I'd guess --
> is not yet advanced enough to let you distinguish folly from
> useful advice.
>


Wow, total and utter elitist nonsense.

Any such book can only add to knowledge and give people food for
thought.

The link Heathfield gave was full of good advice.

People often pick C because of the "bare metal" proximity it gives, and
being aware of certain ways of keeping one's code lean and mean can
only be a benefit.

Richard Heathfield

Feb 21, 2010, 10:53:43 AM
Eric Sosman wrote:
> On 2/20/2010 6:02 PM, MBALOVER wrote:
>> Hi all,
>>
>> Do you know what book that discusses what we should and should not do
>> in C programming? It is not a book teaching about syntax, etc. but a
>> book teaches us the experience to optimize our code.
>>
>> For example, I read somewhere that we should avoid function calls to
>> make the program run fast. Calling (and returning from) a function is
>> time-consuming, and not because the content of the function takes time
>> to execute — just getting into the function, and back out of it, uses
>> processor time.
>>
>> I want to find some experiences like this.
>>
>> Do you know any book discussing this?
>
> If you find such a book, burn it.

I cannot imagine that you are advocating burning books on optimisation.
Presumably you only intend to advocate burning /bad/ books on optimisation?

The Web article I cited upthread was written by Mikey Lee, once a
regular contributor to this newsgroup and a bit of an expert on
optimisation (and particularly on when not to do it).

Eric Sosman

Feb 21, 2010, 11:08:52 AM
On 2/21/2010 10:53 AM, Richard Heathfield wrote:
> Eric Sosman wrote:
>> On 2/20/2010 6:02 PM, MBALOVER wrote:
>>> Hi all,
>>>
>>> Do you know what book that discusses what we should and should not do
>>> in C programming? It is not a book teaching about syntax, etc. but a
>>> book teaches us the experience to optimize our code.
>>>
>>> For example, I read somewhere that we should avoid function calls to
>>> make the program run fast. Calling (and returning from) a function is
>>> time-consuming, and not because the content of the function takes time
>>> to execute — just getting into the function, and back out of it, uses
>>> processor time.
>>>
>>> I want to find some experiences like this.
>>>
>>> Do you know any book discussing this?
>>
>> If you find such a book, burn it.
>
> I cannot imagine that you are advocating burning books on optimisation.
> Presumably you only intend to advocate burning /bad/ books on optimisation?

Not even that: My advice was directed specifically to the O.P.,
a person to whom Jackson's Second Law does not yet apply. At his
present stage of development, optimization books good or bad can
only do him harm.

--
Eric Sosman
eso...@ieee-dot-org.invalid

Richard Harter

Feb 21, 2010, 12:11:38 PM
On Sun, 21 Feb 2010 05:53:51 +0100, Hamiral <ham...@hamham.com>
wrote:


As an addendum, my view is that one should create small functions
as much as possible when creating the initial version of code.
In part this is an intellectual divide and conquer strategy.
Using small functions reduces design and coding work.

Once the initial version is created, compiled, and working, then
the code can be refined. Some functions can be consolidated, and
others eliminated. When everything is in place, it is easier to
recognize infelicities.

Having many small functions works well with profilers. A
profiler can tell you which functions are consuming the most
time, which means you can cherry pick the code to be optimized.
Sometimes the right thing to do is to tune the code in a
function. Sometimes the thing to do is to reduce the number of
calls to the function. The point is that it is easier to
optimize code if you have more information.
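
For anyone who has not tried a profiler: with gcc and gprof the usual
workflow is to compile with -pg, run the program once, and then read
the report generated from the gmon.out file. A minimal sketch (the
program and its function names are invented for illustration):

/* Typical gcc/gprof session:
 *
 *   gcc -pg -O2 -o demo demo.c
 *   ./demo
 *   gprof demo gmon.out
 *
 * The flat profile then lists which functions used the most time and
 * how often each was called. */
#include <stdio.h>

static double heavy_work(int n)   /* dominates the run time */
{
    double s = 0.0;
    int i, j;
    for (i = 0; i < n; i++)
        for (j = 0; j < n; j++)
            s += (double)i * j;
    return s;
}

static double light_work(int n)   /* cheap by comparison */
{
    double s = 0.0;
    int i;
    for (i = 0; i < n; i++)
        s += i;
    return s;
}

int main(void)
{
    printf("%f\n", heavy_work(5000) + light_work(5000));
    return 0;
}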

This message brought to you by The National Platitude
Distribution Service.


Richard Harter, c...@tiac.net
http://home.tiac.net/~cri, http://www.varinoma.com
Infinity is one of those things that keep philosophers busy when they
could be more profitably spending their time weeding their garden.

Branimir Maksimovic

Feb 21, 2010, 4:40:10 PM
http://www.intel.com/Assets/PDF/manual/248966.pdf
http://www.amd.com/us-en/assets/content_type/white_papers_and_tech_docs/25112.PDF
http://focus.ti.com/lit/an/spraac3/spraac3.pdf
http://www.ibm.com/developerworks/systems/library/es-plib1app.html

You just have to know that low-level optimizations are not universally
effective. When writing portable software you can disregard
hardware-specific optimizations and concentrate on algorithm-specific
optimization.

Greets

Richard Bos

Feb 24, 2010, 12:16:20 PM
MBALOVER <mbal...@gmail.com> wrote:

> I am reading the web site and it is really helpful.

Be aware that at your current level of proficiency (i.e., given that you
have to ask these questions in the first place!), the most important
section by far for you is the one titled "GOTCHAS".

Richard

Andrew Poelstra

Feb 24, 2010, 3:28:48 PM

Glancing through the text in question, it's strewn with
warnings and cautions and "profile first!" and "fix bugs
first!".

So even if the OP drops 50% of the sentences he reads
into the bit bucket, statistically he'll be okay.


Nick Keighley

Feb 25, 2010, 5:19:12 AM
On 21 Feb, 12:49, Malcolm McLean <malcolm.mcle...@btinternet.com>
wrote:

> On Feb 21, 10:30 am, santosh <santosh....@gmail.com> wrote:

> > Did you mean to say that function call overhead may be significant in
> > the case of programs having inner loops?
>
> Every program that takes a long time to execute will perform a large
> number of operations, which means it must either have one big loop
> which means most of the code is executed many many times, or several
> inner loops, such that the inner loops are executed many many times,
> but the rest of the code only a few times.

or it's got some slow I/O, or it's swapping, or it's doing some
unnecessary stuff

(I know "unnecessary stuff" may be covered by your "inner loops" but I
think the emphasis is important)

> The second situation is by far the most common, but the first is
> possible. If the first, you may find that breaking the program down
> into functions has an appreciable effect on execution time.

to increase it?

> In the
> second case, only adding or removing function calls from the inner
> loops will have any appreciable effect.

or removing the loops.

Micro-optimisations like moving code in and out of functions will
normally not have much effect. First fix your algorithms, including
the removal of unnecessary stuff. And use profiling to find those
inner loops.


Nick Keighley

Feb 25, 2010, 5:26:18 AM
On 21 Feb, 17:11, c...@tiac.net (Richard Harter) wrote:
> On Sun, 21 Feb 2010 05:53:51 +0100, Hamiral <hami...@hamham.com>
> wrote:
> >Richard Heathfield wrote:


> As an addendum, my view is that one should create small functions
> as much as possible when creating the initial version of code.
> In part this is an intellectual divide and conquer strategy.
> Using small functions reduces design and coding work.

To add to the platitudes: the saving in intellectual cost is also
important. If the program is easy to understand, it will be easier to
optimise.

I've seen programs suffer from a sort of Inflation (I'm thinking of the
cosmological type). The program becomes so large and complex that no
one can understand it. The same things are done in different ways.
Copy-Paste is the anti-pattern du jour. A vicious cycle sets in: the
program becomes larger and harder to understand because it is large and
hard to understand. The change rate is high because the bug rate is
high. The bug rate increases because the change rate is high.

Nick Keighley

Feb 25, 2010, 5:30:49 AM
On 21 Feb, 15:53, Richard Heathfield <r...@see.sig.invalid> wrote:
> Eric Sosman wrote:
> > On 2/20/2010 6:02 PM, MBALOVER wrote:
> >> Hi all,
>
> >> Do you know what book that discusses what we should and should not do
> >> in C programming? It is not a book teaching about syntax, etc. but a
> >> book teaches us the experience to optimize our code.
>
> >> For example, I read somewhere that we should avoid function calls to
> >> make the program run fast.  

I'm sorry, as stated that's terrible advice. I encountered a
programming standard that encouraged all functions to use only one
parameter ("then the compiler will use a register rather than the
stack"). This led to packs of feral Global Variables roaming the
address space.


> >> Calling (and returning from) a function is
> >> time-consuming, and not because the content of the function takes time

> >> to execute — just getting into the function, and back out of it, uses

christian.bau

Feb 25, 2010, 6:01:39 PM
On Feb 21, 4:08 pm, Eric Sosman <esos...@ieee-dot-org.invalid> wrote:

>      Not even that: My advice was directed specifically to the O.P.,
> a person to whom Jackson's Second Law does not yet apply.  At his
> present stage of development, optimization books good or bad can
> only do him harm.

There's a very simple rule that applies to the most experienced and
the most novice programmer: Don't optimise it unless you have measured
it. Measure it first. Decide whether investing time to optimise it
could possibly improve it by an amount that is worth the work. If it
is worth it, optimise it _and measure it again_. If it didn't improve,
throw the optimisations away.

The important thing: Measure it. If you are not willing to spend the
time to measure the speed of your code, that on its own proves that
you shouldn't waste your time optimising it.

Nick

Feb 26, 2010, 2:58:33 AM
"christian.bau" <christ...@cbau.wanadoo.co.uk> writes:

Among other things, and not as often spoken of, this will let you be
absolutely sure all the "scaffolding" of your program works - option
reading and parsing, I/O, data preparation etc. Once you have a
"simple" program producing the right output, and it's too slow, /then/
you can start optimising it, and if the difficult code fails to work
properly, you have a reference to compare it to.
--
Online waterways route planner | http://canalplan.eu
Plan trips, see photos, check facilities | http://canalplan.org.uk

Malcolm McLean

Feb 26, 2010, 4:25:51 AM
On Feb 26, 1:01 am, "christian.bau" <christian....@cbau.wanadoo.co.uk>
wrote:

>
> There's a very simple rule that applies to the most experienced and
> the most novice programmer: Don't optimise it unless you have measured
> it. Measure it first.
>
That's right for "this program only" functions. If a function is
reusable, you have to consider whether it is worth investing the time
to make a Porsche, or if a Trabby is good enough.

Ian Collins

Feb 26, 2010, 4:29:14 AM

Reuse doesn't matter. Whether or not a function is reusable, start
simple and optimise only if it is shown to be a performance bottleneck.

--
Ian Collins

io_x

Feb 26, 2010, 12:13:34 PM
"Nick Keighley" <nick_keigh...@hotmail.com> ha scritto nel messaggio
news:64f08995-132e-4254...@l19g2000yqb.googlegroups.com...

> On 21 Feb, 12:49, Malcolm McLean <malcolm.mcle...@btinternet.com>
> wrote:
> to increase it?
>
> > In the
> > second case, only adding or removing function calls from the inner
> > loops will have any appreciable effect.
>
> or removing the loops.
>
> Micro-optisations like moving code in and out of functions will
> normally not have much effect. First fix your algorithms, including
> the removal of unnecessary stuff. And use profiling to find those
> inner loops.

#micro-optimisations could have one effect: they make you think

Dann Corbit

Feb 26, 2010, 4:07:04 PM
In article <6a50a5e9-62b1-4277-9e72-22ecde876730
@q16g2000yqq.googlegroups.com>, christ...@cbau.wanadoo.co.uk says...
>
> > On Feb 21, 4:08 pm, Eric Sosman <esos...@ieee-dot-org.invalid> wrote:
>
> >      Not even that: My advice was directed specifically to the O.P.,
> > a person to whom Jackson's Second Law does not yet apply.  At his
> > present stage of development, optimization books good or bad can
> > only do him harm.
>
> There's a very simple rule that applies to the most experienced and
> the most novice programmer: Don't optimise it unless you have measured
> it. Measure it first. Decide whether investing time to optimise it
> could possibly improve it buy an amount that is worth the work. If it
> is worth it, optimise it _and measure it again_. If it didn't improve,
> throw the optimisations away.
>
> The important thing: Measure it. If you are not willing to spend the
> time to measure the speed of your code, that on its own proves that
> you shouldn't waste your time optimising it.

I agree with this wholeheartedly.

There is one thing though, and it has to do with initial creation of the
code.

If you know that a particular function is asymptotically superior and
you know that it is possible for the data set to grow, choosing the more
efficient algorithm is usually the safer choice. Asymptotically
superior algorithms can definitely be inferior with smaller data sets,
but they won't go into the crapper when the data blows up into a giant
set that you were not expecting.

So (IMO) the initial design phase is a good place for some thought in
this regard.

After the code is written, never try to optimize without measuring
first.
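
A small, hypothetical illustration of that design-time choice: a
lookup over a sorted array with the standard library's bsearch() is
O(log n), while the naive scan is O(n). On ten elements nobody can
measure the difference; on ten million, the linear scan is the one
that goes into the crapper.

#include <stdlib.h>

/* O(n): fine for small arrays, painful when the data set grows. */
static const int *find_linear(const int *a, size_t n, int key)
{
    size_t i;
    for (i = 0; i < n; i++)
        if (a[i] == key)
            return &a[i];
    return NULL;
}

static int cmp_int(const void *pa, const void *pb)
{
    int a = *(const int *)pa, b = *(const int *)pb;
    return (a > b) - (a < b);
}

/* O(log n): requires a sorted array, but scales gracefully. */
static const int *find_binary(const int *a, size_t n, int key)
{
    return bsearch(&key, a, n, sizeof *a, cmp_int);
}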

websnarf

Feb 26, 2010, 7:04:27 PM
On Feb 26, 1:07 pm, Dann Corbit <dcor...@connx.com> wrote:
> In article <6a50a5e9-62b1-4277-9e72-22ecde876730
> @q16g2000yqq.googlegroups.com>, christian....@cbau.wanadoo.co.uk says...

> > The important thing: Measure it. If you are not willing to spend the
> > time to measure the speed of your code, that on its own proves that
> > you shouldn't waste your time optimising it.
>
> I agree with this wholeheartedly.
>
> There is one thing though, and it has to do with initial creation of the
> code.
>
> If you know that a particular function is asymptotically superior and
> you know that it is possible for the data set to grow, choosing the more
> efficient algorithm is usually the safer choice.  Asymptotically
> superior algorithms can definitely be inferior with smaller data sets,
> but they won't go into the crapper when the data blows up into a giant
> set that you were not expecting.
>
> So (IMO) the initial design phase is a good place for some thought in
> this regard.
>
> After the code is written, never try to optimize without measuring
> first.

I have two main comments on this:

1) You can *measure* performance before you even write code. That
speaks to your algorithmic complexity point, but can equally apply to
novel code you are writing for the first time. For example, when
deciding which data structure to use to hold the dictionary of the
English language, it might actually pay to do some back-of-the-envelope
calculations to see whether or not your structure fits into the on-chip
L2 cache of your CPU (a rough sketch of that arithmetic follows after
point 2).

2) If you are writing a *library* (that you expect to be reused) then
you don't get to say what the final usage of the code is. In other
words whether or not your code is optimal will depend on some other
code written by someone else at some later time. Usually people need
the library to exist before they can write code that uses it and
someone runs a benchmark against it. As such it makes sense to perform
up-front optimization that is appropriate, with *all expected usage*.
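
A rough version of the back-of-the-envelope arithmetic mentioned in
point 1 (every figure below is an assumption picked purely for the
sake of the estimate, not a measurement):

#include <stdio.h>

int main(void)
{
    /* Assumed figures for the estimate only. */
    const double entries        = 100000.0;          /* dictionary words */
    const double bytes_per_word = 9.0;               /* ~8 chars + NUL   */
    const double overhead       = 16.0;              /* pointer + length */
    const double l2_bytes       = 2.0 * 1024 * 1024; /* assumed 2 MB L2  */

    double total = entries * (bytes_per_word + overhead);
    printf("estimated footprint: %.2f MB\n", total / (1024.0 * 1024.0));
    printf("fits in assumed L2:  %s\n", total <= l2_bytes ? "yes" : "no");
    return 0;
}

With these made-up numbers the table does not quite fit, which is
exactly the kind of answer that tells you, at design time, whether a
more compact representation is worth pursuing.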

Portability, re-entrancy, maintainability and performance optimization
are all aspects of the quality of the code you produce. This
obsession with performance optimization being some kind of demon that
will cause your code to rot is just silly. Because I have spent so
much of my career optimizing code, I just do it naturally as I am
writing and designing it, so it all just comes for free anyways. The
way I make sure the quality of the code is not compromised is that I
also simultaneously write code for other quality factors as well.
That, to me, is just what it means to be a modern professional
computer programmer.

You can take this sort of hands-off "oh I never optimize until I am
totally satisfied with the code I've written that I will need to
rewrite to optimize anyways" approach if you like. But it's worth
asking yourself whether or not that attitude is more dogmatic than
useful.

To the OP: the post by Branimir Maksimovic contains 4 links any one of
which is far more useful than the rest of this discussion combined.

--
Paul Hsieh
http://www.pobox.com/~qed/
http://bstring.sf.net/

Tim Rentsch

Mar 22, 2010, 10:48:34 AM
websnarf <webs...@gmail.com> writes:

> On Feb 26, 1:07 pm, Dann Corbit <dcor...@connx.com> wrote:
>> In article <6a50a5e9-62b1-4277-9e72-22ecde876730
>> @q16g2000yqq.googlegroups.com>, christian....@cbau.wanadoo.co.uk says...
>> > The important thing: Measure it. If you are not willing to spend the
>> > time to measure the speed of your code, that on its own proves that
>> > you shouldn't waste your time optimising it.
>>
>> I agree with this wholeheartedly.
>>
>> There is one thing though, and it has to do with initial creation of the
>> code.
>>
>> If you know that a particular function is asymptotically superior and
>> you know that it is possible for the data set to grow, choosing the more
>> efficient algorithm is usually the safer choice. Asymptotically
>> superior algorithms can definitely be inferior with smaller data sets,
>> but they won't go into the crapper when the data blows up into a giant
>> set that you were not expecting.
>>
>> So (IMO) the initial design phase is a good place for some thought in
>> this regard.
>>
>> After the code is written, never try to optimize without measuring
>> first.
>
> I have two main comments on this:
>

> 1) You can *measure* performance before you even write code. [snip]

It's not possible to measure the performance of anything if the thing
to be measured doesn't exist. You can analyze what you expect the
performance might be, or you can measure the performance of something
else, but you can't measure the performance (or anything else) of a
piece of code until the code has been written.

Dann Corbit

Mar 22, 2010, 6:49:22 PM
In article <kfnzl20...@x-alumni2.alumni.caltech.edu>, txr@x-
alumni2.alumni.caltech.edu says...

O(f(n)) is a measure.

Phil Carmody

Mar 22, 2010, 7:27:44 PM

Without a scale.

Phil
--
I find the easiest thing to do is to k/f myself and just troll away
-- David Melville on r.a.s.f1

Tim Rentsch

Mar 26, 2010, 8:08:08 AM
Dann Corbit <dco...@connx.com> writes:

A bound on the worst-case order of algorithm run-time is a
measure, at least in one sense of the word measure. However,
this result isn't really _measured_, it comes out of analyzing
the algorithm mathematically. And in any case we can't
determine that an algorithm is O(f(n)) before the algorithm
exists.

Dann Corbit

Mar 26, 2010, 3:00:40 PM
In article <kfntys3...@x-alumni2.alumni.caltech.edu>, txr@x-
alumni2.alumni.caltech.edu says...

True, but we can apply algorithms of known complexity so that we know
how the program will behave when the problem scales.

It also matters how well I understand the input data domain.

When I refer to measuring performance before I write code, I am
describing the selection of known algorithms using worst case and
average case complexity to put a crude bound on performance so that when
the problem scales larger the tool set will not go into the crapper.

Another way of saying the same thing is that we write better code when
we start with a good design. One common reason that we end up profiling
code is that a poor choice was made in the design phase.

Branimir Maksimovic

Mar 26, 2010, 3:19:19 PM
On Mon, 22 Mar 2010 07:48:34 -0700
Tim Rentsch <t...@x-alumni2.alumni.caltech.edu> wrote:

>
> It's not possible to measure the performance of anything if the thing
> to be measured doesn't exist. You can analyze what you expect the
> performance might be, or you can measure the performance of something
> else, but you can't measure the performance (or anything else) of a
> piece of code until the code has been written.

Well, you can't measure performance till you know what hardware it
will run on. For example, a fast JPEG algorithm can be an order of
magnitude slower than a nominally slower algorithm with more compact
code, say when you have only 64 KB of fast memory and 32 MB of
external memory.
Measuring performance is meaningless if it is not applied to
particular hardware.
Algorithms that operate on data that fits in the CPU cache are not the
concern here.

Greets

--
http://maxa.homedns.org/

Sometimes online sometimes not


Ben Pfaff

Mar 26, 2010, 3:41:17 PM
Branimir Maksimovic <bm...@hotmail.com> writes:

> Well, you can;t measure performance till you know on what hardware
> it will run. For example fast jpeg algorithm can be order of magnitude
> slower than slower algorithm but more compact code.
> For example if you have 64kb of fast memory and 32 mb of external
> memory .
> Measuring performance is meaningless if not applied to particular
> hardware.

In the case of cache size and design this is not necessarily
true. Consider cache-oblivious algorithms, for example (see
http://en.wikipedia.org/wiki/Cache-oblivious_algorithm for an
introduction).
--
Ben Pfaff
http://benpfaff.org
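
For readers who don't want to chase the link: the idea is to recurse
until the working set is small, so the code adapts to whatever cache
sizes exist without ever being told them. A minimal sketch of the
classic example, a recursive matrix transpose (the 16x16 base case is
an arbitrary cut-off, not a tuned one):

#include <stddef.h>

/* Transpose the block rows [row0,row1) x cols [col0,col1) of an
   n-by-n matrix src into dst, splitting the larger dimension in half
   until the block is small.  The recursion keeps shrinking the
   working set, so at some depth it fits in each level of cache the
   machine happens to have. */
static void transpose_rec(const double *src, double *dst,
                          size_t row0, size_t row1,
                          size_t col0, size_t col1, size_t n)
{
    size_t rows = row1 - row0, cols = col1 - col0;

    if (rows <= 16 && cols <= 16) {
        size_t r, c;
        for (r = row0; r < row1; r++)
            for (c = col0; c < col1; c++)
                dst[c * n + r] = src[r * n + c];
    } else if (rows >= cols) {
        size_t mid = row0 + rows / 2;
        transpose_rec(src, dst, row0, mid, col0, col1, n);
        transpose_rec(src, dst, mid, row1, col0, col1, n);
    } else {
        size_t mid = col0 + cols / 2;
        transpose_rec(src, dst, row0, row1, col0, mid, n);
        transpose_rec(src, dst, row0, row1, mid, col1, n);
    }
}

/* dst becomes the transpose of the n-by-n matrix src. */
void transpose(const double *src, double *dst, size_t n)
{
    transpose_rec(src, dst, 0, n, 0, n, n);
}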

Branimir Maksimovic

Mar 26, 2010, 3:48:55 PM

Thanks. That's exactly what I need.


Greets!

Tim Rentsch

Mar 26, 2010, 7:03:17 PM
Dann Corbit <dco...@connx.com> writes:

I understand your point, and I agree that the end result is
almost always better if we do good design up front. But I think
it's dishonest to call that sort of up-front design effort
"measurement", and certainly it is _not_ measuring performance.

Tim Rentsch

Mar 26, 2010, 7:09:41 PM
Branimir Maksimovic <bm...@hotmail.com> writes:

> On Mon, 22 Mar 2010 07:48:34 -0700
> Tim Rentsch <t...@x-alumni2.alumni.caltech.edu> wrote:
>
>>
>> It's not possible to measure the performance of anything if the thing
>> to be measured doesn't exist. You can analyze what you expect the
>> performance might be, or you can measure the performance of something
>> else, but you can't measure the performance (or anything else) of a
>> piece of code until the code has been written.
>
> Well, you can;t measure performance till you know on what hardware
> it will run. [snip example]

Perhaps not run-time performance, but there are other
performance metrics, and some of these certainly can
be measured. How many function calls are executed,
as one example.

> Measuring performance is meaningless if not applied to particular
> hardware. [snip elaboration]

Perhaps not as meaningful as we would like, but not completely
meaningless either.
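
Counting executed function calls, as mentioned above, is a metric that
really can be gathered without knowing the target hardware, either
from a profiler's call counts or with a plain counter. A minimal,
hypothetical sketch of the latter:

#include <stdio.h>

static unsigned long call_count = 0;   /* bumped on every call */

static long triple(long x)
{
    call_count++;
    return 3 * x;
}

int main(void)
{
    long sum = 0;
    long i;
    for (i = 0; i < 1000; i++)
        sum += triple(i);
    printf("sum = %ld, triple() was called %lu times\n", sum, call_count);
    return 0;
}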
