Sometimes optimization *does* matter

Juha Nieminen

Dec 19, 2022, 4:33:48 AM
You might sometimes hear someone express a sentiment along the lines of
"it's not worth trying to optimize this. What does it matter if it
takes 10 seconds unoptimized while an optimized version could take
5 seconds? Who cares? I can wait for the extra seconds. It's not such
a big deal. A quick&dirty implementation is just fine."

But what about an unoptimized quick&dirty implementation taking
*one month* for a task that can be computed in *less than a second*?

Recently Matt Parker was wondering about groups of words with no
shared letters, so he wrote a Python program that found all groups of
five five-letter words from a dictionary such that no two words share
a letter. His program took a month to run.

https://www.youtube.com/watch?v=c33AZBnRHks

When he implemented that program, it apparently didn't even occur to
him that something was horribly wrong with that runtime. Any
experienced programmer would have at least a gut instinct that such a
task should run in less than a minute at the very most, probably in a
few seconds, perhaps even in less than a second.

It didn't take long for actual programmers to submit their own
versions of the program, with the record holder, written in C++,
taking less than a tenth of a second.
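
Out of curiosity, here is a minimal sketch of the letter-bitmask idea
that those fast versions build on. To be clear, this is my own
illustration rather than Parker's code or the record-holding program:
the file name "words.txt" is a placeholder, and the really fast
solutions add anagram deduplication and smarter search ordering on
top of this.

#include <cstdint>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// Encode a five-letter word as a 26-bit mask, one bit per letter.
// Returns 0 (and the word is skipped) if the word has a repeated
// letter or is not five lowercase letters.
static std::uint32_t mask_of(const std::string& w)
{
    if (w.size() != 5) return 0;
    std::uint32_t m = 0;
    for (char c : w) {
        if (c < 'a' || c > 'z') return 0;
        std::uint32_t bit = 1u << (c - 'a');
        if (m & bit) return 0;              // repeated letter
        m |= bit;
    }
    return m;
}

// Depth-first search for groups of five pairwise disjoint words.
// 'used' is the union of the masks picked so far; a single AND per
// candidate prunes entire branches, which is where the speed comes from.
static void search(const std::vector<std::string>& words,
                   const std::vector<std::uint32_t>& masks,
                   std::size_t start, std::uint32_t used, int depth,
                   std::vector<std::size_t>& picked, long& count)
{
    if (depth == 5) {
        ++count;
        for (std::size_t i : picked) std::cout << words[i] << ' ';
        std::cout << '\n';
        return;
    }
    for (std::size_t i = start; i < masks.size(); ++i) {
        if (masks[i] & used) continue;      // shares a letter: prune
        picked.push_back(i);
        search(words, masks, i + 1, used | masks[i], depth + 1,
               picked, count);
        picked.pop_back();
    }
}

int main()
{
    std::ifstream in("words.txt");          // placeholder dictionary
    std::vector<std::string> words;
    std::vector<std::uint32_t> masks;
    std::string w;
    while (in >> w)
        if (std::uint32_t m = mask_of(w)) {
            words.push_back(w);
            masks.push_back(m);
        }
    std::vector<std::size_t> picked;
    long count = 0;
    search(words, masks, 0, 0, 0, picked, count);
    std::cout << count << " groups\n";
}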

Sometimes optimization *does* matter. Sometimes expertise in programming
and algorithms does make quite a difference. It can be the difference
between the program running for a month vs. running less than a second.

Marcel Mueller

Dec 19, 2022, 4:11:41 PM
On 12/19/2022 10:33 AM, Juha Nieminen wrote:
> You might sometimes hear someone express a sentiment along the lines of
> "it's not worth trying to optimize this. What does it matter if it
> takes 10 seconds unoptimized while an optimized version could take
> 5 seconds? Who cares? I can wait for the extra seconds. It's not such
> a big deal. A quick&dirty implementation is just fine."
>
> But what about an unoptimized quick&dirty implementation taking
> *one month* for a task that can be computed in *less than a second*?

I have had no such case so far. But I remember optimizing a monthly
job from three weeks - it was challenging to find an interval in which
none of the required computers had scheduled maintenance - down to a
few minutes.


> Sometimes optimization *does* matter. Sometimes expertise in programming
> and algorithms does make quite a difference. It can be the difference
> between the program running for a month vs. running less than a second.

I prefer to optimize by default, especially when the additional
effort is moderate. Sooner or later many quick-and-dirty solutions
need to be refactored anyway. That is more work than doing it right
immediately, and it invalidates earlier testing. Late optimizations
sometimes also require public API changes, which can cause significant
additional effort for other teams.

I am sure that several people will disagree and say that choosing
appropriate data structures and algorithms by default is a waste of
time. But that simply does not match my four decades of programming
experience.


Marcel

red floyd

Dec 19, 2022, 6:45:30 PM
On 12/19/2022 1:11 PM, Marcel Mueller wrote:
> On 12/19/2022 10:33 AM, Juha Nieminen wrote:
>> You might sometimes hear someone express a sentiment along the lines of
>> "it's not worth trying to optimize this. What does it matter if it
>> takes 10 seconds unoptimized while an optimized version could take
>> 5 seconds? Who cares? I can wait for the extra seconds. It's not such
>> a big deal. A quick&dirty implementation is just fine."
>>
>> But what about an unoptimized quick&dirty implementation taking
>> *one month* for a task that can be computed in *less than a second*?
>
> I have had no such case so far. But I remember optimizing a monthly
> job from three weeks - it was challenging to find an interval in which
> none of the required computers had scheduled maintenance - down to a
> few minutes.
>

I had a couple of such cases early in my career.

I optimized some code that took 4 hours to run down to the point where
it took only 15 minutes. It was heavily I/O bound, so I wound up
buffering the data. This was not premature optimization: it came from
analyzing where the bottlenecks were. It made it possible for full
system builds to run during working hours instead of overnight.
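
Roughly the kind of change it was - a from-memory sketch, not the
original code; the file name and block size here are made up:

#include <cstdio>
#include <vector>

int main()
{
    // "records.dat" is a placeholder input file name.
    std::FILE* f = std::fopen("records.dat", "rb");
    if (!f) return 1;

    // Read in 1 MiB blocks instead of issuing one small read per
    // record. Far fewer trips through the I/O layer is where the
    // win comes from when a program is I/O bound.
    std::vector<char> buf(1 << 20);
    std::size_t total = 0, n;
    while ((n = std::fread(buf.data(), 1, buf.size(), f)) > 0)
        total += n;                  // parse records out of buf here

    std::fclose(f);
    std::printf("%zu bytes read\n", total);
    return 0;
}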

Another one came from a hardware upgrade. The hardware guys added a
SCSI interface to our system (circa 1988/1989), so IPLs (initial
program loads), which used to take over 40 minutes, took only 30
seconds. This radically changed the way our system test guys worked.
Before, if the system indicated some corruption (we had no memory
protection on the earlier system), they'd try to carry on with the
corrupted system and test other functions. With the 80-fold reduction
in program load time, they'd instead restart clean, reload the
database, and continue testing with a guaranteed-good system.



Öö Tiib

Dec 20, 2022, 4:15:04 AM
I have also had an interesting experience. A tool made around 2004
for preparing some files took hours to run. Only a few people on the
planet ran it, and only a few times per month. Its time complexity was
unreasonable and it lacked concurrent processing, but no one really
cared ... what mattered was that it worked correctly.

Then around 2011 the idea arose that the same algorithm would be
useful elsewhere and for a wider audience; only its performance was
unacceptable for that. So there was budget, and we optimized it down
to a matter of seconds. All was fine, and we could use it for the
other purpose.

But the developers responsible for the original tool wanted the
improved algorithm too, because better is better. I gave it to them
and they put it in. The first reaction was opposition and weird FUD:
maybe we had changed the outcome, lost correctness, or degraded the
quality or compression rate of the output. :D Optimization can take
some convenient "work hours" away from someone. So be careful with
it. ;)
