CCL: Cleaning up dusty deck fortran...
- From: "Perry E. Metzger" <perry(a)piermont.com>
- Subject: CCL: Cleaning up dusty deck fortran...
- Date: Sun, 25 Sep 2005 16:23:15 -0400
> It's sorta fun living in a time of bloody well infinite computer power.
> We've got to figure out how to develop for these systems, and more
> importantly rid ourselves of the biases we've grown up with. There's
> way less which is "too slow" anymore, and the sooner we shed that
> notion the better. We don't understand all the physics, and we need
> newer code and implementations, but it's WAY better running 3-minute
> than 3-day test jobs :-).
I generally agree with what you said, but I will make one final
comment. In most fields to which computers are applied, computer time
is not precious. If, however, your code takes four days to run (as it
can if you're doing a complicated simulation), or if it runs fast only
on the one Beowulf cluster you share with many other people, then
computer time *is* precious. Weather prediction, computational
chemistry, and high-end computer graphics are all areas where
optimization does indeed count.
Pike's comment on optimization is a common piece of advice -- do not
optimize prematurely. However, if a simulation is going to take 30
hours and you want to do hundreds of them over the coming years,
getting a factor of 10 out of your run time is worth a day of your
human time many times over. Know *when* to optimize, and know *how*.
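Concretely -- and this is just a minimal sketch of mine, not anything
from a real package -- the cheapest first step is to put a timer
around the piece you suspect is the hot spot and see what a run
actually costs before you spend the day. Something like the following,
where the loop body is a stand-in for whatever your expensive kernel
really is:

    program time_kernel
      implicit none
      integer          :: i
      real             :: t_start, t_finish
      double precision :: s

      s = 0.0d0
      call cpu_time(t_start)
      ! Stand-in for the section you suspect dominates the run.
      do i = 1, 100000000
         s = s + 1.0d0 / dble(i)
      end do
      call cpu_time(t_finish)

      print *, 'result   =', s
      print *, 'cpu time =', t_finish - t_start, ' seconds'
    end program time_kernel

Once the whole run is too big to time by hand, a profiler like gprof
will give you the same sort of breakdown routine by routine.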
I will note that, generally speaking, what counts is not saving two
cycles here and three cycles there, but rather using the right tools
to figure out where your code is actually spending its time, and
picking the right algorithms. Know HOW to optimize. The difference
between the right algorithm and the wrong one is enormous, and, most
importantly, it is often not a constant factor but a whole complexity
order. Given the nature of the work we're talking about, understanding
numerical-analysis tricks and problems like numerical stability is
often also crucial. Sometimes it is also important to know how to
shave cycles -- knowing when you can use single precision instead of
double, or how to use the vector units on modern Pentium/Athlon
hardware, can actually make a difference -- but that's a rarer
discipline, and usually only worth it if it makes the difference
between a 300-day computer run and a 3-day run.
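To make the numerical-stability point concrete, here is a small sketch
of compensated (Kahan) summation -- my own illustration, not from any
particular code -- the sort of trick that can let you get away with
single precision where a naive accumulation would quietly drift:

    program kahan_demo
      implicit none
      integer, parameter :: n = 10000000
      integer :: i
      real    :: naive, comp, c, y, t

      ! Sum n copies of 0.1 in single precision.  The naive loop
      ! accumulates rounding error once the sum gets large; the
      ! compensated loop carries a correction term and stays close
      ! to the reference value.
      naive = 0.0
      comp  = 0.0
      c     = 0.0
      do i = 1, n
         naive = naive + 0.1

         y    = 0.1 - c          ! apply last step's correction
         t    = comp + y
         c    = (t - comp) - y   ! what the addition just lost
         comp = t
      end do

      print *, 'reference   =', real(n) * 0.1
      print *, 'naive sum   =', naive
      print *, 'compensated =', comp
    end program kahan_demo

(If you try it, be aware that aggressive "fast math" compiler options
are allowed to reassociate the arithmetic and will cheerfully optimize
the correction away.)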
Joe mentioned, correctly, that for mere USERS of this sort of software
a lot of what I've said isn't relevant. I'm implicitly directing my
comments at those who actually write code. If you do write code, and
that code is the core of what you do for a living, you owe it to
yourself to learn the fundamentals of computer science -- data
structures, algorithms, and clean software-engineering technique. It
may seem like a distraction if what you are trying to do is modeling
rather than computer work per se, but you *are* doing computer work to
do the modeling, and the time spent reading good books on the subject
and getting comfortable with your tools *will* reduce both the overall
human time spent dealing with the software and the compute time your
software burns on supercomputer clusters.
Don't be penny-wise and pound-foolish. Learn how your tools work. It
will save you endless amounts of heartache in the long run -- and you
*do* plan on having a long career, yes?
Perry