Everybody has their own ideas about what, when, and how to optimize code.
I'd like to ask you something about NOT optimizing.
In your opinion, when is it better NOT to optimize code?
Thanks
When it is not worth it
i.e. don't spend an hour cleaning one spot on a window and 2 minutes cleaning the rest of it
On the flip side (when to optimise)
Scoop the dog poop before mowing the lawn!
:lol sorry long day
in techy terms.... don't optimise something that will run once for 1000 clocks before something that runs repeatedly for 1000 clocks.... prioritize
PS: Don't try to optimise Lingo's code
1) when the expected gains are not worth the expected effort to achieve them
2) when there's something more appropriate to do with your time
3) when you misinterpret "optimise" to mean "faster" or "smaller" instead of "more appropriate", e.g. "optimising" an input routine to run in 20 usec instead of 25 usec, even though it takes the user 10,000 times that long to press each key, is not really optimising at all.
I think the key is to learn to design code well and, along the way, to learn to optimize instructions - it should become habit.
At the end of the day, you may sit down and write a program; if it is designed well and you have developed the habit of selecting the right instruction sequences, you won't have a lot of optimizing to do.
You might test the code under a variety of conditions to see if something needs your attention; only then do you really need to buckle down and explore alternative code possibilities.
I have not reached this point yet - I am still trying to learn good habits.
I had developed pretty good habits in the days of DOS and the 8088, but very little of that applies anymore - in fact, in some ways, I now have bad habits because of it.
I remember a comment in a forum years ago where the member said "optimisation is like a discussion with your wife".
At the risk of flippancy: always optimise (write optimised code in the first place), ensure you have the fastest MessageBoxA on the planet, never give up on optimising Lingo's algos once you have learnt to understand them :bg - and when you get tired of all this, only bother where it matters.
The path is worth taking as you learn the difference; make the decision too early in the learning curve and you will never write optimised code.
Learn the rough 90/10 rule: 90% of the code does nothing that particularly matters, while the remaining 10% does the grunt work. Learn this and you have achieved a 90% reduction in your target range to optimise.
The unit cost of optimising code starts to pay off when the code is re-usable. If you are only going to use it once, who cares; if it is going to be used billions of times because it is faster than the sloppy alternatives, grind it down to the last picosecond, as its unit cost is still far lower than that of rarely-used slow code.
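As a rough illustration of finding that 10% (a minimal C++ timing sketch of my own, nothing out of the masm32 library, and the names are made up for this post): measure the candidates first, then decide which ones are worth grinding on.

#include <chrono>
#include <cstdio>

// Illustrative sketch only - harness and numbers invented for this post.
// Time a candidate hot spot over many iterations and report milliseconds,
// so you know whether it is part of the 10% before you spend effort on it.
template <typename Work>
double time_ms(Work&& work, int iterations)
{
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < iterations; ++i)
        work();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main()
{
    volatile unsigned sink = 0;  // stops the compiler deleting the loop
    double ms = time_ms([&] {
        for (unsigned i = 0; i < 100000; ++i)
            sink = sink + i;
    }, 100);
    std::printf("candidate hot spot: %.3f ms\n", ms);
    return 0;
}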
Remember, I have a MessageBoxA that is nearly a whole picosecond faster than yours, kewel huh?
Any further comments would put me at loggerheads with forum sensibility rules. :P
Something I can add, drawing on my own experience:
- When your boss thinks you are spending too much time on a project.
- When your ASM was working fine on a 64K IBM mainframe but no longer seems to work on the new NCR computer your company bought for $10 million, and you are too old or too bored to rethink your way of coding.
- When you reinvent the wheel just before discovering a better one was already there.
- When you try to modify an optimized program but can no longer read the code you wrote yourself.
- When, before you finish optimizing, the technology has already changed and you have to change everything to get your program working again.
And so many more...
Quote from: hutch-- on April 27, 2010, 01:07:49 AM
Learn the rough 90/10 rule: 90% of the code does nothing that particularly matters, while the remaining 10% does the grunt work. Learn this and you have achieved a 90% reduction in your target range to optimise.
I misunderstood what people meant by the 90/10 rule for a very long time, because I didn't even consider that 90% of the code to be the "real" code. It's usually miscellaneous stuff that in a perfect world would be only 1% of the code, but because nobody bothered to make a decent standard library it ends up being 90%. I thought people meant that the 10% of the code that matters took 90% of the time, which is not nearly as often true. In the AQUA@Home (http://aqua.dwavesys.com/) Fokker-Planck simulation, about 99% of the time is spent in about 160 lines of code - a bit under 10% of all lines of code - but I'd only consider about 200 lines of the whole program "real" code.
Obviously you wouldn't optimize the code that doesn't matter (except maybe to simplify it), but people often take it one step further and use the 90/10 rule as an excuse to never optimize. 10% of 1 million lines is still an astronomical amount to optimize if you realize it's too slow after the fact, so then they just won't bother. :wink
IMO optimization is generally a waste of time. I tend to optimize library routines and snippets if I will be using them often; however, as Dave said, if you use good coding practices and learn a few tricks, that is generally enough. Remember that there are few programmers here who can outperform an optimizing compiler even in assembly language - not because they can't do it, but because a compiler doesn't have to produce binary output that is readable or debuggable in any sense. Also, a compiler never forgets any of its tricks ;) There are a few places where optimization pays off: scanning large amounts of data comes to mind as a great target, as do sort routines, but in both cases a good look at the data involved and good algorithm design will generally make optimization unnecessary.
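For instance, here is what I mean by looking at the data first (a toy C++ sketch of my own, not from any real project): pick the right algorithm and the instruction-level tuning becomes irrelevant.

#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

// Illustrative sketch only - functions and data invented for this post.
// Naive: a fresh linear scan for every query -> O(q * n).
bool contains_linear(const std::vector<uint32_t>& data, uint32_t key)
{
    for (uint32_t v : data)
        if (v == key)
            return true;
    return false;
}

// Looking at the data first: sort once, then binary-search each query
// -> O(n log n + q log n). No amount of hand-tuning the scan above
// will catch up once the data and the query count are large.
bool contains_sorted(const std::vector<uint32_t>& sorted_data, uint32_t key)
{
    return std::binary_search(sorted_data.begin(), sorted_data.end(), key);
}

int main()
{
    std::vector<uint32_t> data = {42, 7, 19, 3, 88};
    std::printf("linear: %d\n", contains_linear(data, 19) ? 1 : 0);
    std::sort(data.begin(), data.end());
    std::printf("sorted: %d\n", contains_sorted(data, 19) ? 1 : 0);
    return 0;
}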
Edgar
I strongly disagree, and I have several speedups of more than 10x on top of multi-threading under my belt to back it up (not to brag, but I HATE it when people talk as if it's "beating the compiler" (http://ndickson.wordpress.com/2010/04/02/importance-of-explicit-vectorization-for-cpu-and-gpu-software-performance/)), e.g. a Suzuki-Trotter decomposition quantum Monte Carlo application that takes months on thousands of computers (http://arxiv.org/ftp/arxiv/papers/1004/1004.0024.pdf), the generation of very difficult maximum independent set problems (http://ndickson.wordpress.com/2010/01/04/the-power-of-optimization/), a Fokker-Planck classical physics simulation that takes days on thousands of computers, a quantum physics density matrix simulation that took days each time we needed to run it and now takes 30 minutes, etc.
I don't view it as "beating the compiler", since it's not a fair fight. The compiler doesn't stand a chance: it doesn't know anything about what your program is trying to do, only how you told it to do it, so it can't make intelligent decisions, e.g. choosing to generate 4x as many random numbers, 4x faster, in vector form, by knowing that you really just wanted a lot of random numbers. The developer, though, does know that they just wanted a lot of random numbers, so they could choose to replace it with a vectorized random-number-generation implementation. It's making these intelligent decisions about the implementation that matters most in optimization, not whether you missed a clock cycle or two.
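To make it concrete, here is a minimal sketch of the kind of thing I mean (plain C++ with SSE2 intrinsics; an illustrative toy of my own, not the generator from our actual code): four xorshift32 streams stepped in lock-step, so each SIMD step yields four values for roughly the cost of one.

#include <emmintrin.h>  // SSE2 intrinsics
#include <cstdint>
#include <cstdio>

// Illustrative sketch only - not production code. Four independent
// xorshift32 generators, one per 32-bit lane, using Marsaglia's
// 13/17/5 shift triple. Each next() call returns four values.
struct XorShift32x4
{
    __m128i state;  // four distinct non-zero seeds

    XorShift32x4(uint32_t s0, uint32_t s1, uint32_t s2, uint32_t s3)
        : state(_mm_set_epi32(s3, s2, s1, s0)) {}

    __m128i next()
    {
        __m128i x = state;
        x = _mm_xor_si128(x, _mm_slli_epi32(x, 13));
        x = _mm_xor_si128(x, _mm_srli_epi32(x, 17));
        x = _mm_xor_si128(x, _mm_slli_epi32(x, 5));
        state = x;
        return x;
    }
};

int main()
{
    XorShift32x4 rng(0x12345678, 0x9ABCDEF0, 0xDEADBEEF, 0x0BADF00D);
    alignas(16) uint32_t out[4];
    for (int i = 0; i < 4; ++i)  // fill a small buffer, 4 values at a time
    {
        _mm_store_si128(reinterpret_cast<__m128i*>(out), rng.next());
        std::printf("%08X %08X %08X %08X\n", out[0], out[1], out[2], out[3]);
    }
    return 0;
}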
Quote from: donkey on April 30, 2010, 03:41:51 AM
IMO optimization is generally a waste of time....
There are a few places where optimization pays off: scanning large amounts of data comes to mind as a great target, as do sort routines, but in both cases a good look at the data involved and good algorithm design will generally make optimization unnecessary.
Edgar
Well, in my opinion, good algorithm design is the best way of optimizing :lol
Quote from: Neo on April 30, 2010, 07:56:52 AM
choosing to generate 4x as many random numbers, 4x faster, in vector form, by knowing that you really just wanted a lot of random numbers. The developer, though, does know that they just wanted a lot of random numbers, so they could choose to replace it with a vectorized random-number-generation implementation. It's making these intelligent decisions about the implementation that matters most in optimization, not whether you missed a clock cycle or two.
In one of the ASM for FUN topics we are trying to implement exactly that, so we made an intelligent decision! :P
Enjoy
Frank
Those who can't are against optimization :wink
Quote from: frktons on April 26, 2010, 11:00:11 PM
Everybody has their own ideas about what, when, and how to optimize code.
I'd like to ask you something about NOT optimizing.
In your opinion, when is it better NOT to optimize code?
Thanks
When you quit ASM programming for another hobby.
> Those who can't are against optimization :wink
:bg
Hey, if it's for your hobby projects, optimise it to hell!! If you're at work producing code, think about whether, for the particular implementation you are making, the tradeoff is worth the extra man-hours you are putting in to get it to run that bit faster. It really depends on the situation. However, I wouldn't want to explain to my boss that I spent 2 days optimising something I spent 1 hour making, just to halve the running time of a routine that is rarely called ^_^
Quote from: Slugsnack on April 30, 2010, 04:22:30 PM
I wouldn't want to explain to my boss that I spent 2 days optimising something I spent 1 hour making, just to halve the running time of a routine that is rarely called ^_^
Neither would I. :lol
Quote from: hutch-- on April 30, 2010, 04:17:19 PM
> Those who can't are against optimization :wink
:bg
:lol
I can usually find enough 50-cent words to justify anything to the boss - lol
Most managers are insecure enough that they don't want to let on that they didn't understand a word you said :lol