I received very good optimization advice from the venerable DDT manual of the DEC-10 on which I was allowed to work during my Ph.D. (because I had to use a CODASYL database, which in 1978 was available at ETH Zurich only on this system). The DEC-10 was a streamlined system with a very clean architecture. The full set of documentation, including the handwritten notes on the internals of the OS, fit on a small bookshelf in my office!
This collection included a manual for DDT, the tool for debugging programs. A footnote on the first page carried the pun that DEC-10 DDT and the chemical DDT "are effective against mutually distinct bugs"! The same first page contained the advice about optimization:
According to Zipf's law [Blog/zipf.md], only a small part of the code consumes a large percentage of a program's running time, so optimizing the other parts is useless. The example in the DDT manual made the arithmetic concrete: speeding up the initialization routines, which consume 5% of the running time, by 50% (an unreasonably large improvement) makes the full program run faster by only 2.5% -- most likely not worth the effort spent on optimizing.
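The manual's arithmetic is simple Amdahl's-law bookkeeping: the overall saving is the fraction of the running time the routine accounts for, multiplied by how much of that fraction the optimization removes. A minimal sketch (the function name is mine, not from the manual):

```python
def overall_saving(fraction_of_runtime, local_improvement):
    """Fraction of the TOTAL running time saved when a part that
    accounts for `fraction_of_runtime` of the program is made
    `local_improvement` faster (both arguments in [0, 1])."""
    return fraction_of_runtime * local_improvement

# The DDT manual's example: initialization takes 5% of the running
# time; halving it saves only 2.5% of the whole run.
print(f"{overall_saving(0.05, 0.50):.1%}")  # prints "2.5%"
```

The same formula shows why the hot spot is the only worthwhile target: halving a routine that accounts for 50% of the running time saves 25% overall, ten times the payoff for the same local improvement.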
My experience confirms this. In the only program of mine that I found ran too slow, my relational database PANDA written in Pascal, 50% of the running time was spent managing the storage of the page cache -- a single page of code in total. Optimizing the remainder would not have had any noticeable effect.
Therefore: write programs straightforwardly, easy to write, read, and understand. Optimize afterwards, once it is demonstrated that the program is too slow, and only after identifying the parts that consume the time. In the long run, this rule saves a lot of (human) time by not forcing a programmer to read, understand, and debug faulty optimized code.
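"Identify the time-consuming parts first" is today a one-liner with a profiler; DDT's successors ship with every language. A sketch with Python's built-in `cProfile` (the workload function is a hypothetical stand-in for a real program):

```python
import cProfile
import io
import pstats

def workload():
    # Hypothetical stand-in for a real program: in PANDA's case,
    # the page-cache management would dominate a report like this.
    return sum(i * i for i in range(100_000))

def hottest_functions(func, top=5):
    """Profile `func` and return a report of the `top` functions by
    cumulative time -- the only candidates worth optimizing."""
    profiler = cProfile.Profile()
    profiler.runcall(func)
    out = io.StringIO()
    pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(top)
    return out.getvalue()

print(hottest_functions(workload))
```

Only after such a report names the hot spot does optimization begin; everything else stays in its straightforward, readable form.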
Produced with SGG with master5.dtpl.