We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.
40 years ago, Donald Knuth called out premature optimization by labeling it the root of all evil. Programming languages have changed a lot since then, but this statement still holds true today.
Premature optimization is when you make code more complex in the name of efficiency without data that it's actually needed.
Having a look at the definition, the key takeaway seems to be that you shouldn't optimize your code until you have data that proves that you need to optimize something. Such data might come from user feedback for example.
Clean code and readability should come first. Improving minor implementation details that have close to no effect on the overall performance of your application gains you nothing and, in the worst case, increases complexity.
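As a hypothetical illustration (the function names and data shape are made up for this sketch), consider two versions of the same task. The second trades readability for a micro-optimization that is negligible on modern engines for typical input sizes:

```javascript
// Readable version: sum the prices of the items in a cart.
function totalPrice(items) {
  return items.reduce((sum, item) => sum + item.price, 0);
}

// Prematurely "optimized" version: a reversed while loop with a
// manually managed index. Harder to read, and without profiling data
// there is no evidence the extra complexity buys anything.
function totalPricePremature(items) {
  let sum = 0;
  let i = items.length;
  while (i--) {
    sum += items[i].price;
  }
  return sum;
}
```

Both produce the same result; only one of them is pleasant to maintain.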
If you've come to a point where you know that you have to improve performance, you shouldn't just go ahead and start optimizing every `for` loop that crosses your way.
You can start by analyzing your website with tools like WebPageTest for example, which will help you find bottlenecks. Then, with this data backing you up, you can optimize the parts that truly impact performance.
Having a proper monitoring system in place is key to maintaining good performance in the long run. I wrote an article about how to integrate WebPageTest into your CI pipeline with Travis CI.
Premature pessimization is when you write code that is slower than it needs to be, usually by asking for unnecessary extra work, when equivalently complex code would be faster and should just naturally flow out of your fingers.
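A common example of such unnecessary extra work (the function names here are made up for illustration) is copying a growing array on every iteration, which turns a linear task into quadratic work even though the straightforward version is just as simple:

```javascript
// Pessimized: spreading the accumulator copies the entire array on
// every iteration, so the loop does far more work than it needs to.
function doubleAllPessimized(numbers) {
  let result = [];
  for (const n of numbers) {
    result = [...result, n * 2]; // full copy on every step
  }
  return result;
}

// Equally simple, and linear: let map build the array once.
function doubleAll(numbers) {
  return numbers.map((n) => n * 2);
}
```

The faster version is not more complex; it is the one that should naturally flow out of your fingers.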
As we will see later, using `const` instead of `var` is such a case. Some people could argue that preferring `const` is premature optimization. However, `const` is also more idiomatic, and there are no trade-offs in using `const` over `var`. Sticking with `var` anyway would be a case of premature pessimization.
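To make the comparison concrete (the variable and function names are invented for this sketch):

```javascript
// `var` is function-scoped and freely re-assignable; it tells the
// reader and the engine nothing about the binding's lifetime.
var total = 0;

// `const` is block-scoped and guarantees the binding is never
// re-assigned: clearer intent, and useful information for the engine.
const TAX_RATE = 0.2;

function priceWithTax(net) {
  return net * (1 + TAX_RATE);
}
```

The `const` version is no harder to write, so choosing it costs nothing.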
It's important to be aware that performant code and clean code are not mutually exclusive.
If you can write performant and clean code, why not do it?
Mozilla has published optimization strategies for Firefox's JavaScript engine SpiderMonkey that also align with writing clean code.
SpiderMonkey attempts to optimize element accesses that always refer to the same object.
In ES6 syntax, this could mean using `const` when possible, which already tells the engine that the variable won't be re-assigned.
The engine also optimizes the use of variables where it's clear that they always hold the same type. Using TypeScript could make this even easier for you during development.
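A small sketch of what type-stable code looks like in practice (both functions are hypothetical examples, not from the SpiderMonkey docs):

```javascript
// A variable that flips between types pushes the engine onto a
// generic, slower path. Here `result` starts as a number and then
// becomes a string:
function describeMixed(count) {
  let result = 0;
  result += count;    // number
  result += " items"; // now a string: the type changed mid-function
  return result;
}

// Keeping each variable at a single type is both clearer to read and
// easier for the engine to specialize:
function describe(count) {
  const total = count;            // always a number
  const label = `${total} items`; // always a string
  return label;
}
```

Both return the same string, but the second version is the one TypeScript would nudge you towards anyway.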
This might be the most important optimization. The JIT compiler detects what functions are being used the most and marks them as hot. The engine will try to optimize those functions.
So, are these examples premature optimizations? I'd say no. Yes, they are optimizations, but they also make your code more readable. According to the definition, you can only speak of premature optimization if you also increase complexity. These examples actually decrease complexity.
Optimizations like these might add up to a noticeable impact on the performance of your application, while they are also essential to writing clean code.
I think premature pessimization might be a more common problem than the other way around. When hacking together a prototype or working on a small project, it's not important to focus on small performance optimizations, but this shouldn't be an excuse for writing bad code. If you're used to writing high-quality code, you'll probably also do so when working on smaller projects.
Whether improving performance or not, the primary goal should always be to write clean and maintainable code that other people like to work with. Adapting best practices could lead to code that is more performant and more readable at the same time.
It's also worth considering the impact of each decision. Choosing a database technology is a more important and long-term decision than choosing the right way to iterate over an array.
To sum it up: optimization is good and important, but always keep the big picture in mind. Don't waste time optimizing something you merely think will improve performance. Make decisions based on data, by monitoring performance and identifying bottlenecks.