(One of my summaries of the May 2023 Dutch PyGrunn conference).
Mere mortals? There are two kinds of people in the world: those that like to optimize and those that are mere mortals :-) He’ll try to get us over to the optimization side.
Python is pretty slow. It is getting faster and faster all the time, though. And speed isn’t everything: readability and maintainability are also important, and so is a large community. Optimization at the language level will be talked about in the “python under the hood” talk later during this conference.
Something you often hear:
Premature optimization is bad. Worrying about efficiency in the wrong places and at the wrong times is a real problem according to Donald Knuth.
Micro-optimization is bad. But… is that so? A small part of your code might be called lots of times. Have you profiled it? Look at the ROI (return on investment). Time spent optimizing code that isn’t the actual problem is time that is wasted. Time spent on slow code that is called a lot, that’s a good thing.
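To illustrate the “profile first” point, here is a minimal sketch of measuring before optimizing with the standard library’s cProfile; the function names (hot_function, main) are my own illustration, not from the talk:

```python
import cProfile
import pstats

def hot_function(items):
    # Hypothetical hot spot: called many times, so worth measuring first.
    return sum(i * i for i in items)

def main():
    data = range(10_000)
    for _ in range(100):
        hot_function(data)

if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    main()
    profiler.disable()
    # Show the five most expensive calls; optimize only what shows up here.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```

Only the functions near the top of that listing are worth your time; everything else is the “wrong place” Knuth warned about.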
What he’s suggesting is what he calls opportunistic optimization. It is a bit like the “boy scouts’ rule”: make the code a little bit better than when you found it. Passively keep an eye out for simple improvements that you can do on the codebase. If you do something, it should have a significant performance improvement in the context of the piece of code you’re working on.
For this you’ll need to know your tech stack well enough to spot common improvement possibilities. A good place to start is Python’s built-in data structures: use them a lot and know how to use them. They are implemented in very efficient C code. Lists, sets, dicts, generators. List comprehensions are often both more readable and much quicker than a for loop. f-strings instead of string concatenation. Data classes.
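As a hedged sketch of the kind of rewrites he means (the variable names are mine, not from the talk):

```python
words = ["alpha", "beta", "gamma"]

# Accumulating with an explicit for loop...
lengths = []
for word in words:
    lengths.append(len(word))

# ...versus a list comprehension: usually more readable and faster,
# because the loop machinery runs in C.
lengths_fast = [len(word) for word in words]
assert lengths == lengths_fast

# A set gives O(1) membership tests instead of a list's O(n) scan.
allowed = {"alpha", "gamma"}
hits = [word for word in words if word in allowed]

# f-string instead of string concatenation.
count = len(hits)
message = f"found {count} allowed words"
```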
Some comments:
Most of the time, you won’t need to do anything. If you don’t spot a possible optimization, the code is probably OK.
Optimization is good, but don’t change the way the program works, don’t change the flow.
Don’t use dirty tricks and loopholes to gain a bit of performance.
Don’t compromise the readability of the code!
He showed a couple of clear examples: for loops rewritten as list comprehensions. frozenset for filtering out duplicates from an unchanging set. Not compiling a regex all the time, but doing it only once. from functools import cache, cached_property. from itertools import islice.
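A small sketch of those examples in one place; the surrounding names (WORD_RE, count_words, Report) are my own illustration, not code from the talk:

```python
import re
from functools import cache, cached_property
from itertools import islice

# Compile the regex once, at module level, instead of on every call.
WORD_RE = re.compile(r"\w+")

def count_words(lines):
    return sum(len(WORD_RE.findall(line)) for line in lines)

# frozenset: an immutable set, handy as a fixed lookup table.
STOPWORDS = frozenset({"the", "a", "an"})

@cache  # memoize: repeat calls with the same argument return instantly
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

class Report:
    def __init__(self, lines):
        self.lines = lines

    @cached_property  # computed once, then stored on the instance
    def word_count(self):
        return count_words(self.lines)

# islice: take the first N items of a (possibly huge) iterable lazily,
# without building the whole thing in memory.
first_squares = list(islice((n * n for n in range(10**9)), 5))
```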
Also look at third party libs. If you have large arrays, “numpy” will improve your performance a lot, for instance.
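For instance, assuming numpy is installed, the speedup comes from replacing a Python-level loop with one vectorized C operation; a minimal sketch:

```python
import numpy as np

# Pure-Python sum of squares over a large range...
py_total = sum(n * n for n in range(100_000))

# ...versus the same computation vectorized in NumPy: one C loop over
# a contiguous int64 array instead of 100,000 bytecode iterations.
arr = np.arange(100_000, dtype=np.int64)
np_total = int((arr * arr).sum())

assert py_total == np_total
```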
Code optimization is balancing scarce system resources (cpu, memory) with scarce developer time. “Opportunistic optimization” might be a good approach.
My name is Reinout van Rees and I program in Python, I live in the Netherlands, I cycle recumbent bikes and I have a model railway.