(One of my summaries of the PyCon NL one-day conference in Utrecht, NL).
Full title: tooling with purpose: making smart choices as you build.
Aris uses Python and data to answer research questions about everything under the ground (he's a geophysicist).
As a programmer you have to make lots of choices. Python environment, core project tooling, project-specific tooling, etc.
First: Python environment management: pyenv/venv/pip, poetry, uv. And conda/pixi for the scientific Python world. A show of hands showed uv to be really popular.
Now core project tooling. Which project structure? Do you use a template/cookiecutter for it? Subdirectories? A testing framework? Pytest is the default, start with that. (He mentioned “doctests” becoming very popular: that surprised me, as they were popular before 2010 and started to be considered old and deprecated after 2010. I’ll need to investigate a bit more).
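As a reminder of what a doctest looks like (a minimal sketch of my own, not from the talk): it is simply an interpreter-style example inside a docstring that gets executed as a test.

    def add(a, b):
        """Add two numbers.

        >>> add(2, 3)
        5
        """
        return a + b

    if __name__ == "__main__":
        import doctest
        doctest.testmod()

Pytest can pick these up too with its --doctest-modules option.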
Linting and type checking? Start with ruff for formatting and checking. Mypy is the standard type checker, but pyright (used by VS Code) and pyre are options. And the new ty is still in alpha, but looks promising.
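To make the type checking point concrete, a tiny sketch of my own (mypy, pyright and ty should all flag the last line):

    def greet(name: str) -> str:
        return f"Hello, {name}"

    greeting = greet("Utrecht")   # fine
    broken = greet(42)            # flagged: an int is not a str

Ruff, by contrast, is about formatting and lint rules: ruff format and ruff check are the two commands you would typically wire into your workflow.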
Also, part of the core tooling: do you document your code? At least a README.
For domain-specific tooling there are so many choices that it is easy to get lost. What to use for data storage? Web/API? Visualization tools? Scientific libraries?
Choose wisely! With great power comes great responsibility, but with great power also comes the burden of decision-making. Try to standardize. Enforce policies. Try to keep it simple.
Beware of over-engineering. Over-engineering often comes with good intentions. And… sometimes complexity is the right path. As an example, look at database choices: you might wonder whether to go for SQL or a NoSQL database and whether you need to shard it. But often a simple SQLite database file is fast enough!
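To show how little the simple path needs, a sketch with the standard library's sqlite3 module (the table and values are made up for the example):

    import sqlite3

    # One file on disk is the whole database: no server, no sharding.
    connection = sqlite3.connect("measurements.db")
    connection.execute(
        "CREATE TABLE IF NOT EXISTS measurements (location TEXT, depth REAL)"
    )
    connection.execute(
        "INSERT INTO measurements VALUES (?, ?)", ("Utrecht", 12.5)
    )
    connection.commit()
    for row in connection.execute("SELECT location, depth FROM measurements"):
        print(row)
    connection.close()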
Configuration management: start with a simple os.getenv() and grab settings from environment variables. Only start using .toml files when that no longer fits your use case.
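A minimal sketch of that approach (the setting names are just examples):

    import os

    # Grab settings from environment variables, with sensible defaults.
    database_url = os.getenv("DATABASE_URL", "sqlite:///local.db")
    debug = os.getenv("DEBUG", "false").lower() == "true"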
Web/API: start simple. You probably don’t need authentication from the start if it is just a quick prototype. Get something useful working first. Once it works, you can start working on deployment or a nicer frontend.
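A quick prototype API really can be a handful of lines; a sketch assuming FastAPI, without any authentication or deployment setup:

    from fastapi import FastAPI

    app = FastAPI()

    @app.get("/measurements/{location}")
    def measurements(location: str):
        # Hardcoded example data: good enough to get something working.
        return {"location": location, "depth": 12.5}

Run it locally with something like uvicorn yourmodule:app --reload and iterate from there.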
Async code is often said to be faster. But debugging is time-consuming and hard. Error handling is different. It only really pays off when you have many, many concurrent operations. Profile your code before you start switching to async. It won’t speed up CPU-bound code.
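Profiling first is cheap with the standard library; a sketch with cProfile, where main() stands in for your own entry point:

    import cProfile

    def main():
        # Your actual, possibly slow, code goes here.
        total = sum(i * i for i in range(1_000_000))
        print(total)

    cProfile.run("main()", sort="cumulative")

If the cumulative time turns out to be dominated by CPU work rather than by waiting on I/O, async will not help.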
Logging: just start with the built-in logging module. Basic logging is better than no logging. Don’t start on the Perfect Fancy Logging Setup until you have the basics running.
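The “basic” in basic logging really is basic; a minimal sketch:

    import logging

    logger = logging.getLogger(__name__)
    logging.basicConfig(level=logging.INFO)

    logger.info("Started processing %s", "somefile.csv")
    logger.warning("Depth value missing, using a default")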
Testing is good and recommended, but don’t go overboard. Don’t “mock” everything just to get 100% coverage: those kinds of tests break often, and often they test the mock instead of your actual code. Aim for roughly the same amount of test code as actual code.
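For contrast, a plain pytest test against real code instead of a mock (average_depth() is just a made-up example function):

    # test_depths.py
    def average_depth(depths):
        return sum(depths) / len(depths)

    def test_average_depth():
        # Exercise the real function, no mocks needed.
        assert average_depth([10.0, 20.0]) == 15.0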
Some closing comments:
Sometimes simple choices are better.
Don’t let decision-making slow you down. Start making prototypes.
One-size-fits-all solutions don’t exist. Evaluate for your use case.
If you are an experienced developer, help your colleagues. They have to make lots of choices.
Early-career developer? Luckily a lot of choices are already made for you due to company policy or because the project you’re working on already made most choices for you :-)
Unrelated photo from our 2025 holiday in Austria: Neufelden station. I remembered the valley as being beautiful from a 1991 train trip. As we now do our family holidays by train, I knew where to go as soon as Austria was chosen as the destination.