Poetry Versions

Poetry provides an all-in-one solution to Python packaging. I want to focus on why I was quite hard on Poetry in my last post, specifically on its default version capping and solver quirks, along with a few other criticisms. This is a follow-up to Should you have upper version bounds?, which you should read before this post.

Why so hard on Poetry?

Regardless of the tone of the rest of this post, I do like Poetry; I provide it as one of eight backends for the Scikit-HEP/cookie project-generation cookiecutter, and I use it for several projects, including at least one library. I have great respect for what they managed to pull off, and they were one of the first alternatives to the standard tools, which was great. It’s wildly popular, though, so I don’t think I need to sing its praises, but rather issue warnings about some of the decisions it makes - and point out that pdm provides the same benefits, but lets you select your version-capping strategy, doesn’t cap as aggressively, and follows more PEP standards.

I believe most users don’t realise it has a unique, slow, and opinionated solver. Poetry also tends to attract users who are intimidated by the plethora of tools it can replace, like setuptools/flit, venv/virtualenv, pip, pip-tools, wheel, twine, bump2version, and nox/tox; and that sort of user is easily influenced by the defaults and recommendations they see, since they do not have enough experience in the Python ecosystem to know when a recommendation is a bad one. The draw of mix-and-match tends to come later, once they start having stronger opinions on the way things should work.

Not only does running poetry add <package> automatically use ^<latest>, but generating a new project adds caret caps to both pytest and Python itself! And if you have python = "^3.6", all Poetry users who depend on your package will have to have a cap on the Python version. It doesn’t matter if you’ve read the version-capping discussion and agree with every single line; if you depend on just a single package that caps Python and use Poetry, you must add the cap. And if that dependency (after reading this discussion) removes the cap, you will still be capped - even if they removed it in a patch release that your constraints already allow, so it should no longer apply to you at all.
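As a sketch of those defaults (the exact versions are illustrative and depend on when you run the commands), a freshly generated project plus a poetry add requests ends up with something like:

```toml
# Hypothetical pyproject.toml after `poetry new` + `poetry add requests`;
# version numbers are illustrative.
[tool.poetry.dependencies]
python = "^3.6"       # caret cap: >=3.6, <4.0 - propagates to every dependent
requests = "^2.26.0"  # caret cap: >=2.26.0, <3.0.0, added automatically

[tool.poetry.dev-dependencies]
pytest = "^5.2"       # capped by the default template, too
```

Nothing in the workflow asks whether you wanted those caps; they are simply the default.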

Example

Let’s say I depend on library A~=1.0.0, and A==1.0.0 caps Python to <3.10. Poetry will force me to also cap my library to at most <3.10 in my pyproject.toml.

Now let’s say A==1.0.1 is released, and it loosens the cap to <3.11. My package is no longer kept off Python 3.10 by my dependencies, since I allow A==1.0.*; the only remaining constraint is the <3.10 that Poetry forced me to write into my own pyproject.toml. Now I have to update, anything that depends on me has to update, and so on down the chain.

If I depended on A==1.0.0 exactly, this would be more reasonable. But you can’t predict the future, specifically that your dependencies may loosen or remove upper bounds; in fact, unless they are abandoned, that’s exactly what they will do over time!
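In pyproject.toml terms, a sketch of the example above (the package name and all bounds are hypothetical):

```toml
# A==1.0.0 declares: python = ">=3.6,<3.10"
# Poetry will not solve unless my own file duplicates that cap:
[tool.poetry.dependencies]
python = ">=3.6,<3.10"  # forced copy of A 1.0.0's cap
A = "~1.0.0"            # allows A==1.0.1, which only requires python <3.11
```

The duplicated cap lives on in my file even after A==1.0.1 makes it obsolete.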

I believe a resolver should only force limits on you if you pin a dependency exactly. Any pin that allows even a single “newer” version, of any form, should never force you to duplicate that dependency’s limits in your own file. However, Poetry developers have said “this behavior by Poetry will never change”. I personally believe caps should only be added for known incompatibilities, but it doesn’t matter: I can’t use Poetry with even a single dependency that caps the Python version without being forced to add the cap myself - even if I’m making a simple library that uses textbook Python, with uncapped dependencies that I know will update to the new Python.
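To make the “unpinned” point concrete, here is a small sketch using the packaging library (the same PEP 440 machinery pip uses, with the example version numbers from above): a compatible-release pin like ~=1.0.0 already admits the future patch release that loosens the cap, which is exactly why duplicating that cap in my own file is stale by construction.

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# "Compatible release" pin: ~=1.0.0 means >=1.0.0, ==1.0.*
spec = SpecifierSet("~=1.0.0")

print(Version("1.0.0") in spec)  # True  - the release that caps Python <3.10
print(Version("1.0.1") in spec)  # True  - the future release that loosens it
print(Version("1.1.0") in spec)  # False - outside the compatible range
```

Since 1.0.1 satisfies my pin the moment it is published, any limit I copied from 1.0.0 no longer reflects what I can actually install.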

I’ve discussed this in Poetry, and as a result, instead of fixing the resolver, fixing the default add, and/or fixing the default template, they now have a page describing why you should always cap your dependencies. As I’ve pointed out, this reasoning is invalid - you can’t ensure your code works forever by adding pins; just the opposite, in fact: you will have reduced future compatibility, which is especially important for a library. If you are available to make quick updates, you can quickly add a pin when something actually breaks (and then fix it). If you are not available for a quick release, you can ask users to pin in an issue until you fix it. A user can use a locking system (like the one Poetry provides). Etc. Anything is better than solver errors that are invalid.

Also, this is heavily inspired by JavaScript, where version capping is the social norm - but JavaScript has a nested package system; you do not share dependencies with anyone else. See this section of my previous post on why capping is completely impossible and destructive if applied to a flat system like Python’s and scaled to all your dependencies. This is neither a socially nor a technically feasible norm for Python.

I also do not like Poetry’s dependency syntax, which uses TOML tables. I have one complaint about the standard dependency syntax: there was no way for one extra to depend on another, but this was solved in pip 21.2+, and Poetry’s new syntax doesn’t actually solve it either. Instead, it is overly complex, depends on long inline TOML tables (which are slightly broken for users in TOML, IMO, since they arbitrarily don’t support newlines or trailing commas), requires as much or more repetition, and doesn’t actually support exposing the “dev” portion directly as an extra. If you have an extremely complex set of dependencies, maybe it would be reasonable, but I’ve avoided mixing really complex projects and Poetry.
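For comparison, here is the same conditional dependency in both syntaxes (a sketch; the package and bounds are just an illustration):

```toml
# Standard PEP 508 requirement string, as used by setuptools, flit, pdm, ...:
#   "dataclasses>=0.8; python_version < '3.7'"

# The same dependency in Poetry's inline-table syntax:
[tool.poetry.dependencies]
dataclasses = { version = ">=0.8", python = "<3.7" }
```

The table form is no shorter, and it is a second syntax you have to learn on top of the standard one.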

I have also asked Poetry to support PEP 621, and so far they have held back, saying their system is “better” than supporting a standard they helped develop - maybe because they are unhappy that no one else liked their dependency syntax? GitHub has now added support for Poetry’s pyproject.toml (and poetry.lock) as a replacement for setup.py in its dependency scanning, but not for the standards-based settings that would have also benefited flit, pdm, trampolim, and whey (and probably many more in the future, including setuptools). Besides, you have to learn the standard syntax for the requires field of PEP 518, which Poetry depends on to work, so you are always going to have to learn the PEP 440 syntax to use Poetry anyway.
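For reference, a minimal sketch of the standards-based configuration (names and versions are illustrative): PEP 518 supplies the [build-system] table, and PEP 621 the [project] table, both using standard PEP 440/508 syntax.

```toml
# Hypothetical minimal pyproject.toml using only the standard tables.
[build-system]
requires = ["flit_core>=3.2"]        # PEP 518: build requirements
build-backend = "flit_core.buildapi"

[project]                            # PEP 621: static metadata
name = "mypackage"
version = "1.2.3"
requires-python = ">=3.6"            # lower bound only, no cap
dependencies = [
    "packaging>=20.9",               # standard PEP 508 strings
]

[project.optional-dependencies]
test = ["pytest>=6"]
```

Any PEP 621 backend (flit, pdm, trampolim, whey, ...) can read this file, and tooling like GitHub’s dependency scanning would only need to support it once.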

Poetry was also very slow to support installing binary wheels on Apple Silicon, or even on macOS 11, while most1 of the PyPA tooling supported them quickly. This meant that packages like NumPy were installed from source, which made Poetry basically useless for scientific work on macOS for quite a while, since source installs of NumPy don’t work there. I would like to see them prioritize patch releases when an entire OS is affected - their own pinning system forces users to make patch releases more often, but they haven’t been doing so themselves.

My recommendation would be to consider it if you are writing an application, and maybe for a library, but make sure you fix the restrictive limits and understand the limitations and quirks. As long as you know what you are doing, it can be a great system. The “all-in-one” structure is really impressive, and using it is fun. I think the new plugin system will likely make it even more popular. But using individual tools is more Pythonic, and lets you select exactly what you like best for each situation. Flit is just as simple as Poetry to configure, or simpler, and supports PEP 621 (even if rather quietly at the moment). Setuptools is not that bad as long as you use setup.cfg instead of setup.py, and has setuptools_scm, which is really nice for some workflows. I would recommend reading either https://packaging.python.org/tutorials/packaging-projects or https://scikit-hep.org/developer to see what the composable, standard tools look like.



  1. Pipenv doesn’t count in “most” here…
