There is no "coupling" inherent to native Python tools. They can generally be installed in a separate virtual environment from the one they operate on.
For example, with Pip you simply pass the `--python` option to have it install for a different environment - you can create your venvs `--without-pip`, and share a copy of Pip across all of them. If you use Pipx, you can expose and use Pipx's vendored Pip for this, as I described in a recent blog post (https://zahlman.github.io/posts/2025/01/07/python-packaging-...).
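Roughly what that looks like on Linux/macOS, assuming a reasonably recent Pip (the `--python` option landed around 22.3); the venv path and package name are just placeholders:

    # create a venv without its own copy of Pip
    python -m venv --without-pip .venv
    # install into that venv using a Pip that lives elsewhere
    pip --python .venv/bin/python install requests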
Twine and Build don't care about your project's environment at all - Twine uploads a specified file, and Build by default creates a new environment for the build (and even without build isolation, it only requires your build environment to contain the build backend, and the build environment doesn't have to be the one you use for the rest of development). Setuptools, similarly, either gets installed by the build frontend into an isolated build environment, or can reside in a separate dedicated build environment. It doesn't actually operate on your project's environment - it only operates on your project's source tree.
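For instance, neither of these commands needs anything installed in the project's own venv - a rough sketch, assuming the tools themselves are available on PATH (e.g. installed via Pipx):

    # Build spins up an isolated environment containing the build backend by default
    python -m build
    # Twine just uploads the artifacts it is pointed at
    twine upload dist/*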
Sometimes it takes a lot of words to debunk a misconception. What you initially said didn't have anything to do with setup effort, and besides, there's very little setup effort actually described here.
In that case I have absolutely no idea what point you're trying to make. What coupling are you talking about, and why is it a problem? What do you mean by "carting around a bunch of Python stuff", and why and how is it "useful" to avoid that?
Not the person you’re replying to, but if I’m writing Python 3.13, and my linter requires Python <= 3.11, now I have to install two Pythons. It’s nice to not have to consider that case at all.
As a developer, with a real use case for these tools, I plan to support multiple Python versions anyway, and have them all installed for testing purposes. There only needs to be one actual installation of each; everything else is done through venvs, which are trivial to spin up.
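Roughly what that looks like, assuming `python3.11` and `python3.13` are both on PATH:

    # one venv per interpreter version; each interpreter is installed only once
    python3.11 -m venv .venv311
    python3.13 -m venv .venv313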
If I want to ship this to colleagues (some of whom probably won't know what a venv is but do have to write Python every now and again) I only have to worry about a single binary. Getting consistent Python environments on people's machines, particularly on Windows, is expensive (in salary).
>won't know what a venv is but do have to write Python every now and again
In practical terms, you have to understand what a venv is in order to be a Python developer, for the same reason you have to understand environment variables to do anything serious in the shell. Learning the fundamentals is a one-time cost.
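For what it's worth, the whole workflow someone has to learn amounts to a handful of commands (POSIX shell shown; on Windows the activation script lives under `Scripts\` instead, and the package name here is just a placeholder):

    python -m venv .venv
    source .venv/bin/activate    # or .venv\Scripts\activate on Windows
    pip install some-package     # placeholder package name
    deactivate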