Even as an occasional casual Python user of several years, I noticed how much simpler it is to check out new projects using uv compared to other tools. It's such a relief because I used to encounter so many weird compatibility issues with Python, I guess mostly related to global installs of runtime versions and dependencies. In the past year or so, the situation seems to have dramatically improved thanks to uv.
Interesting. I basically don't touch Python unless I have to because of the lack of tooling. How does this tooling compare to an experience like working in Go?
uv run foo.py
# creates a virtual env, reads foo.py to see that httpx is a dependency, installs it into the ephemeral venv, then runs the script
The above is from memory, typed on a phone, so there may be minor syntax issues, but the point I'm trying to make is that we can kinda emulate the convenience of statically compiled binaries à la Go these days.
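For what it's worth, the way uv "sees" httpx is PEP 723 inline metadata at the top of the script. httpx and the URL below are just placeholder examples, but the header looks roughly like this:

# /// script
# requires-python = ">=3.12"
# dependencies = ["httpx"]
# ///
import httpx

print(httpx.get("https://example.com").status_code)

With that header in place, `uv run foo.py` spins up the throwaway venv, installs httpx into it, and runs the script.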
Just curious, where do you see it not rivaling Go? Go is my main, but I do help with some Python projects and I was really happy migrating from Poetry to uv; my feeling is that uv brings Go's devex to Python. The biggest feature is probably auto-sync by default, which - together with auto-provisioning Python - is the only way to get reproducible builds.
Personally I can't think of anything from Go's build system I miss now - the languages are very different for sure, but I guess we're talking about the build system only.
As someone who strongly favours Java and Python, Go is kinda the ultimate when it comes to out-of-the-box tooling. I think Java still has the crown overall, but that's because of tooling in the ecosystem, not tooling that comes with the JVM.
Want to profile your Go? pprof is built in (to be fair, Python has had cProfile forever, but the Go version's output is more convenient to read).
Want to run some tests, or better yet some benchmarks? A good take on the problem space is just built in. You can safely go with the default and don't need to spend mental tax credits on selecting the best benchmarking lib from the ecosystem.
Stuff like go fmt is just taken for granted, but even in the Python world there are still some non-black-flavoured formatters (black-compatible tools like ruff aside) floating around - and probably the most common setup on GitHub even today in Python is no formatter at all.
I could go on and on - go generate (maybe a tiny bit less relevant with generics being available today?), go tool, go vet, ...
I don't mean to be rude, but I don't get how this is any better. It feels too manual to type `uv add --script script.py dep`. Instead, the automation tool I'm waiting for will scan my script, auto-add all the deps I'm actually calling in the script (while ignoring the ones I forget to use), AND set up the env with all the deps, AND run the code, all in the same one-liner. To me, `uv add X` is no different from setting up a venv and running `pip install -r requirements.txt`.
What people like about this workflow is that you're not maintaining a separate venv or a separate requirements file, and it's declarative rather than imperative. This gives you two big advantages:
First, you can move that script to a different machine and do `uv run {script}`, no need to recreate a venv or provide install instructions (I believe uv will now even grab an appropriate version of Python if you don't have it?). This comes from PEP 723, and multiple tools support doing this, such as hatch.
Second, when you "add" a requirement instead of "install" a requirement, it's managed with knowledge of all the requirements that were added before. For example, if I `pip install foo` and then `pip install bar`, pip does not consider foo or its dependencies as required when installing bar, so it's possible to break `foo` by installing completely incompatible dependencies. But when you "add foo" and then "add bar" with uv (and other declarative tools, like Poetry), your environment gets updated to take everything into account.
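Rough illustration, with foo and bar as made-up packages:

pip install foo   # resolves foo's deps in isolation
pip install bar   # may upgrade/downgrade something foo needs; pip warns but the install goes through
uv add foo
uv add bar        # re-resolves the whole set together, so foo's constraints still hold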
If managing Python dependencies is second nature to you then these might seem like extra concepts to keep in your head, but lots of people do find them useful because they can think less about Python dependencies.
I really like using uv. I introduced it at work for builds and it made everything a lot faster, which was awesome. Now I can remove the other components of the build process and just use one tool.
I am interested in how they're going to make money eventually, but right now it's working for me.
Does anyone have an idea about how they're going to monetize?
I feel like that's the biggest question I have about Astral. I wonder what they have in the tank. All of this software is great, but I'd like to see them get some kind of benefit, if only to assure me that they'll continue to exist and make awesome software.
(And also so they'll implement the `pip download` functionality I'd like!)
Astral's focus has been to support the simplest use case: a pure-Python project with a standard layout. Their aim has been that most users, and especially beginners, should be able to use it with zero configuration.
As such they do not currently support C extensions, nor running arbitrary code during the build process. I imagine they will add features slowly over time, but with the continued philosophy that the simple and common cases should be zero-configuration.
For Python experts who don't have special needs from a build backend I would recommend flit_core, the simplest and most stable build backend, or hatchling, which is very stable and has lots of features. While uv_build is great, it does mean that users building (but not installing) your project need to be able to run native code rather than pure Python. But that's a pretty small edge case, and for most people it won't be an issue.
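(For reference, opting into uv's backend is just the build-system table in pyproject.toml - something like the below; check the uv docs for the exact version range they recommend pinning.)

[build-system]
requires = ["uv_build"]
build-backend = "uv_build"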
I tried installing a Python project last week after years of avoiding it like the plague.
brew install didn't work, use python3 not python, no pip pre-installed, ensurepip crashing, you need to run sudo commands to fix this... after 1 hour of struggle the repo didn't work anyway. How do people work like this?
You can just use uv now; that's the whole point. It will let you install any recent version of Python, and you can easily handle things from there. It'll also handle dependencies and one-off scripts for which you don't want to create a whole project/venv.
As a rule, you should never meddle with the globally installed Python, because so many packages look for the system-installed Python and use it; better to let your package manager handle it.
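Roughly, the happy path looks like this (project name and dependency are just placeholders; commands from memory, so check `uv help`):

uv python install 3.12         # grabs an isolated interpreter; no sudo, system Python untouched
uv init myproj && cd myproj    # scaffolds pyproject.toml and a hello-world main.py
uv add requests                # records the dependency and syncs the project's venv
uv run main.py                 # runs inside that venv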
They don’t. That’s a sign that the local system is severely broken, and should be rebuilt to be stable. uv will still work in that case, but you’re going to constantly hit other points of friction on a mismanaged system which will waste time.
I've been coding in Python for 10+ years but I can never really get my head around Python's tooling ecosystem. It seems that there's always a newer, shinier choice: easy_install, pip, conda, virtualenv, pipenv, setuptools, hatchling, setuptools-scm, uv, requirements.txt, pyproject.toml...
I wish Python would provide an "official" solution to each problem (like in Rust, there's cargo, end of story), or at least an official document describing the current best practice for doing things.
For the last year or so, I've been trying to provide an alternative guide that stays abreast of the best options and provides simple guides: https://pydevtools.com/.
People have been asking how Astral is going to monetize. Given the "AI" posts from Astral-adjacent people, I'm now considering that they might release "AI" tools for an integrated "developing" workflow.
The Python team builds Python. I suspect that insulates them from the actual problems of trying to build things with Python. Also, this sort of thing gets bogged down in approval processes. People have spent decades trying to "fix" Python packaging; the important thing is that uv doesn't change any of that, it's a drop-in replacement.
Yeah it reminds me of bun, for node, in that way (and that probably exposes how well I understand tooling). It's like somebody just swept everything off the table and started over.
I never learned python the way I wanted to because for years I would first look at the excruciating transition from v2 to v3 and just not see a point of entry for a newb like me.
Now the same thing is happening with tooling for v3. pip? pipenv? python -m pip? python3 -m pip? I don't freakin' know. Now there's uv, and I'm kinda excited to try again.
For years Python has been built by a skeleton crew.
Especially compared to the billions poured into the JavaScript ecosystem due to it being the native language of the web.
The money for Python was never there, even when it became the top language of data engineering and data science. Even now, devs who improve the Python ecosystem get fired: https://bsky.app/profile/snarky.ca/post/3lp5w5j5tws2i
If you take a look at the forum (https://discuss.python.org/), the core team basically doesn't have the resources to do anything except maintain Python. They aren't paid and have to pick their battles.
They spend a lot of time on improving Python itself, and then you have pip, which is a way to install packages and that's it; it's not a full package manager, nor a Python version manager.
To be fair here, the recent PEPs encourage external build backends (which this submission is about).
That said, the people left in the CPython team generally have a low regard for bloat-free, correct and fast solutions, so external solutions are most welcome.
I was looking for Astral’s future plans to make money. Simonw already answered in another post [1] tldr - keep tooling open and free forever, build enterprise services (like a private package registry) on top.
Good thing to highlight. I'm not sure I'd bet on the game plan, but uv is an incredibly useful tool which I also wouldn't have bet on. Hopefully Simonw is right, and Astral can maintain as is.
Well, that's basically the core of Anaconda, and it's working for them.
That said, I've checked Anaconda's site, and while it used to be "Anaconda [Python] Commercial Distribution", "On-Prem repositories", "Cloud notebooks and training"... over the last year they've changed their product name to "Anaconda AI Platform", and it's all "The operating system for AI", "Tools for the Complete AI Lifecycle". Eeeeh, no thanks.
Not sure I hold out much long-term hope for them either. Both of these companies can eventually make money in a way that isn't shady - just not enough money to satisfy their VCs.
uv has a superpower that doesn't get talked about much - seamlessly managing monorepos. I'd been using Pants before, but it's such a pain to set up and maintain. uv just kinda works like you'd hope.
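For anyone curious, the workspace setup is basically one table in the root pyproject.toml (the member glob here is just an example):

[tool.uv.workspace]
members = ["packages/*"]

Then `uv sync` / `uv run` from anywhere in the repo resolves all the members against a single shared lockfile.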
What a great thing to see on HackerNews this morning. Any day I can replace another tool in my team’s processes with a fast, stable, and secure solution from Astral is a great day. Thanks Astral for all the amazing work you do!
So if you've imported pytest and want to run it, you can't just `uv run pytest`, but have to create a script with a uv shebang which runs pytest for you?
And how does that work on Windows, which to my knowledge doesn’t even support shebangs?
There are rather too many gleeful exclamation marks. Here's a rare case where a karmascope would be useful; I see https://news.ycombinator.com/user?id=the_mitsuhiko has 15025 karma, so they're almost certainly real.
I wouldn't be so sure about spelling mistakes. Even before LLMs, YouTube bots made a lot of mistakes (probably because it gives the impression that a human is typing). These days it's impossible to distinguish between a human comment and an LLM one.
Interesting that this was pushed to the bottom of the replies (despite being at the top at 20 upvotes). Did all the above comments get a coordinated signal to upvote beyond that number, or is a HN mod compromised?
Building tooling in Rust? Blasphemy! You should have used Node.js, because teaching Rust to people is too hard! And it's not doing any CPU heavy computations anyway, so Node.js is fine!
I'm continuing to be amazed at the Astral team and what they do for Python. It's become so "bad" now that when I use Rust or OCaml I find myself constantly annoyed by the build systems. What a great time to be alive!
What does uv do that Cargo does not? Cargo has been excellent in my experience, to the point that (in comparison to CMake and wanting to flee it) it is a large part of why I initially learned Rust.
Just the git code, according to their README.md; however, it seems heavily influenced by it.
Before uv I was doing everything in a devcontainer on my Mac since that was easiest, but uv is so fast that I skip that unless I have some native libraries that I need for Linux.