> However, pip has some really gnarly internal infrastructure that prevents it from taking advantage of a lot of uv's good ideas (which in turn are not all original).
FWIW, as a pip maintainer, I don't strongly agree with this statement. I think if pip had the same full-time employee resources that uv has enjoyed over the last year, a lot of these issues could be solved.
I'm not saying here that pip doesn't have some gnarly internal details, just that the bigger thing holding it back is the lack of maintainer resources.
> For just one example: uv can quickly install previously installed packages by hard-linking a bunch of files from the cache. For pip to follow suit, it would have to completely redo its caching strategy from the ground up, because right now its cache is designed to save only download effort and not anything else about the installation process.
I actually think this isn't a great example, evidenced by the lack of a download or wheel command from uv due to those features not aligning with uv's caching strategy.
That said, I do think there are other good examples that support your point, like uv's ability to prefetch package metadata. I don't think we're going to be able to implement that in pip any time soon, probably because it would require a complete overhaul of the resolver.
> FWIW, as a pip maintainer, I don't strongly agree with this statement. I think if pip had the same full-time employee resources that uv has enjoyed over the last year, a lot of these issues could be solved.
Fair enough. I'm sure if someone were paying me a competitive salary to develop my projects, they'd be getting done much faster, too.
> I actually think this isn't a great example, evidenced by the lack of a download or wheel command from uv due to those features not aligning with uv's caching strategy.
I guess you're talking about the fact that uv's cache only stores the unpacked version, rather than the original wheel? I'm planning to keep the wheel around, too. But my point was more that because of this cache structure, pip can't even just grab the wheel from its cache without hitting the Internet, on top of not having a place to put a cache of the unpacked files.
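To make that concrete, here's a rough sketch of the hard-link install described in the quoted comment: if the cache holds the unpacked files, a repeat install is just a pass of `os.link()` calls. The cache layout and paths below are made up for illustration, not uv's or pip's actual on-disk format.

```python
# Rough sketch of installing from an unpacked-wheel cache by hard-linking.
# The cache layout and paths are hypothetical.
import os
import shutil
from pathlib import Path

def install_from_cache(cache_entry: Path, site_packages: Path) -> None:
    """Mirror every file of an unpacked wheel into site-packages.

    Hard links copy no bytes, so repeat installs are nearly instant; the
    shutil fallback covers cases where os.link() isn't possible (e.g. the
    cache and the virtualenv live on different volumes).
    """
    for src in cache_entry.rglob("*"):
        if src.is_dir():
            continue
        dest = site_packages / src.relative_to(cache_entry)
        dest.parent.mkdir(parents=True, exist_ok=True)
        try:
            os.link(src, dest)
        except OSError:
            shutil.copy2(src, dest)

# Hypothetical usage:
# install_from_cache(Path.home() / ".cache/unpacked/requests-2.32.3",
#                    Path(".venv/lib/python3.12/site-packages"))
```

A cache that only stores HTTP responses keyed by URL can't do this: it has neither the unpacked files nor a way to find the wheel without going back through the index.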
> uv's ability to prefetch package metadata,
You mean, as opposed to obtaining it per version, lazily? Because it does seem like the .metadata file system works pretty well nowadays.
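For anyone following along: the .metadata mechanism (PEP 658) serves a wheel's core metadata at the wheel's URL with `.metadata` appended, so a resolver can read Requires-Dist per version without downloading the wheel itself. A minimal sketch of that lazy, per-version fetch, with a hypothetical file URL:

```python
# Minimal sketch of fetching only the PEP 658 metadata file for one candidate
# version. The wheel URL in the usage comment is hypothetical.
from email.parser import Parser
from urllib.request import urlopen

def requires_dist(wheel_url: str) -> list[str]:
    """Fetch <wheel_url>.metadata and return its Requires-Dist entries.

    The METADATA document uses RFC 822-style headers, so the stdlib email
    parser is enough to pull out the dependency list.
    """
    with urlopen(wheel_url + ".metadata") as resp:
        msg = Parser().parsestr(resp.read().decode("utf-8"))
    return msg.get_all("Requires-Dist") or []

# Called per version, only when the resolver actually considers that version:
# requires_dist("https://files.example.org/requests-2.32.3-py3-none-any.whl")
```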
> I don't think we're going to be able to implement that in pip any time soon, probably because it would require a complete overhaul of the resolver.
Ugh, yeah. I know the resolution logic has been extracted as a specific package, but it's been challenging trying to figure out how to actually use that in a project that isn't pip.
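For reference, the extracted package is resolvelib, and wiring it up outside pip looks roughly like the sketch below, as far as I understand its interface. Everything here is made up for illustration: the toy index, the tuple-based requirement/candidate format, and the provider name. The provider method signatures have also shifted between resolvelib releases, so treat this as a sketch against a recent version rather than a recipe.

```python
# Toy resolvelib provider over a fabricated in-memory index; requirements and
# candidates are plain (name, version) tuples just for this sketch.
from resolvelib import AbstractProvider, BaseReporter, Resolver

# name -> {version: [dependency names]}  (entirely made-up data)
INDEX = {
    "app": {"1.0": ["lib"]},
    "lib": {"1.0": [], "2.0": []},
}

class ToyProvider(AbstractProvider):
    def identify(self, requirement_or_candidate):
        # Requirements and candidates are both keyed by project name here.
        return requirement_or_candidate[0]

    def get_preference(self, identifier, resolutions, candidates, information,
                       backtrack_causes):
        # Trivial heuristic: resolve projects in alphabetical order.
        return identifier

    def find_matches(self, identifier, requirements, incompatibilities):
        ruled_out = {candidate[1] for candidate in incompatibilities[identifier]}
        # Newest-looking version first (string sort is fine for this toy data).
        return [
            (identifier, version)
            for version in sorted(INDEX[identifier], reverse=True)
            if version not in ruled_out
        ]

    def is_satisfied_by(self, requirement, candidate):
        # No version specifiers in the toy index: same name means satisfied.
        return requirement[0] == candidate[0]

    def get_dependencies(self, candidate):
        name, version = candidate
        return [(dep, None) for dep in INDEX[name][version]]

result = Resolver(ToyProvider(), BaseReporter()).resolve([("app", None)])
print({name: candidate[1] for name, candidate in result.mapping.items()})
# e.g. {'app': '1.0', 'lib': '2.0'}
```

The hard part in a real project is everything the toy skips: version specifiers, extras, markers, and fetching metadata inside find_matches, which is exactly where the resolver and the index/cache design get tangled together.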
> I guess you're talking about the fact that uv's cache only stores the unpacked version, rather than the original wheel? I'm planning to keep the wheel around, too. But my point was more that because of this cache structure, pip can't even just grab the wheel from its cache without hitting the Internet, on top of not having a place to put a cache of the unpacked files.
I'm talking about the fact there is no `uv pip download` or `uv pip wheel`.
I'm sure we'll discuss it when you add this feature to uv, but I personally find that uv's cache already grows too big too fast, so adding more to it concerns me.
> You mean, as opposed to obtaining it per version, lazily? Because it does seem like the .metadata file system works pretty well nowadays.
Yeah, one of the ways uv speeds up resolving is that it pre-downloads .metadata files and checks whether their requirements are identical to versions it has already checked, so it can quickly rule them out. It's a clever use of a collector and an advanced resolver.
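To illustrate the trick (this is the idea only, not uv's actual implementation): once the .metadata files are prefetched, versions with identical Requires-Dist sets can be grouped, and rejecting one member of a group rules out the rest without any further fetching or resolving. With some made-up data:

```python
# Sketch of grouping prefetched versions by identical requirement sets, so the
# resolver only has to reason about one representative per group.
# The requirement data below is fabricated for illustration.
from collections import defaultdict

prefetched = {
    # version -> Requires-Dist entries read from the prefetched .metadata files
    "2.32.3": ["charset-normalizer<4", "idna<4", "urllib3<3"],
    "2.32.0": ["charset-normalizer<4", "idna<4", "urllib3<3"],
    "2.31.0": ["charset-normalizer<4", "idna<4", "urllib3<3"],
    "2.30.0": ["charset-normalizer<4", "idna<4", "urllib3<2"],
}

groups: defaultdict[frozenset[str], list[str]] = defaultdict(list)
for version, requires in prefetched.items():
    groups[frozenset(requires)].append(version)

for requires, versions in groups.items():
    # If one version in a group is rejected because of these requirements,
    # every other version in the group can be skipped outright.
    print(sorted(versions, reverse=True), "->", sorted(requires))
```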
> FWIW, as a pip maintainer, I don't strongly agree with this statement. I think if pip had the same full-time employee resources that uv has enjoyed over the last year, a lot of these issues could be solved.
No. This is something people tell each other, that it's a lack of resources, but in reality almost all OSS projects with long-standing flaws don't have a resources problem. They have a prioritization problem, where they actively ignore and refuse to work on things that affect users every single day of usage.
There are features in FreeCAD that are straight up broken and that you hit every single day of using FreeCAD. When a simple, low-cost fix is suggested, you get immense pushback because of how "impure" the fix is, despite it being entirely cosmetic, reversible, and having no long-term impact on maintenance.
Progress happens when you sidestep those who block progress. That's what realthunder was doing and that's what Astral is doing with uv. No more bullshit excuses.
This is what forks are for. Alternatively, money would probably help. Otherwise it's a bit rude to impose your priorities on others.
Perhaps some larger projects lose track of what they intended to prioritize. This is also a resources problem. There's nobody available to do management (and probably also not a social structure internally that would allow for it).
> because of how "impure" the fix is, despite it being entirely cosmetic, reversible, and having no long-term impact on maintenance.
The developers will always be in a better position than end users to assess the long-term maintenance impact.
> The developers will always be in a better position than end users to assess the long-term maintenance impact.
At best, developers may be in a better position to assess long-term consequences. The history of major software projects is also the history of many projects making technical decisions that look good in the short term but turn out to impede the desired features, performance, etc., and thus lead to major refactors and rewrites.
imtringued is right, though. Obstinate maintainership is a real failure mode of open source (and other) projects, and patch rejections involving phrases like "maintenance burden" are often pretextual. It's a shame to have to resort to creative destruction to improve things. Forks have a low success probability.
> Obstinate maintainership is a real failure mode of open source
I agree, but I also think people sometimes see obstinate maintainership where there isn't any. For example, large PRs from a contributor who is fresh to the project are hard to merge because there is so much careful review that needs to be done; it's very resource intensive.
That said, one of my goals since becoming a maintainer has been to try to make submitting a PR more friendly. Feel free to submit a PR to pip; I will be happy to help you get over any teething issues.
> No. This is something people tell each other, that it's a lack of resources,
Well, uv, as an OSS project, has come along with around two orders of magnitude more manpower than pip and has solved a lot of problems.
> but in reality almost all OSS projects with long-standing flaws don't have a resources problem. They have a prioritization problem, where they actively ignore and refuse to work on things that affect users every single day of usage.
I wasn't making a statement about OSS projects in general. I agree that pip has prioritization problems, but I would argue they stem from the lack of resources.
The people who do donate their spare time are in fact only likely to spend that time on things that interest them. If there were resources to have someone act as a project manager, and others to follow their lead, then the prioritization problems could be fixed, but those resources don't exist.
> No more bullshit excuses
Then contribute your full-time hours to pip, or other OSS that needs fixing, rather than armchair commenting on Hacker News.