Introduce a --multi-version preference mode #8686
base: main
Conversation
To make this work "as intended", I guess we'd have to fork when we notice that a package has a …
I like it!
We can add a reference to this in https://docs.astral.sh/uv/reference/build_failures/#fixes-and-workarounds, since this addresses one of the numpy bullet points more directly than the existing workarounds.
Do you think we should do this "automatically", without the need to repeat the dependency declarations? I don't know how we could; it seems extremely hard, since we'd then need to fork after we get distribution metadata.
@konstin -- Perhaps, in addition to this... when …
I think we could, with the exact caveat you mentioned: we'd first have to look at the wheels and then fork afterwards, which changes some assumptions we currently have (FWIW, requires-python doesn't help us here; see #8492 (comment)).
Good question. As an example, each numpy release seems to support 4 Python minor versions, so when trying to make as few forks as possible and seeing a range wider than 4 Python minor versions from oldest to latest supported, we could do with two forks. E.g. with …

Personally, I'm a fan of fewer versions (less error surface, less to test) and fewer forks (fewer versions and better performance), but that's not universal. There can be users who explicitly want the other behavior, since they test every Python minor anyway and want to get as much dependency-version coverage as possible. If their users use …
This is a bit of a conflict between projects and libraries. For a project, I could see a universal solve being preferred (it might not truly be best, as fixes for newer Python versions - or just in general - are often introduced in newer versions! But someone might want to find a single set of compatible versions). But for a library, you will almost universally want to test on the latest versions of packages, as anyone using the package will get the latest versions when they install. If library X updates and breaks you, you'd not know, because you'd have an older version of X that supported an old Python, while your users on newer versions of Python would be broken.

NumPy and the core Scientific Python libraries support Python for 36 months, following SPEC 0 (it used to be 42 months when it was NEP 29). They can do that because they are fairly stable. Most smaller projects support a wider range, since 3.7-3.9 is heavily used in practice, and while old versions of NumPy and such work fine on older Pythons, if you are writing a new or changing library, there is no old stable version to rely on.

Couldn't you either solve once per Python version and then merge identical solutions (see the sketch below), or only fork on the highest minimum bound found? The last 3-5 Python versions will usually have a single resolution, and then the older ones will have unique per-Python resolutions.
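To illustrate the "solve once per Python version, then merge identical solutions" idea, here is a minimal sketch with made-up types (not uv's internals): merging can be as simple as grouping minors whose solutions compare equal.

```rust
use std::collections::BTreeMap;

// A solution is a pinned set of (package, version) pairs.
type Solution = Vec<(String, String)>;

/// Group Python minor versions whose independently-solved resolutions
/// are identical, so the lockfile only needs one fork per distinct set.
fn merge_identical(per_minor: BTreeMap<u8, Solution>) -> Vec<(Vec<u8>, Solution)> {
    let mut merged: Vec<(Vec<u8>, Solution)> = Vec::new();
    for (minor, solution) in per_minor {
        match merged.iter_mut().find(|(_, existing)| *existing == solution) {
            // Same solution as an earlier minor: extend that group.
            Some((minors, _)) => minors.push(minor),
            // A new distinct solution gets its own group (i.e., fork).
            None => merged.push((vec![minor], solution)),
        }
    }
    merged
}
```

Adjacent minors in each resulting group could then be collapsed into a single marker range (e.g. one fork covering everything from some minor upward).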
An important point is that a forked resolution is never "broken" - it's just a little bigger - but a universal install can be. At best, you might be protected against a new breaking change from a library (but only due to your minimum Python version!), but you'll never get old packages on new Pythons that don't work even after a fix has been released.
In my opinion, this issue is not unique to the scientific package ecosystem or tied to whether you package an app or a library. I think the main question here is: how do we interpret requirements as specified in …? Taking my example from #8492: …
I have always interpreted this to mean "take the highest possible version of …". As far as I know, pip has always used the first strategy, so I am pretty sure that most Python users will expect that to be the "correct" behavior. So even though I do see the value of having the exact same set of package versions across all Python versions, this is probably not what people expect (at least if they have been using pip, which is still the standard and official package manager for Python).

I'm not sure I'm qualified to advise on how to implement this behavior or whether it should be the default, but I like the idea of resolving everything separately per Python version (and then merging where possible). And in case that was not already clear: I think this should be the default behavior, simply because people will be expecting it.
I'd like to second everything @cbrnr has said. FWIW, I'm pretty sure conda and mamba behave in the same way (always retrieve the newest possible version for any combination of Python and required packages).
crates/uv-settings/src/settings.rs (outdated)

```rust
default = "\"fewest\"",
value_type = "str",
example = r#"
    resolution = "latest"
```
Sorry for the drive-by comment, but should this be multi-version-mode = "latest"? settings.md uses resolution in the examples too.
Thank you, yeah, this is wrong.
I don't anticipate making this the default.
I'm happy if we get a per-Python resolution with …
Yeah, this PR sort of does it, but I'm not happy with the result yet; still too much work for users :)
+1
The problem is that it's shared with …

Also, again, finding a unique set of packages that "works" over the whole range doesn't help if you get an old version of a package that's actually broken on newer Python versions but you don't know it. It's quite common for packages to ship fixes for newer Python versions! Personally, in the Python ecosystem at least, I believe it's best to assume newer versions of packages are better: most library authors try to maintain reasonable backward compatibility and deprecation periods/warnings, and mostly ship fixes and improvements.
I agree with @henryiii, as I'm also using …
Yeah, the current iteration of this PR is such that (1) we'll solve for each Python minor automatically, no additional user input required; and (2) you can set this as an environment variable or in your …
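For readers following along, a hand-written sketch of what opting in might look like. The flag comes from this PR's summary; attaching it to uv lock is an assumption, and the pyproject.toml spelling below is hypothetical, since the review comments show the setting name was still being debated (multi-version-mode vs. resolution):

```console
$ # Opt in per invocation (flag name from the PR summary):
$ uv lock --multi-version latest
```

```toml
# Hypothetical setting name; see the review discussion above.
[tool.uv]
multi-version = "latest"
```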
Do you just want a way to specify the locking range separately from the …
```rust
forks.push(marker);
}

forks
```
@konstin -- Re-requesting review as I changed this setting to also automatically solve for each Python minor.
I've always wanted that, and I've argued that tools like Poetry should support a way to specify the locking range separately. But we already have this in uv:

```toml
[tool.uv]
environments = ["python_version >= '3.10'"]
```

What I'm really hoping for here is that there will be some sort of reasonable default that is friendly toward libraries, at least in library mode. Most tools with locking solvers optimize really heavily for the project case and have hidden issues for the very common case of working on libraries. I do a ton of teaching of Python packaging, and it would be nice if the defaults were reasonable so I don't have to spend 10 or 20 minutes teaching people what a locking solver is, how to set better defaults, and why old buggy versions of dependencies are suddenly popping up.

Regardless of what the solution might be, it would be very nice if depending on a package like NumPy or Sphinx worked out of the box and didn't resolve old versions on newer Pythons when working on libraries! (Also, for anything that supports a Python range, I think you need to test at least the lowest and highest versions of that range; I don't think you can get away with saying that a single set of dependencies was found, so it will all work.)
@henryiii -- I'm just trying to understand -- are you arguing that we should have defaults that are framed around library usage? Or do you want straightforward settings that work well for libraries?
I'd like both. :) Having a setting would be good; good defaults would be fantastic. :) The fact that you know the difference due to the presence of build-system gives me hope for the defaults.
Sharing an alternative pitch of what we could do: the goal is to use the latest version of each package for each Python (minor) version. Whether we would use a different version of a package depends on the requires-python of the packages. We can exploit this by forking each time we didn't select a package version because its requires-python bound was too high, forking on the Python versions we didn't select for. Assuming that newer versions only increase the upper bound of supported Python versions (by shipping built wheels for later versions), and that we select the latest version, we get the most Python version support possible. (For simplicity, I'm ignoring the no-source-dist cases here.)

Implementation-wise, we would add a second fork point. Currently, we only fork after retrieving and before adding dependencies; now we would also fork during version selection. Version selection is a hot path, so we need to be efficient in tracking whether we skipped a version due to requires-python.

numpy requires-python reference: https://gist.github.com/konstin/4d97de8458999f874912d628e88a0c49
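A minimal sketch of that second fork point, with Python minors modeled as plain numbers and requires-python reduced to a lower bound (hypothetical types and names, not uv's resolver):

```rust
#[derive(Debug, Clone, Copy)]
struct Candidate {
    version: (u32, u32), // package version, e.g. (2, 1) for numpy 2.1
    requires_python: u8, // minimum supported Python minor, e.g. 10 for >=3.10
}

/// Pick the latest candidate for the Python range [lo, hi]. Each time a
/// newer candidate is skipped because its requires-python lower bound is
/// too high, split the Pythons it *does* support into a new fork so they
/// can still get that newer release.
fn select_version(
    candidates: &[Candidate], // assumed sorted newest-first
    lo: u8,
    mut hi: u8,
    forks: &mut Vec<(u8, u8)>,
) -> Option<(Candidate, (u8, u8))> {
    for &candidate in candidates {
        if candidate.requires_python <= lo {
            // Latest version covering the (possibly narrowed) range.
            return Some((candidate, (lo, hi)));
        }
        if candidate.requires_python <= hi {
            // Fork off the newer Pythons; keep resolving the remainder.
            forks.push((candidate.requires_python, hi));
            hi = candidate.requires_python - 1;
        }
    }
    None
}

fn main() {
    // numpy-like data: 2.1 needs >=3.10, 1.24 needs >=3.8.
    let candidates = [
        Candidate { version: (2, 1), requires_python: 10 },
        Candidate { version: (1, 24), requires_python: 8 },
    ];
    let mut forks = Vec::new();
    // Resolving for Pythons 3.8-3.13 picks 1.24 for 3.8-3.9 and queues
    // a fork for 3.10-3.13, which would then re-run and pick 2.1.
    let picked = select_version(&candidates, 8, 13, &mut forks);
    println!("{picked:?} forks={forks:?}");
}
```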
I think what we have here is "fine", though it has the major disadvantage that we have to encode the maximum "known" Python version. Konsti's solution is definitely better, though more work to implement.
Summary
By default, we try to minimize the number of versions selected for each package when forking. This PR adds --multi-version latest, which instead optimizes for choosing the latest version in each fork, and automatically forks over the set of supported Python minor versions.

Closes #7190.
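To make the behavior concrete, here is a hand-written sketch (not actual output of this PR) of how such a forked resolution could surface in uv.lock, with one numpy entry per fork; the versions, markers, and fork boundaries are illustrative:

```toml
resolution-markers = [
    "python_full_version >= '3.10'",
    "python_full_version < '3.10'",
]

[[package]]
name = "numpy"
version = "2.1.2"
resolution-markers = ["python_full_version >= '3.10'"]

[[package]]
name = "numpy"
version = "1.24.4"
resolution-markers = ["python_full_version < '3.10'"]
```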