py-build-cmake cross-compile Windows arm64 #1557
Conversation
I think one of the solutions to the 'dirty hack' is to modify py-build-cmake such that it can recognize an environment variable (I have created PR tttapa/py-build-cmake#15 that makes this happen). Once that PR is merged, I can then merge https://github.com/laggykiller/cibuildwheel/tree/py_build_cmake_env_var into main and add it to this PR. Edit: Somehow the PR is not working, can anyone help? Or any better idea?
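As a rough illustration of the environment-variable approach (the variable name below is purely hypothetical; the real name would come from tttapa/py-build-cmake#15), a backend could resolve the cross-config path like this:

```python
import os

# Hypothetical sketch: a PEP 517 backend checking an environment variable
# for a cross-compilation config file when no -C--cross flag was passed.
# The name PY_BUILD_CMAKE_CROSS is illustrative, not the actual variable.
def resolve_cross_config(config_settings=None):
    settings = config_settings or {}
    # An explicit -C--cross=... config setting takes precedence over the
    # environment variable fallback.
    return settings.get("--cross") or os.environ.get("PY_BUILD_CMAKE_CROSS")

os.environ["PY_BUILD_CMAKE_CROSS"] = "/tmp/my-cross-config.toml"
print(resolve_cross_config())  # falls back to the environment variable
```

This would let a frontend like cibuildwheel steer the backend without needing access to `config_settings` at setup time.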
Btw, you may see a sample of cross-compiling to Windows arm64 in a py-build-cmake project here: https://github.com/laggykiller/rlottie-python
A few thoughts. First, we really shouldn't be doing backend-by-backend hacks here, especially for very rarely used backends. There are 18 matches for py-build-cmake on GitHub. It's better for backends to use what we provide (for setuptools) for now, while cross-compiling is being worked on. That's what scikit-build-core does. (FYI, is there a reason you can't use scikit-build-core? It supports cross-compilation for Windows ARM on cibuildwheel already. You just have to be careful with the SOABI: scikit-build-core gives the correct value, but FindPython doesn't.)

Second, I am curious about the long-term plans for it. I'm okay with multiple build backends, but if @tttapa is interested in helping with scikit-build-core, having more effort go into one package would be best long term, I think. Also, since py-build-cmake is based on flit-core internals, I'd rather expect it not to be viable long term, as it will need to adapt to changes in Flit. Scikit-build-core was built from the ground up, has extensive tests, etc.

Third, py-build-cmake hasn't had a commit since April.

Finally, there is an attempt to standardize cross-compiling, starting with PEP 720, which highlights the current status of cross compilation. I don't think we should be adding too many more hacks until that's settled.
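For context on the SOABI point: the values involved can be inspected with the stdlib `sysconfig` module. On the build machine they describe the host interpreter, which is exactly what a cross-compiling backend has to override with the target's values:

```python
import sysconfig

# These reflect the interpreter running the build. When cross-compiling
# (e.g. an x86_64 host targeting ARM64), a backend must substitute the
# target's values; otherwise the extension module gets the wrong filename
# suffix and won't be importable on the target.
soabi = sysconfig.get_config_var("SOABI")            # e.g. "cpython-311-x86_64-linux-gnu"
ext_suffix = sysconfig.get_config_var("EXT_SUFFIX")  # e.g. ".cpython-311-x86_64-linux-gnu.so"
print("SOABI:", soabi)
print("EXT_SUFFIX:", ext_suffix)
```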
I agree with Henry that most of the tasks handled by this PR should be done inside of the backend, not inside of cibuildwheel. It should now work out of the box with cibw, see e.g. https://github.com/tttapa/py-build-cmake-example/blob/e4172e092e8945beaa09dc509b7bf0f9bd77c807/.github/workflows/wheels.yml#L152
I agree. py-build-cmake was born out of the necessity to easily cross-compile binary packages, mostly for Linux on ARM. At the time, there were no alternatives: PEP 517 was still very new, and setuptools-based tools like scikit-build did not provide the necessary settings to allow for painless cross-compilation, at least not without manually overriding messy setuptools methods.
Currently, py-build-cmake uses
Why make changes if it works? :)
It's good to see this effort. This is definitely something I'll look into. Packaging binary Python packages has been quite a pain compared to e.g. Julia (where cross-compilation is the default, resulting in portable packages without accidental library dependencies and libc versioning troubles).
@tttapa thanks for your hard work! I am testing it
Excellent! @laggykiller, could you see if that works for you?
There's only scikit-build/scikit-build#1013, and nothing detailed in scikit-build-core yet. Though it would make sense to open one. :)

I'm curious as to how you handle Linux - it's hard due to manylinux & the RH dev toolkit that's used to provide a newer GCC with the old glibc. Without that workaround, you are stuck with compilers that are too old to work on the special architectures (at least that's what NumPy hit; they needed GCC > 6 or 7 to work on ARM, IIRC). I've heard recently that Zig has something that can compile manylinux C++, but I haven't looked into it yet. And you could statically link, of course, but then you are trading larger binaries for a faster compile time, and that's usually not worth it, IMO, when you can emulate. (Unless you time out emulating, of course.) I'd love to hear your thoughts and ideas, and what you'd do the same way and what you'd change.

PS, I've also focused on using FindPython & backporting it to older Pythons, but if it made sense to have a custom module for cross-compiling, scikit-build-core supports Python packages supplying CMake modules.
If that's all it's using it for, then why not use pyproject-metadata? scikit-build-core and meson-python both use that.
Also, would you be interested in adding a "see also" link to the readme or docs to scikit-build-core? We've had one to py-build-cmake for a couple of months. :)
Well, usually I at least bump the pre-commit hooks via pre-commit.ci every week or month. https://learn.scientific-python.org/development/guides/style :)
@tttapa Seems to be working well! I'm gonna close this and related PRs. This saves much headache on my side! https://github.com/laggykiller/rlottie-python/blob/master/.github/workflows/build.yml
One small thing to ask though: currently it depends on

Btw, consider updating the documentation, specifically https://cibuildwheel.readthedocs.io/en/stable/faq/#windows-arm64, to mention that
In short, I use my own cross-compilation toolchains that I build using crosstool-ng. This allows me to select an older GLIBC version with the latest GCC (I use C++17 and C++20 in many projects). AFAIK, there's no way around statically linking the C++ standard library. Newer versions of GCC require newer versions of libstdc++, and since you cannot assume that they're available on the target system, you have to include them in your package somehow. What a tool like cibuildwheel needs then, is a Docker container (or just a big archive, really) containing:
To initiate the build, cibuildwheel should pass to the build backend: 1. the path to the compilers, 2. the root path of both Python installations, and 3. the path to a user-provided staging area containing pre-built dependencies for the host.

IMHO, the Python build tools should only provide the necessary information (i.e. filesystem paths) to the native build tools. Handling all the quirks of cross-compilation is beyond their scope, and tools like CMake usually do a good job of supporting cross-compilation already, with the advantage of being the de facto standard with plenty of online resources. Determining

This is getting quite long already; what would be the best place to discuss further? discuss.python.org? A GitHub discussion? In which repository?
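The CMake side of such a setup is usually expressed as a toolchain file. A minimal, illustrative sketch for an aarch64 Linux target (triplet, paths, and flags are assumptions for illustration, not taken from the author's actual crosstool-ng toolchains):

```cmake
# Illustrative cross toolchain file for an aarch64 Linux target.
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR aarch64)
set(CMAKE_C_COMPILER aarch64-linux-gnu-gcc)
set(CMAKE_CXX_COMPILER aarch64-linux-gnu-g++)
# Statically link libstdc++, since the target system's copy may be too old
# for the (much newer) GCC used to build the wheel.
set(CMAKE_EXE_LINKER_FLAGS_INIT "-static-libstdc++")
set(CMAKE_SHARED_LINKER_FLAGS_INIT "-static-libstdc++")
# Only search the target sysroot/staging area for libraries and headers,
# never the build machine's own.
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```

A frontend would then only need to hand the backend the path to a file like this, plus the Python installation paths.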
Done! (I'll merge it into main shortly)
Thanks for the link!
I'll see if I can open a PR after I make the next stable release.
From https://cibuildwheel.readthedocs.io/en/stable/faq/#windows-arm64
This PR would allow py-build-cmake projects to cross-compile on Windows to arm64.
Before merging, please read the following:
Note that `py-build-cmake.cross.toml` has to be either present inside the project to be built, or the `-C--cross=/path/to/my-cross-config.toml` flag has to be present during build. The former method is less clean, so I opted for the second option.

Unfortunately, in `cibuildwheel/windows.py`, the `extra_flags` comes from `build_options.config_settings`, but `setup_python()` and in turn `setup_py_build_cmake_cross_compile()` do not have access to `build_options.config_settings`. Hence I had to create the 'dirty hack'. Any advice on making it cleaner before merging?

Also, the code searching for a suitable `cl.exe` from Visual Studio feels a bit long and might not work in all cases. Any suggestion to improve it before merging?

Useful references:
https://tttapa.github.io/py-build-cmake/Cross-compilation.html
https://tttapa.github.io/py-build-cmake/Config.html
https://stackoverflow.com/questions/66063056/cross-compiling-for-x64-arm-with-msvc-tools-on-windows
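For reference, a cross-config file of the kind discussed above might look roughly like this. The key names follow the py-build-cmake cross-compilation docs linked above, but the values are illustrative and should be checked against those docs for the actual target:

```toml
# Hypothetical py-build-cmake cross-config for a Windows arm64 target.
implementation = 'cp'   # CPython
version = '311'         # Python 3.11
abi = 'cp311'
arch = 'win_arm64'
toolchain_file = 'C:/path/to/arm64-toolchain.cmake'
```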