Issue
I work on a cross-platform C component that is mostly used within a larger C project. It is also used by some people via Python (cffi or ctypes), as it offers substantial performance gains over the optimised NumPy implementations of the algorithm we've tried. I'm trying to figure out a sensible way to set up a continuous integration system that can build the required dynamic libraries on multiple platforms, include them all in a Python distribution of the module, and then test that module on all supported platforms. I'd also like some indication of whether my current idea (below) is sensible or silly.
My feeling is that the CI would look like this:
- build the libraries and collect the dynamic library artefacts
- put the artefacts into the local Python package
- run python3 -m build --sdist on a single platform (or something similar) to create the package
- install and test the created distribution artefact on multiple platforms (a rough sketch of these steps follows this list)
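For concreteness, here is a minimal sketch of what one such CI job could run, written as a plain Python driver script. The paths, the library name (my_library), the use of ctest --test-dir (which needs a reasonably recent CMake), and the assumption that the sdist picks up the files copied into the bin directory are all illustrative guesses, not a description of an existing setup:

# ci_build.py - hedged sketch of one CI job: build the C library with CMake,
# copy the resulting shared libraries into the Python package, build an sdist,
# and run the pytest suite against an installed copy.
# Requires the "build" and "pytest" packages to be installed in the CI image.
import shutil
import subprocess
import sys
from pathlib import Path

REPO = Path(__file__).resolve().parent
BUILD_DIR = REPO / "build"
PKG_ROOT = REPO / "my_library" / "python"
PKG_BIN = PKG_ROOT / "src" / "my_library" / "bin"

def run(*cmd: str) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Build the C libraries and run their own test suite (CTest).
run("cmake", "-S", str(REPO), "-B", str(BUILD_DIR), "-DCMAKE_BUILD_TYPE=Release")
run("cmake", "--build", str(BUILD_DIR), "--config", "Release")
run("ctest", "--test-dir", str(BUILD_DIR), "--output-on-failure")

# 2. Copy the dynamic library artefacts into the Python package.
PKG_BIN.mkdir(parents=True, exist_ok=True)
for pattern in ("*.so", "*.dylib", "*.dll"):
    for artefact in BUILD_DIR.rglob(pattern):
        shutil.copy2(artefact, PKG_BIN / artefact.name)

# 3. Build the distribution from the package subdirectory.
run(sys.executable, "-m", "build", "--sdist", str(PKG_ROOT))

# 4. Install the built artefact and run the Python tests against it.
dist = next((PKG_ROOT / "dist").glob("*.tar.gz"))
run(sys.executable, "-m", "pip", "install", str(dist))
run(sys.executable, "-m", "pytest", str(PKG_ROOT / "test"))

In a real CI setup each of those bullets would typically be a separate step or job, so that the C library's own tests and the Python packaging can fail independently.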
This seems to me slightly different from the NumPy and SciPy cases (which also build source code), as those are primarily Python packages, whereas this is primarily a C source-code package that happens to include some binding libraries which some people want to use. The C library contains its own tests as well, which need to be run, and the "Python package" would be some kind of extra job.
The repository looks something like this:
/my_library/CMakeLists.txt
/my_library/include/public_api_c.h (and other stuff)
/my_library/source/implementation.c (and other stuff)
/my_library/frontend/dll_linkage/dll_api_implementation.c
/my_library/python/test/ (all pytests go in here for the python package)
/my_library/python/src/my_library/ (all python for the module goes in here)
/my_library/python/src/my_library/bin (where the binary artefacts go)
/my_library/python/pyproject.toml (and a few other things go here)
/my_library/matlab/... (matlab implementations etc)
/c_dependency_1/CMakeLists.txt (and other stuff)
/c_dependency_2/CMakeLists.txt (and other stuff)
/CMakeLists.txt (top-level CMakeLists that imports dependencies and defines what things to build)
This means that if someone gets the repository, they get access to the Python and MATLAB implementations as well; they just won't have the binaries needed to use the faster Python bindings.
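The runtime side of that plan (the Python package picking the right binary out of its bin directory) could look roughly like the ctypes sketch below; the library base name, the file-name mapping, and the exported function at the end are hypothetical placeholders:

# _loader.py - hedged sketch of how my_library/python/src/my_library could load
# the shared library shipped in its bin/ directory.
import ctypes
import sys
from pathlib import Path

_BIN_DIR = Path(__file__).parent / "bin"

def _library_filename(stem: str) -> str:
    # Map the running platform to the expected dynamic-library file name.
    if sys.platform.startswith("win"):
        return f"{stem}.dll"
    if sys.platform == "darwin":
        return f"lib{stem}.dylib"
    return f"lib{stem}.so"

def load_library(stem: str = "my_library") -> ctypes.CDLL:
    path = _BIN_DIR / _library_filename(stem)
    if not path.exists():
        raise OSError(
            f"{path} not found - no binary artefacts were bundled; "
            "fall back to the pure-Python implementation instead."
        )
    return ctypes.CDLL(str(path))

# Example usage with a hypothetical exported function:
# lib = load_library()
# lib.my_library_process.restype = ctypes.c_int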
It would be awesome if anyone could point me at some other projects that do this (particularly if those projects use CMake as a build system for the C projects) or can provide some feedback or information on what things I need to be careful of. Also, the package resides in a subdirectory of the main repository - is this going to be a problem?
Solution
Your approach could work, but selecting which platform's DSO the Python package should "wrap" at runtime adds complexity. You could consider a different approach where you instead build one Python wheel per platform/architecture. All those wheels are pushed to a Python package index, and the Python install tool (pip) running on the actual "client" platform selects the matching wheel to install. Basically, you would do the platform/architecture parameterization on the "outside" of the Python package instead of on the "inside". To build those wheels you can leverage cibuildwheel. Have a look at this GitHub Actions workflow for a complete example of building wheels on different platforms, testing them, and uploading them. That example uses CMake internally to build clang-format. (A short sketch of how the client-side wheel selection works is shown below.)
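To make the "outside" vs "inside" distinction concrete: every wheel file name carries interpreter/ABI/platform tags, and pip on the client simply installs the wheel whose tags match the running interpreter, so the package itself never has to choose a DSO. Here is a small sketch using the packaging library (the same tag machinery pip relies on); the wheel file names are invented examples:

# hedged sketch: how a client-side installer decides which per-platform wheel
# fits the running interpreter. The wheel file names below are made-up examples.
from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

# Hypothetical wheels produced by cibuildwheel and pushed to an index.
available = [
    "my_library-1.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
    "my_library-1.0.0-cp311-cp311-macosx_11_0_arm64.whl",
    "my_library-1.0.0-cp311-cp311-win_amd64.whl",
]

supported = set(sys_tags())  # tags the current interpreter/platform accepts

for filename in available:
    _name, _version, _build, tags = parse_wheel_filename(filename)
    if tags & supported:
        print("pip would pick:", filename)
        break
else:
    print("no binary wheel matches this platform; pip would fall back to the sdist")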
Answered By - renefritze